Tag Archives: Trends

The End of Being Organized

Software is getting better and better at helping manage your life, so that you don’t need to stay as organized as you used to. In fact, this article from one tech journalist is even titled “stay disorganized.” In the abstract, this is a good thing. Why should people have to remember stuff, or spend time organizing their lives, when computers can do it for them? Isn’t that one of the reasons we have computers…to do the boring stuff for us? But for people who are really organized, like me, this means that technology is taking away one of our comparative advantages. Historically I have been more productive than average, since I was really organized about my work. Now technology has reduced that advantage.

I think the first step in this direction was when Google introduced Desktop in 2004. It searched your PC much faster than the old Windows Explorer search, so that you could find files (word docs, spreadsheets, etc.) even if you couldn’t remember where you saved them or what you named them. That was awesome, except that I already knew where all my files were, because I was an aggressive user of folders and subfolders (sharp-eyed readers know that I have previously commented on folder people vs. non-folder people). Thanks to Desktop, the five minutes I spent working while someone else was searching for a spreadsheet was reduced to five seconds. My productivity advantage disappeared.

Then Google brought that functionality to email (no folders at all when Gmail launched), again eliminating the advantage of my clever folder systems. And now we are seeing apps that apply that same computerized organization to your entire life. What if you forgot to print a travel itinerary, or even write down your flight number? No problem, Google Now will do all that for you. So much for my advantage of having a detailed itinerary prepared, breezing me to my destination ahead of everyone else. EasilyDo and its competitors will help manage your duplicate contacts, remind you of your mom’s birthday, and even buzz you when you haven’t returned your CEO’s phone call.

For society, this is great, freeing up space in people’s brains to write, or cure cancer, or develop more organizational apps. For me, it’s a disaster. I had one claim to fame – being organized – and now it’s gone. I guess, like any old has-been, I need to find a new act.

Native Advertising: Creative is King

In the internet media world, “native advertising” is all the rage. Native ads are ads that have enough content to be interesting to readers, and are more organically embedded in the content of the website. In other words, advertorials rather than intrusive banner ads. Some people complain that native advertising breaches the editorial wall, and others say that such breaches are the only way for publishers to make money online. There is truth to both sides of that argument.

But editorial purity is not the point of this post. Instead, I want to focus on how the success of native advertising shows that, yet again, content is king. When you read about how and why native advertising is working (like here and here), you’ll see it constantly described as great content you want to read, as adding value to consumers, etc. Native advertising works to the extent that the content is actually good. Because what consumers care about is content.

Or, in the context of an advertising agency, creative is king. If you want to produce good content, you have to have good creative people doing the producing. I think that the real development here is not native vs. non-native, but rather that marketers and their ad agencies are finally putting serious creativity into online advertising.

Since the beginning of the online era, I’ve felt that the main problem with banner ads was not that they were intrusive, but that they sucked. Nobody put any creativity into them. Partly that was driven by size – there just isn’t enough real estate to do much with them – but partly it’s because online campaigns aren’t the glory campaigns in an agency. TV is where the glory is earned for ad execs. TV commercials win awards and get talked about. Online, on the other hand, is boring. It’s basically direct mail, for god’s sake.

Native ads, however, are larger and have a variety of possible formats. There is room for far more creativity than you have in banners. It’s still not a Super Bowl TV spot, but it’s a much broader canvas than a banner ad. Moreover, with so much of advertising budgets shifting to online, web advertising is getting more respect within agencies, and so top people are working on the web ads.

Yes, native ads are in the middle of the content, but that isn’t their innovation. Their innovation is in size and flexibility, which enables creativity. Thus, the real key to native is that it gives marketers room to focus on quality. And when quality work is done, people pay attention.

Technology, Hubris and Lunch

As you may know, here in Silicon Valley the latest thing is for companies to provide all their employees with free lunches (and often breakfast and dinner too). I think Google was the first to do this, and Facebook followed them, and now even small startups bring in a catered lunch every day, or even hire their own chef. This week the WSJ reported that the IRS is looking into whether this perk should be taxed like most employee perks are. After all, the IRS thinking goes, this is effectively compensation.

I’m no expert on tax law, so I can’t really say whether these lunches should be taxed or not. The way the WSJ laid out the issue, it certainly seems like taxation is the legal path, but the article may not have framed the issue properly.

But one of the arguments that tech companies are making is that the lunches aren’t compensation, but an essential part of the collaborative culture of Silicon Valley. As one tax attorney put it, “there are real benefits for knowledge workers in having unplanned, face to face interaction.” This is complete crap.

Can anyone say with a straight face that it’s essential for an engineer to run into a marketer at Facebook, but that doesn’t matter at Procter & Gamble, or at Caterpillar? That somehow cooperation is more impactful at technology companies than other companies? Sheer idiocy. Having interaction between various constituents of a company is valuable no matter what the company does. To claim that somehow it’s different in Silicon Valley is just the height of hubris.

Spamming Your Friends on Facebook

There are many great things about social media, but there are definitely some pretty crappy elements too.

One of those crappy elements is the tendency of people to use their news feeds to promote their business. You see this on Facebook, LinkedIn, Twitter – someone puts into their feed a blurb about their company or their professional life:

  • My store was just mentioned in People magazine!
  • Vote for my tech company in this best-startup competition
  • Check out my interview on CNN regarding spamming your friends

It takes the self-glorification that already pervades social media – “look how great my life is!” – and adds a professional component. Seeing these items in a friend’s news feed, where I can’t avoid them, is sort of like the friend giving my email address to a spammer, but instead of some stranger peddling me Viagra, it’s my own friend doing the spamming.

When you bring money and career into the news feed glory wall, it commercializes friendships; people are turning friends into customers. And I’m not sure that transformation is reversible. Once you’ve monetized our relationship, can I ever see you as just a friend again?

Y Combinator: Living the Bubble Dream

Two of the higher profile technology incubator programs – Y Combinator and Tech Stars – recently announced their graduating classes (read about them here and here), and in looking at the companies, I saw, yet again, some reminders of the 1999-like frenzy that the technology industry is currently experiencing.

A few thoughts:

  • Not everything needs to happen online; some things (e.g., grocery shopping) satisfy a ton of people in their offline incarnation
  • Lots of things are already online and don’t need a new vendor. Just because you call yourself the Airbnb of vacation rentals doesn’t mean that VRBO, the very successful existing vacation rental website, needs to be “disrupted.”
  • Vertical slicing doesn’t work online. It turns out that the Yelp for contractors is Yelp.

We saw this back in 1999: remember “vertical portals”? Yahoo for gays was PlanetOut, and that didn’t work out too well at all. Vertical slices sound good on paper, but they just don’t work; online it’s just too easy to move from site to site to get what you want. We also saw in 1999 the dot-coming of everything. “We’re going to take your garden online!” Umm, no, you aren’t.

The article about the Y Combinator class even admitted that these companies aren’t world changers, but “perhaps they’ll save a headache or two.” When this is the best that a boosterish tech reporter can come up with, you’ve got problems.

Kids Say Hey: The Casualization of America

Here is something interesting I’ve noticed over the last year or so. The generations younger than mine – let’s say everyone under the age of 25 or so – use the word “hey” the same way my generation used “hello” and “dear” and “to whom it may concern.” When I get a cold email from a recent college grad who wants an informational interview, she starts it “Hey Thoughtbasket.” When I was in her shoes, I started such letters “Dear Thoughtbasket.” When my nephew sends me an email, he uses “hey” instead of “hi” or “hello” or just “Thoughtbasket.” When there are notes in the common areas of my building, they begin with “hey fellow tenants.”

I don’t love “hey” as a word; it’s too vague for me; I prefer more precision in my language. However, the real point of this post is to use “hey” as an example of the casualization of our society. Rather than a formal structure, in which the younger generation uses respectful language toward their elders, our society has eased into a more casual stance, in which we’re all pals who can say “hey” and then high-five each other. I’m not saying this is a bad thing….I am generally in favor of breaking down barriers, whether they are class-based or age-based. But it does seem kind of coarse. Like when there was that controversy a few years ago because a women’s athletic team wore flip-flops to the White House. There are situations where a little respect can go a long way, and respect is not conveyed by the word “hey.”

And of course, regular Thoughtbasket readers know how I feel about flip-flops; they were the topic of my first blog posting ever.

Go casual! Flip-flops at the White House.

Is Gun Culture Just As Bad As Hollywood?

When Wayne LaPierre of the NRA held his famous press conference after the Sandy Hook massacre, he criticized and cast blame on Hollywood and the videogame industry and their violent products. This is a common trope of the NRA and certain elements of the gun crowd: that our society’s media products glorify violence and create a culture where massacres are bound to happen. According to LaPierre, the videogame industry is “a callous, corrupt and corrupting shadow industry that sells and sows violence against its own people” and thus we have “killers, robbers, rapists, gang members who have spread like cancer in every community across our nation.”

And here’s the thing: I don’t totally disagree. The data from studies of this are inconclusive, including a new study that just came out: see more here and here. But to me it seems hard to believe that a person, especially a malleable teenager, can keep watching grotesquely violent movies like the Saw series, or playing shoot ‘em up games like Doom or Killzone, and not become slightly inured to violence. Maybe more violent, maybe not, but certainly with a greater tolerance for violence.

But if you buy into the concept that violent memes in culture could play a role, then the NRA itself, and those same certain elements of the gun crowd, are just as culpable as Hollywood and videogame makers.  I mean, look at the NRA’s favorite saying: “I’ll give you my gun when you pry it from my cold, dead hands.” Kind of violent, right? Or another quote from LaPierre at his press conference: “The only thing that stops a bad guy with a gun is a good guy with a gun.” How is that statement less glorifying of violence than Killzone? Or remember Sharron Angle and her call for “2nd amendment remedies” to government decisions she didn’t like? That sounds like a glorification of violence to me. How about the claim of gun rights activists that the 2nd amendment is all about fighting tyranny by enabling armed revolt against the government; that too sounds like promoting violence against laws you don’t like. [Note: I am not versed in the history of the 2nd amendment, so I have no idea if this claim is right or not; I only point out that it tends to be expressed in a way that glorifies violence.]

There is a ton to be said about gun rights and gun safety, and I’m not saying any of it, although I did recently point out that stupidity + guns = badness. But I am saying that if you want to talk about how culture breeds violence, you better be careful about your own words, because they too might breed violence. In my opinion, the gun crowd isn’t careful, and their cultural contributions are as violence-tinged as anything out of Hollywood.

By the way, has anyone ever noticed that LaPierre sort of looks like the villain in Raiders of the Lost Ark, right before the Ark of the Covenant melts his face away?

Wayne LaPierre

Villain With Face Melting

Why is Health Care So Expensive?

According to Steven Brill, whose 26,000 word article in Time is getting all kinds of attention, one big factor is price negotiation. An uninsured patient can’t negotiate at all, so they get charged $1.50 for a single Tylenol in a hospital. Insurance companies negotiate on their customers’ behalf, so they get charged less. And Medicare, which is the biggest player of all, negotiates hard — volume discounts and all, just like any big customer anywhere in the world — and thus pays the least for the same products and procedures.

Interestingly, Brill steps away from one obvious solution — have Medicare cover everyone — because he says it will leave doctors underpaid. Felix Salmon takes him to task for this, pointing out that Brill never states what “underpaid” is. Since my greedy doctor post remains my most read and commented of all time, I feel a certain obligation to chime in here. I have never seen any analysis that tries to show what doctors might get paid in an all-Medicare system. Maybe it would be pretty low; if GPs maxed out at $50,000 per year, few people would spend all that money and time on medical school. But maybe doctors would still get paid what they do now, and it would be hospital administrators (whose multi-million dollar salaries are the true villains in Brill’s piece) getting a pay cut. Or maybe it would be CEOs of drug companies getting paid less; who would complain about fewer $78 million severance packages being paid to CEOs?

You can read more commentary regarding Brill’s article here and here.

Emerging Diseases and Forest Fires

I have been interested in emerging diseases ever since reading The Hot Zone by Richard Preston, which was so intense that it kept me awake during an entire overnight train ride from Boston to Washington DC. So I was very psyched at Christmas to receive a copy of Spillover by David Quammen. I just finished it, and know a lot more now about how diseases jump from animals to humans.

Scary cover, great book!

Quammen uses the book to explore theories about why there seem to be more frequent incidents of humans being infected by animal diseases (think SARS, Ebola, Hendra, Avian Flu, etc.). One of the theories he discusses concerns how increased human development is breaking up large contiguous ecosystems into smaller ecosystems separated by cities, farms, etc.

For example, a large forest might be full of bats that could be carriers of some nasty virus. This forest contains a metapopulation of bats, or a series of smaller populations that meet and mingle at their edges. In a metapopulation an infection is likely to be constantly present, but at a low level of incidence. If each smaller population becomes isolated, however, that population will likely go through a boom and bust cycle of infection, with periodic epidemics infecting most members of the population.

If that highly infected population runs into humans, there is increased likelihood of the infection passing to the humans. In other words, if 90% of bats are infected, then there is a higher probability of bats passing their disease to humans than if only 10% of bats are infected.
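To make that intuition concrete, here is a toy calculation (my own illustration, not from Quammen’s book, with made-up numbers): if a fraction p of bats is infected and a person has n independent bat encounters, the chance that at least one encounter is with an infected bat is 1 − (1 − p)^n.

```python
# Toy spillover-risk sketch. The prevalence figures (10% vs 90%) and the
# encounter count are hypothetical, chosen only to illustrate the argument.

def spillover_risk(p, n):
    """Probability that at least one of n independent encounters
    is with an infected animal, given infection prevalence p."""
    return 1 - (1 - p) ** n

low = spillover_risk(0.10, 2)   # steady low-level metapopulation infection
high = spillover_risk(0.90, 2)  # boom-phase epidemic in an isolated patch

print(f"10% prevalence, 2 encounters: {low:.2f}")
print(f"90% prevalence, 2 encounters: {high:.2f}")
```

Even with only a couple of encounters, the boom-phase population is roughly five times as likely to expose a human, which is the whole point of the fragmentation theory.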

As development has broken up formerly vast forests into smaller forest segments surrounded by cities and suburbs, we have seen metapopulations of natural disease reservoirs (bats, rats, mice, etc.) broken up into the isolated populations that are more likely to transfer diseases. Hence the increasing number of obscure infections jumping into humans.

What struck me about this theory is the parallel to forest fires. Current wildfire thinking holds that if you put out wildfires, fuel loads will build up and eventually you will get a catastrophic fire that can’t be controlled (like the 2002 Biscuit Fire in Oregon, which burned nearly 500,000 acres; I drove through the edge of that fire, and the smoke turned day into night). But if you let natural fires burn, they will clear out the fuel load and not turn into conflagrations.

So you can have small, more frequent fires, or rare, catastrophic fires. Much like you can have frequent, low levels of infection in your animals, or rare, but massive levels of infection. And in both cases, human intervention in the environment is what moves things from low-level balance to a high-level cyclic system.

Spending Too Much on Brand Names; BMW, Coach, etc.

Interesting that it’s a car webzine (thetruthaboutcars.com) that has written the best commentary I’ve seen on the trend of the past few years in which young people have been spending well beyond their means on brand-name cars, purses, clothes and other consumer products. There was a time when buying a BMW, or an Armani suit, or $1,000 purses and shoes, was something done by people in their 40s and 50s, who had been well paid for decades. Now 25-year-old PR account executives making $40,000 are buying Jimmy Choos and putting them on their credit cards. Or as the article says, a few years ago “the idea of spending four figures on a handbag when one worked at an entry-level white collar job would have been seen as irresponsible and reckless at worst, crass at best.” The pre-financial crisis debt binge wasn’t just about mortgages. People were overspending on all kinds of goods, and they still are.