Why I Do Not Have a Smartphone.

I am finding that, lately, I have had to defend my choice of telephone. This strikes me as strange, as my telephone does exactly what I want it to do: it places and receives phone calls anywhere in America.

That it does this for about $35 per month seems like a good deal to me.

And here’s what I really like. My telephone does more than just handle phone calls. It can also send and receive text messages, which are growing on me as a legitimate form of communication. It features voicemail, which is quite a bit more affordable than hiring a secretary to handle similar message-taking duties, and it has a few spiffy additional features, such as a tip calculator and an alarm clock. The calendar function is especially useful, and can be used for such purposes as finding out what day it is.

Again, I thought this was a lot for a cell phone to offer.

The good folks at CNET had this to say about my phone:

“In any case, the SGH-A137 isn’t too much to get excited about. The simple flip phone is so basic that it doesn’t even offer an external display.”

Oh.

What I am discovering is that my colleagues agree with the editors at CNET. They tell me that more than my clothes, or my choice of automobile, or my chosen profession, my phone indicates what kind of human I am.

I thought my phone indicated that I was both sensible and uncomplicated.

Not even close.

Not even in a million, billion years. Just NO, Dan.

What my phone apparently signals to others is that I am, at best, uncool, and at worst, a lost cause. There is only one remedy for someone like me:

A smartphone.

A smartphone like the iPhone, or the BlackBerry, or something that runs on Android. A phone capable of listening to a song on the radio, determining what song it is and then automatically downloading said song to the phone’s very hard drive. A phone capable of taking a photo and then rendering it in sepia. A phone capable of booking reservations at a nearby restaurant, testing my food for any toxins, chewing my food, paying my bill, getting me a taxi home, tucking me into bed and telling me a bedtime story.

A phone like that, or something.

I do not like the sound of that. Not at all.

See, I have a very simple mind. I actually like dividing up my devices into specific silos. I like reading on my Kindle. I like typing on my laptop. I like rocking out on my iPod. I like calling on my phone. This system works nicely for me.

And I suppose that, yes, I could get a single device that would allow me to do all of those things. But I type slowly on a phone. I don’t like reading something on a three-inch screen. I like going for a jog and not having my music device start vibrating from an incoming text.

More than anything, I love disappearing. When I am at my computer, I respond to email. I write. I’m busy.

But away from that screen? I shut down. Work ends. I go out, and I enjoy life in this rather nice world of ours. If you need me, call me. I’ll pick up. But that email of yours will have to wait.

Give me a smartphone and I’d be in a state of perpetual Google. I’d be walking down the street and see a Curly W hat and ask myself, Who was it who hit third for the Nationals in 2006?, and then I’d lose myself in the lifetime statistics of Jose Vidro, and then I’d pore over numbers on Baseball Reference, and then I’d find myself wondering what just happened to the previous 35 minutes. I know, because this is what happens at work. I take a thought, and connect it to another, and another, and then the time just disappears. I am good at wasting time, and on a smartphone, I would waste an awful lot of it.

My current phone? I don’t get lost in it. I make my call. I send my text. I move on. I leave myself time to stop and stare.

It is a phone that allows me to focus completely on what I am doing.

Of course, now that I’ve said all that: I’m going to get lost on the way downtown tonight. I’m going to need directions. A song will come on the radio, and I’ll want to download it. I’ll forget to make reservations at the place I’m headed. I’ll see something that demands to be sepia-ized. I’ll have an urgent email to send out.

And I’ll understand why everyone else has that thing in their pocket.

But me? No. Not yet. Not ever, I hope.

How Bon Jovi’s Subversive Smiley Face Would Go Viral in 2011.

I got a Bon Jovi song stuck in my head the other day. The song was “Have a Nice Day,” the title track from the band’s 2005 album. It’s got all the Bon Jovi hallmarks: those familiar power chords, Richie Sambora playing a double-necked guitar and multiple lyrics about “living my life.” All it’s missing is that signature “wah-wah” guitar riff.

But the music video for the song got me thinking about how viral campaigns work. The video starts off with Jon Bon Jovi outside of one of his concerts. A fan hands him a copy of the CD, and the singer grabs a Sharpie and draws this little doodle.

Then the fan pulls out his cell phone, takes a photo of the doodle and sends it to someone. And from there, the subversive smiley face goes viral. It’s plastered on mailboxes and billboards, tattooed onto arms and lower backs, and even cut into a corn field.

But step back a second. Let’s see where this all starts in the video.

It starts with that. With a picture taken on an old-fashioned, non-flip, Sprint cell phone. Not a Blackberry. Not an iPhone. A phone that retails today for less than $20.

Let’s put this Bon Jovi video campaign in perspective. The song came out in August 2005. The iPhone wouldn’t be released until June 2007. Twitter wouldn’t launch for another year, and wouldn’t gain popularity for another three years. Facebook was still limited to college students only, and those with accounts could only post one photo — their profile photo.

So this within-a-video viral campaign — one from a song that’s only six years old — is almost comically antiquated.

How would Bon Jovi’s smiley face go viral today? Probably like this.

❡❡❡

Bon Jovi draws the image on a fan’s CD. The fan whips out his iPhone and Twitpics it. Then, even though it’s a doodle, he Instagrams the image, because everything looks better in sepia.

The Twitpic gets a little bit of traction at first — a retweet here, a retweet there. Someone mass @-replies the message to celebrities. @kimkardashian makes the image her profile pic.

Soon, the smiley has its own Facebook page — Can this smiley face get more fans than the Jonas Brothers?

Then it gets its own Twitter account — @SubversiveSmiley, along with dozens of impostor accounts. (@FakeSubversiveSmiley, @SubversiveFrowny, @SubversiveSmileyGlobalPR, among others.)

(The Twitter account is later republished in book form, and makes the New York Times best-seller list. The CBS sitcom based on the tweets — “Have a Nice Day” starring John Stamos as a stuck-in-the-80s Jersey dad trying to make good — gets cancelled after the third episode.)

4chan launches a meme — #icanhazsmiley — and then the Cheezburger Network launches a site devoted to sneaking the smiley face into famous photos, “Where’s Waldo” style.

HuffPo publishes a photo gallery of 21 famous smiley faces, which, though mostly inane, draws 11 million page views.

@KanyeWest retweets the initial image with the hashtag #SWAG, and announces his next album will be called “Show Me How to Smiley.”

The image jumps the shark.

Two weeks later, Bon Jovi’s album, “Have a Nice Day,” finally hits iTunes. Fans are confused as to why Bon Jovi’s album features an image that’s so last week.

What the Death of News Cycles Really Means For Most Humans.

A week ago, Mizzou’s men’s basketball coach, Mike Anderson, left to take the same position at Arkansas. And in the past week, there’s been a lot of speculation about who will become my alma mater’s new head coach. Mizzou went hard after Purdue’s coach, Matt Painter. Today, it looked like MU was going to sign him to a contract.

I was following it all on Twitter. I had a column up in TweetDeck delivering every tweet related to Painter. They filed in, sometimes by the second. When the St. Louis Post-Dispatch reported that Painter had agreed to sign with Mizzou, Tiger fans started celebrating. Purdue fans, meanwhile, were pissed. When KOMU-TV in Columbia said the deal was 100% done, things got even more charged. Tweets were tweeted that I wouldn’t want to republish here.

And then, in 20 minutes, it all changed. One Indianapolis outlet reported Painter was staying. Then ESPN said so. Then CBS and FOX Sports. Then Purdue announced, officially, that the contract was done.

The tweets turned around. The Purdue fans were relieved. The Tiger fans were pissed.

After it was all over, I started thinking about a friend of mine, who was on a flight from Chicago to D.C. this afternoon. That’s a two-and-a-half-hour flight. In the time between takeoff and landing, he missed an entire stream of emotions and news. While he was in the air, the story went one direction, then did a 180 and went the other. The life cycle of the story started and ended in less time than wheels up to wheels down. When he landed, the story was already over. Like, over. Dead. Forgotten. By tomorrow, outside of Columbia, Mo., and West Lafayette, Ind., nobody will pay any attention to what’s just happened. The news will be less than 12 hours old, with emphasis on the old.

❡❡❡

So here’s a thought. It’s not scary or frightening or dangerous to our democracy. But I think it’s something worth considering.

It’s this: We don’t have news cycles anymore. We used to. We had news cycles where topics dominated the news and then faded out in favor of other topics. We had news cycles that lasted long enough for the public to learn about the topics of the day and make decisions about them. We had news cycles where what was in Tuesday’s Washington Post was probably still headline news on Sunday’s “Meet the Press.”

We don’t have that anymore. But we did, as recently as a decade ago.

I know, because, well, TV told me so. I was just watching a “West Wing” episode — Season 1, Episode 21: “Lies, Damn Lies, and Statistics.” It aired on May 10, 2000. In it, Rob Lowe’s character, Sam Seaborn, is photographed by paparazzi late at night while giving a graduation gift to a friend. The friend happens to be a call girl, and Sam’s a speechwriter for the President. Sam doesn’t see the paparazzi, but he does see a car rush away from the scene, and he’s suspicious. He’s worried about what a photo could do to the President’s public image. He calls C.J. Cregg, the President’s press secretary, to tell her what he’s seen.

Here’s the conversation that ensues the following morning between Leo McGarry, the President’s chief of staff, and C.J.:

LEO: How do you not tell me until this morning?

C.J.: Leo…

LEO: How do you not call me last night?

C.J.: We didn’t know anything last night.

LEO: Sam called you.

C.J.: That’s right. He met the girl and saw a suspicious car. I’m not going to call up the White House Chief of Staff in the middle of the night because someone started a car.

LEO: C.J., if it was…

C.J.: I was handling it, Leo. It took me three hours to confirm there was a picture, and another hour to find out who has it.

LEO: Who has it?

C.J.: The London Daily Mirror. They paid a waitress friend of hers $50,000 to set it up and confirm that she was a call girl.

LEO: When is it running?

C.J.: It’ll run later today. American press has it tomorrow morning.

In May of 2000, that was a realistic conversation. It wouldn’t be today. The obvious thing is that once the British paper got the photo, they wouldn’t be waiting for the presses. They’d have the photo online, and then everyone would have the photo. You’d wake up and it’d be staring back at you from your Facebook news feed.

There’s one other thing that wouldn’t happen today: If the President’s press secretary was lucky enough to find out in advance about scandalous news — say, if a USDA executive made controversial, on-the-record remarks — the White House would barely be ahead of the news cycle. But mostly, the news cycle is ahead of the actual newsmakers. Something is said, something is known, the public learns of it, the public renders its verdict on the news, and perhaps only then would the C.J. Creggs of the world have a chance to comment on it. The story is revealed in parts, often haltingly, and often without all the details. By the time the full story surfaces, the news cycle is already over.

❡❡❡

So, no, we don’t have news cycles anymore. We have moments. They start and they end faster than we can even process. A government’s overthrown in Egypt; we watch, and we forget. Japan’s hit by a tsunami, and it’s out of the news two weeks later. Libya’s being bombed, Iraq and Afghanistan are still at war, Sudan’s splitting apart, the economy’s slumping, the Chinese are doing God-knows-what with our money, the price of oil is rising, the dollar is falling, the cherry blossoms are blooming and the Nationals still don’t have an Opening Day starter. All moments. There are all these moments happening around us, all in real time, and we’re able to actually watch them pass and disappear behind us. You can sit there at your computer screen and actually watch the moments pass, in one eye and right out of sight.

I know, because today, I sat with a TweetDeck column open for the words “Matt Painter,” and I watched them pass.

It’s sad that that “West Wing” episode is hopelessly antiquated, because it’s only a decade old. Here’s a better example for our modern news cycle. It’s actually a quote from “Top Gun.” It’s from the dogfight scene at the end of the movie. Tom Cruise has just taken off from the flight deck in the Indian Ocean. Val Kilmer’s going one-on-five versus the Russian MiGs. The captain of the ship wants to launch additional planes into battle. And here’s what he’s told:

Officer: Both catapults are broken, sir.
Stinger: How long will it take?
Officer: It’ll take 10 minutes.
Stinger: Bullshit, 10 minutes! This thing will be over in two minutes! Get on it!

In Internet time, hours feel like days, and days feel like weeks. The web isn’t killing our brains, but it is killing our internal clocks. When the world is on demand, anything less than instantaneous feels like an eternity.

That’s what we’re up against today. It used to be that there was no time like the present. No longer. Today, there’s only time like the present. If it’s not happening now, it’s barely happening at all.

What we really need to learn is patience. But where will we find the time?

The Three Stages of a News Start-Up.

I’ve been spending my week down in St. Petersburg, Fla., at the Poynter Institute. The theme of the week: entrepreneurial journalism. And after seeing case study after case study about successful journalism start-ups, I’m starting to see three common areas of overlap during the initial start-up process.

Those areas are:

Conceptualization → Validation → Realization

To break it down a bit further: the ends are the easy parts. Conceptualization: Man gets idea for business. Realization: Man makes business legitimate.

It’s the middle part — validation — that’s tricky. That’s the part where I’m hearing stories about what Seth Godin called ‘the Dip.’ It’s the part where a start-up is still trying to decide whether its business is feasible, and where it’s going through a massive period of self-doubt about the business’s chances for success.

But there are a few sources of validation that can convince a start-up to keep pushing forward. The three that seem to be on repeat:

Validation (or approval) from:
-The audience
-Investors (foundations/angels/VCs/donors)
-Other media (buzz about company/product)

It seems to be — and this is obviously a ‘duh!’ moment — that the companies that make it from concept → reality get enough validation to convince them that it’s worth pushing through the Dip. It’s one thing to believe in your own idea. It’s another to hear from outsiders that the idea is one worth believing in.

Because without that validation, it’s almost impossible for a start-up to go from concept to reality.

(photo at top from South Park’s Underwear Gnomes episode.)

These Things I Know To Be True.

Jorge Chávez International Airport is not a fun place to be, especially after midnight when you’re leaving Peru but your flight back to Houston has been delayed yet again. But my delay at Lima’s airport gave me a few minutes to reflect on my recent trip abroad, and especially on a few things that I very much know to be true.

  1. A country cannot be truly free until its people can print out airline boarding passes from home.
  2. If my mother starts running at the sight of someone, you should start running too.
  3. Wherever you are, the drivers are worse than wherever you just were.
  4. There is nothing more arbitrary in this world than airport taxes.
  5. If you are on a historical tour, and your tour guide is not speaking in his/her native language, the truth will become slightly more malleable.
  6. It is difficult to trust anyone who packs more than 50 lbs. of luggage for a vacation to anywhere short of Antarctica.
  7. The same holds true for those who refuse to turn off their phones in the middle of the Amazon rainforest.
  8. The number of crying children on your plane varies directly with the length of your flight.
  9. It actually kind of helps to smile while you’re getting screwed.
  10. Luxury is a very, very relative term.

Why We Need to Change the Concept of Time — Immediately.

Today is my birthday, and my annual reminder of how much I dislike the concept of time.

Truth is, time is unfair. When I see someone wearing a watch, I don’t see someone with punctuality in mind. I see someone slowly counting down the seconds until the grave.

What is a day, after all? It’s a very strange segment [1. Do we divide anything else by 365?] of a larger year, which we define as the time it takes for the Earth to circle the sun. And if you’re like me, you can’t get enough reminders that the Earth is hurtling blindly at 67,000 miles per hour through a vast and unknowable universe.

Point is, I’m just not a fan of time, especially on birthdays, when it serves to remind me that I’m getting both older and no closer to figuring anything out. Human years are so scarce; if we’re lucky, we get 70 or 80 years to live, and that just doesn’t seem like enough.

What I wish is that there were a way to make time seem less scarce [2. This, in itself, is a pretty strange thought, because time is infinite. What I really mean is that I want to make my available time seem less scarce, though I’m not sure when I became so possessive about it.]. So I’m proposing here, on May 16, 2010, that we adjust our notions of time.

The human attention span is, depending on which Google source you trust, somewhere between five seconds and 20 minutes. 150 years ago, the Lincoln-Douglas debates ran about three hours at a time. Today, those debates would be reduced to mere soundbites, because our attention spans are shrinking. Who has time for three hours of political discourse? Hell, who has time for any political debate involving more than a few bullet points?

In 2010, we have more distractions than ever, and we’re as easily distracted as ever.

But if that’s the case, then why are our standard units of time not adjusting to our shorter attention spans?

Let me put this another way: when Andrew Carnegie died, he was worth $475 million. But $475 million in 1919 isn’t the same as $475 million today. Luckily, we’ve got tools to compare the 1919 dollar to today’s dollar. [3. In today’s money, Carnegie’s fortune would be in the billions.]
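Here’s a minimal sketch of how that kind of comparison works, in Python. The CPI figures are approximate BLS CPI-U annual averages — my assumption, not numbers cited in this post:

    # Adjust Carnegie's 1919 fortune into 2010 dollars via the
    # consumer price index. CPI values below are approximate
    # annual averages (assumed; see BLS data for exact figures).
    CPI_1919 = 17.3
    CPI_2010 = 218.0

    carnegie_1919 = 475_000_000  # his fortune, per the post

    carnegie_2010 = carnegie_1919 * (CPI_2010 / CPI_1919)
    print(f"roughly ${carnegie_2010 / 1e9:.1f} billion")  # ~$6 billion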

We adjust to each age. When humans got taller, we raised the height of our doors. When people got fatter, we widened the space in the supermarket aisles. A news cycle used to last months. Now it lasts hours. We constantly recalibrate to what’s happening now.

But we still allow time to remain constant. I don’t think that’s fair.

If a piece of paper can become more or less valuable over the course of time, then, well, why can’t time, too?

The best part is, there’s some precedent for this.

Abraham lived to be 175. His wife, Sarah, gave birth at 90 and lived to 127. Biblical time clearly didn’t use our rigid time structure.

So what’s stopping us [4. Besides common sense] from altering our concept of time? I’m okay with seconds and minutes and hours — anything that can measure the length of a YouTube video seems like it should stay — and I’ve got no problem with sun-up-to-sun-down days. But why shouldn’t we alter our concept of years? Does any modern human have the capacity to actually pay attention to something for an entire year?

How’s this sound: let’s decree that each season counts as a year. The modern calendar year gets split into four, so today, I’d have just turned 92 — and I’d still be entering my prime.
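The arithmetic, for what it’s worth (working backward, turning 92 in season-years means I’ve just turned 23 in calendar years):

\[ \text{age in season-years} = 4 \times \text{age in calendar years}, \qquad 4 \times 23 = 92. \]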

What happens to everyone else?

Dick Clark becomes four times as valuable. Gyms get four times the number of resolution-related membership drives. Champagne companies see four times the rate of sales.

And best yet: wasting a year doesn’t seem like that big of a deal, because there are so many more of them to waste. Sure, it’s just a way to trick the brain into believing that we’re not screwing around as much as we really are.

But it’s working.

Our concept of time is changing. It’s time to make it official, I think.

When I die, I want my rabbi to be able to say, “Here lies Dan Oshinsky, who died at the age of 375.” The crowd will nod appreciatively. Honestly, who’d believe that a man so old could have accomplished so little?

You Are Not a Supernetworker. (Sorry.)


About a month ago, I started writing a blog post that I never finished. It was about Dunbar’s Number, which explains a simple human limitation: we can only really care about so many people. Dunbar puts a limit on it: 150.

But thanks to Facebook and Twitter, we’re more easily connected to others than ever before. You don’t need a giant Rolodex anymore, just an active news feed and the latest version of TweetDeck. So I started wondering: I’ve got a few hundred Facebook fans and a few hundred Twitter followers. And that’s on top of my normal, Dunbar-defined circle.

I may not be a Supertasker, but could I be some sort of Supernetworker?

The résumé-deflating answer I came up with was: No, I’m not a Supernetworker, and neither are you. See, Dunbar’s theory describes circles, starting with your innermost circle of friends and expanding until you reach that outer circle of passive acquaintances.

Think of it this way: the inner circles will end up at your wedding. The outer circles might get a Christmas card (or maybe a Facebook birthday wall post). Social networking might bring you a few hundred or a few thousand additional connections, but the majority will remain in that outer circle — or beyond.

The irony is, you might engage them regularly — but you can’t really care about them on the level that Dunbar’s describing. [1. The closest thing I’ve heard of to a Supernetworker is Politico’s Mike Allen — who the New York Times describes as a one-man networking machine. He engages a huge network of contacts on a regular basis. But his closest friends also apparently don’t even know where Allen lives. So I’m not sure he’s the healthiest example of a normal human.]

But I was hugely impressed to see a media outlet finally discuss the ramifications of social networking for Dunbar’s Number. It came in a Guardian piece that actually asked Robin Dunbar what he thought of his number’s role in the world of social networks.

The piece’s writer asked Dunbar if he saw anything in the evolution of online networks to suggest that the next stage might extend our social horizons in any meaningful way.

“The question really is,” he said, “does the technology open up the quality of your social interaction to any great extent, and the answer to that question is, so far: not really.”

Exactly. But that doesn’t mean these connections are worthless. As Clay Shirky points out in the same piece:

“What these games and applications do,” he says, “is extend and churn the edges of our network, which is often how new ideas are brought into it.”

So add those friends on Facebook. Connect with others on Twitter. They probably won’t be coming to your wedding, and they might not even end up on your Christmas card list.

But if you’re smart, those fringe circles might just help you create something that your circle of 150 never would have thought of.

You don’t have to be a Supernetworker. You just have to be a good listener.

Why I Have Clearly Not Asked for God’s Help While Blogging.

What follows is a brief thought about the nature of God. It is not a serious thought. I hope you do not find it blasphemous. — Dan

I have recently begun to consider the idea that if there is a God, he is probably not very good at multitasking.

I’ll direct you to this recent study, which suggested the existence of a rare group of humans known as “supertaskers.” They’re not just capable of multitasking; they actually perform better when doing so.

About 1 in 40 — or 2.5 percent of humans — have such skills, the study found.

But these guys are the outliers. Which brought me to an unusual thought: man was created in God’s image, or so certain books suggest. But if 97.5 percent of mankind is incapable of properly multitasking, then by the transitive property, can’t we assume that God probably isn’t a very good multitasker either?

Which brings me to another thought: if God is present in every aspect of our lives — and certainly, there seem to be more than a handful of athletes who believe in God’s willingness to take part in a post pattern — how does he juggle it all if he’s so average at multitasking?

¶¶¶

I put the question to a friend of mine today. We were on the front nine of some hacker course in Austin, Texas, and my friend was working on a precision slice that usually isn’t found outside a 10-piece knife set infomercial.

“Oh, Jesus,” he said after knocking consecutive shots into the pond.

“You sure you want him here for this?” I asked.

I gave my friend the rundown. Look: God’s a busy guy. He’s trying to balance the cosmos. His divinity might not even be able to solve the matter of Inbox Zero. He doesn’t care about your short game, and he probably shouldn’t.

“So?” my friend said.

Well, let’s suppose that God spent most of his time just watching over humanity, I said. But in a very limited way, he’d take an interest in you. You’d get to choose one aspect of your life, one thing that you do regularly, and God would play a role in it. You wouldn’t be superhuman in that one thing. But you’d know that when you took on that task, you’d have a bit of divine protection.

“So God could be present on the golf course?”

Yes.

“Or when I play Facebook Scrabble?”

You’d be wasting it, but yeah, sure.

“Or in the bedroom?”

You got a girl you’re trying to impress?

And that’s when it really began.

¶¶¶

The immediate instinct, under this God-as-a-Genie-with-one-wish-to-grant concept, is to go for something big. Ask God to keep watch when you’re playing poker. Or when you’re shooting those championship-winning free throws. Or when you’re looking for luck with the ladies. Ask for one of these, and you’re asking for God to give you house money to play with in Vegas.

But then there’s a secondary thought: What if you could better use your divine intervention? I’m talking about the kind of intervention that gets tossed around at Sunday School: Dear God, help me find courage. Dear God, help me comfort the sick. Dear God, please make me sick so I can leave this sermon early.

And then there’s the last thought: What if you could take it just a little bit further? If God can’t be present in every little thing you do, why not just choose one little thing that you do every day?

What if God could be present during your rush hour commute? (Finally, a practical reason to have a “God is my co-pilot” bumper sticker.) What if God could keep you engaged during those dull moments in your day? (When waiting in a dentist’s office, God could deliver the manna that is Men’s Health magazine.) What if God could help you be on time for meetings? (He might be a watchmaker anyway.) Why not ask God to be present in the kitchen? (Just smile and nod when someone tells you, “These fudge brownies are just heavenly.”)

¶¶¶

I’m not saying this theory of divine assistance is for everyone. What I am saying is this:

The next time you’re 130 yards out and deciding between a 9-iron and pitching wedge, ask yourself whether or not you really want the Almighty as your caddy. Besides, he might be able to spot a triple-word score in Facebook Scrabble that you’d never be able to see.

My Scoreboard.

Soon, I found myself keeping score. About to graduate, aimless, preparing for joblessness and possessing a degree worth about as much as the paper it was printed on, I realized — belatedly — that I wasn’t exactly a modern guarantee of potential.

I started searching for something tangible, something worthwhile to get me through my remaining months at school. I’m a college basketball obsessive, so it’s no surprise that the end of the NCAA Tournament had something to do with it. With the games over, I felt a sense of emptiness. During the Tournament, a one-too-many-beers promise to follow a favorite team had suddenly turned into a road trip. (Dude, we’re going to Phoenix!) I had goals and aspirations and dreams. Most importantly, I had more games to watch.

But the Tournament ended, my team lost, Phoenix turned out to be a hell of a drive — who knew? — and I was facing the unthinkable: graduation. So it came to be that out of a month of non-stop basketball watching, I started keeping score.

It was innocent enough at first. I decided that I’d make up goals to distract me from my life as a writer of failed cover letters. These daily goals were my way of staying sane, of finding blips of success hidden amongst routine.

I started with a small one: every day, make someone laugh really hard. I wasn’t going to make milk come out of anyone’s nose — you’d be surprised how rarely one sees college students consuming milk in public — but I could try. Do it once a day, and I could enjoy the scoreboard at the end of the night: Dan 1, Failure 0.

I liked coming out on the winning end so much that I added more categories to my day. The points started trending upwards, the scoreboard spinning like an odometer on a cross-country trip (to, please God, anywhere other than Phoenix). Being thankful for little things wasn’t hard; I could rack up a dozen points a day doing that. Being punctual was even easier. Soon, I was running up the score. 5-0, 10-0, 20-0.

It only got worse from there. I had started out seeking moral victories and joy in day-to-day moments, but the high from those little wins faded faster with each day. I craved even bigger wins.

In one day, I decided to start being more spontaneous and to start speaking Spanish more often. But I abused the system. Getting a haircut at a barbershop run by Spanish-speakers and discussing mullets fit both requirements. Or: Look! I’m ordering a chalupa without sour cream!

I decided to stop skipping breakfast, and I was earning easy points there, until I decided that I wanted to start sleeping later, which meant that I wasn’t waking up early to eat breakfast anymore. But the scoreboard took no notice. I’d only ever created one rule: complete the category and earn the point. There was no penalty for breaking the rules, because there really weren’t any rules.

The points piled on. I had created my own metrics for success, and by my own best standards, I’d become wildly successful.

With so many paths to success, I’d guaranteed myself blowout victories with each new day. I’d been giving myself points for reading books, for creating esoteric theories, for watching new movies, for blogging, for napping — all at odds! — but the scores kept going up, and it didn’t matter how hollow my victories had become. I found myself saying odd things in the morning, like, “Right here, in this moment, this is where the day will be won.” When had I started talking like a knight of the Round Table? When had I become obsessed with winning?

Then graduation came closer — first weeks away, then days, then looking back as I crossed the dais — and I wasn’t any closer to getting a job. But I’d still been finishing my day completely convinced that I’d spent it well. I was a success, but only in a world in which I controlled the definition of success.

A few weeks after graduation, I was lucky enough to take a job that I actually wanted. Everyone wanted to know: how much money would I be making? In a world where success can’t be easily measured, salary seems like the simplest way to understand value. But I’m not sure that’s what really constitutes success.

I’d like to think my daily scorekeeping — at least my initial efforts — came close to defining two key measures of success: chasing ambition and building a better community (one in which, I’d hope, success can be further nurtured). But I’ve started to realize that we can’t attach a number to success, and we probably shouldn’t try to.

So I’ve stopped keeping score. When I make a friend laugh, I’m not declaring it a personal victory. Happiness isn’t tied to some internalized competition. I’m not winning, but I feel sane.

Though part of me still thinks that I’d need a scoreboard to know for sure.

A Eureka! Moment: Why I Only Have Good Ideas When Tiny Scraps of Paper Are Around.

The revelation came to me in the moments before sleep, and I went searching for something to scribble it down on. All I could find was a small envelope on my kitchen table.

But what else could I be expected to write on in such a moment?

What hit me last night, what pulled me out of bed and sent me searching for any scrap of paper, was a simple truth: I only have good ideas when there’s barely anything around to write on.

I have owned dry erase boards that I’ve never used, oversized notepads that stayed blank and binders that held nothing.

But I’ve captured eureka! moments on cocktail napkins, scribbled genius ideas in the margins of newspaper columns and on business cards. I’ve rarely had success carrying around a notebook, with one exception: in the summer of 2008, when I had this bound, 3” x 2” pad that I covered every inch of with tiny thought bursts during my travels in China.

The more I consider it, the more the words jotted down last night on the back side of that envelope ring true: “The profundity of an idea varies in inverse proportion to the size of the paper it’s written on.”
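Or, in notation (symbols mine — P for the profundity of the idea, A for the area of the paper it’s written on):

\[ P \propto \frac{1}{A} \]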

[Graph: eureka moments, graphed — idea profundity vs. paper size.]

Or, in words: the smaller (and stranger) the thing I’m writing on, the greater the eureka being written. [1. This may explain why I’ve jotted down great ideas on the inside of a paper towel roll but never on an actual, oversized paper towel.]

I’ve always kept these big legal pads around for the moments in which I’d need to fully flesh out an idea. But maybe it’s that a confined space — forced brevity! — is the key to innovation.

Shouldn’t the best ideas be jotted down in their most basic form first before being carefully considered and expanded upon? Isn’t it only fair to let a spark turn into a slow burn, to let brief moments of genius turn into something of scale?

This is the kind of revelation that could force a change in lifestyle. I’ve started thinking about getting rid of all the big legal pads around my apartment. With the money saved, I could head to a local paper store instead and buy a stack of customized cocktail napkins. (“From the Desk of Dan Oshinsky,” they’ll read.)

That’s just one idea; I still haven’t decided what the next step is. But I’m not too worried. I picked up a tiny green receipt from a parking garage the other day. It couldn’t be more than an inch tall and two inches wide. I guess I’ll just have to keep it around and wait for inspiration to strike.