
January 29, 2018 / Michael Yaroshefsky

3 things we got wrong during our first user testing

RocketVisor is launching our all-new Visor this spring.  The Visor lets marketing, sales, support, and success teams collaborate on accounts together.  It does this by adding thoughtful artificial intelligence to Google Chrome to anticipate your next action.  The first launch will focus on sales professionals.  This redesign comes after 18 months of learning through MVP iterations.

Alpha testers fell in love with our first app, Whiteboards, because of its simplicity. How do we maintain that simplicity while building an enterprise-grade account collaboration suite?

Our customers are busy and overwhelmed by tools.  Because the Visor adds to this stack, it must be easy to use.  Even though we are redesigning the product based on user feedback, it’s impossible for us to be accurate judges of the usability of something we designed ourselves.

The only accurate way to learn is to test the product with real users.  Prototyping software makes it possible to cheaply test certain parts of a product’s design before building it.  My senior thesis was about the psychology of UX design, and I studied the product design process in a Product Management 101 class.  But neither prepared me for how unexpected some of the real-world test results can be.

In some cases, the results seemed to contradict our interpretation of well-established usability principles. Here are a few learnings from our first two rounds of user testing:

1) Clever icons can reduce usability

Usability Principle:
Design products to use the least amount of text that gets most users to their intended goals.

Where it didn’t make sense:

Small, simple Visor apps are the main focus of the Visor.  They work because the Visor automatically understands which account a salesperson is working on.  When the Visor knows which account is active, the user can switch between a few lightweight apps with ease.

We decided to give each Visor app a clever name and cool icon.  This was a mistake.

Our real icons were going to incorporate the app name’s first letter.  For the prototypes, we used circular app icons with just that first letter of the app’s name.  In the image below these are the round icons for About, Whiteboard, Pings, Timeline, and Missions.

Before: We used circular icons to represent each app.

During our first round, testers were unclear on what the circular icons meant.  We’d ask them to find their whiteboard, and they’d generally go to the three-dot menu instead.  Once we explained to them what the W meant, the app icons made sense.

But that initial experience failed to convey the product’s main value.  Some users thought the app icons were other accounts they could switch to.

We considered adding hover-hints.  But that would likely be insufficient.  The names of the apps didn’t describe what they did simply enough.  Testers had a hard time guessing what we meant by “Pings” and “Missions.”

Before the next day of testing, we adjusted the prototypes.  We replaced the icons with simple app names.  We also changed from branded app names like “Whiteboards” and “Missions” to basic names like “Notes” and “Tasks.”

After: We ditched icons and clever names for the most self-descriptive words we could think of.

The results were immediately better.  The next set of test users quickly understood the main value of the product.  They also understood that they could switch between apps for a given account.  The tab-like structure of the apps  made more sense to them than the app bar.

Takeaway Number 1:
Icons can be great for well-understood concepts, like search.  Otherwise, stick to simple self-descriptive words.  At the end of the day, it’s better to be simple and direct than clever.

2) Don’t rely on text to guide users

Usability Principle:
Design products to use the least amount of text that gets most users to their intended goals.

Where it did make sense:

In one user story, we asked testers to invite a colleague who was not yet signed up for RocketVisor.  Testers ran into a critical roadblock with the first design because we relied too much on text to guide them.

They were asked to invite a colleague named “Marzha Baker.”  They found it easy to click the “Person with a Plus sign” icon to invite someone.  Then, when a tester typed ‘Marzh…’ and no results were found, the response said:

“No results.  Type their email address to add.”

Before: When no results were found for an invite search, the text suggested typing their full email address to add them.

Most testers saw just the “No results” part of the message and stopped reading.  They thought they couldn’t invite Marzha.  So about half of them immediately went for the X-button to close the invite dialog.

Because our product is a collaboration platform, inviting new users must be seamless.  Not finding a colleague as an existing user cannot be a dead end.

Between the user testing sessions, we put together a new design that made the “Invite via Email” action much easier.

After: We provided an explicit call to action button for inviting colleagues via email.

The results were immediately better.  Users in the second cohort immediately clicked this button to continue the user story to completion.

Takeaway Number 2:
Beware of unexpected “dead ends” in the product. Don’t rely on a set of text instructions.  Provide clear call-to-action buttons to avoid dead ends.

3) Simple words can have unintended meanings

Usability Principle:
Anticipate that users will make mistakes; make it easy to undo actions.

This principle also applies when the product is the one that made a mistake.  Our product has an artificial intelligence component that guesses what a user might do next.  When the confidence is high enough, it can even take that action for users.  Sometimes these guesses will not be correct, in which case the user should be able to easily undo the action.

Taking this quite literally, we designed a notification with an “Undo” function.

Before: We followed the usability principle and added an “Undo” button.

We were surprised to learn that testers were concerned about what would happen when they pressed a button that said “Undo.”

It was particularly confusing to them since they hadn’t actually done anything that was being undone.  It was the product that did something.

We also learned that testers associate the word “Undo” with losing some type of work they did.  Because it wasn’t clear what action of theirs was being undone, they were concerned about the impact.

For the next iteration, we changed the text to say “Return.”

After: We changed the wording to “Return.”

These tests went much more smoothly.  The testers knew exactly what to do and had no reservations clicking the button.

Takeaway Number 3:
User testing is as much about the copy as it is about the layout and positioning of UI elements.  When prototyping, pay attention to how testers interpret the meaning of your words.

Takeaway Number 4:
You won’t become a great product designer just by reading articles like these.

Usability design is far from a rules-based discipline.  Knowing the principles simply isn’t enough.  As we found during these tests, sometimes applying the principles will at first guide you in the wrong direction.  Testing is the only way to know for sure whether your design is intuitive.

No amount of expert-studying or blog-reading is a substitute for getting out there and testing designs on real people.  So what are you doing still reading?  Get out there and test something.

Like to change people’s lives by designing beautiful, useful products?  So do we.  And we’re hiring a full-time product designer in New York City.

April 22, 2017 / Michael Yaroshefsky

Leaving Footprints with Other People’s Feet

When Steve Jobs announced his medical leave in 2011, Google’s Larry Page asked if he could visit Jobs to get tips on how to be a good CEO.  According to Jobs:

“My first thought was, ‘Fuck you.’ But then I thought about it and realized that everybody helped me when I was young, from Bill Hewlett to the guy down the block who worked for HP.  So I called him back and said sure.”

Jobs, who was generally known for his egoism, recognized the importance of others in his success.

I came across this passage during a rereading of Walter Isaacson’s biography of Jobs for one of my HBS classes, and it seems particularly relevant in the middle of raising a Seed Round for RocketVisor.  Without any substantial business history to point to, investors at the seed stage are taking massive bets on the entrepreneurs and their visions.  For Apple, Mike Markkula’s guidance and $250,000 line of credit were a crucial break that let Jobs and Wozniak begin converting their project into a company.

RocketVisor could not have gotten to this point without the generous support of investors, advisers, team members, and even customers who believe.  There’s no doubt that, as with Apple, building great products is the surest path to success.  But doing so requires capital, advice, effort, and adoption by this village of support that forms around a business.

Keith Ferrazzi in Never Eat Alone observes the same:

“Ask any accomplished CEO or entrepreneur or professional how they achieved their success, and I guarantee you’ll hear very little business jargon.  What you will mostly hear about are the people who paved their way.”

Part of our success has been an enthusiastic cohort of product testers at Hubspot, and I like to visit them frequently at the office.  One day I walked past a wall in their office with this quote by one of Hubspot’s founders, Dharmesh Shah:


Dharmesh has a good point: it feels great to give back to those who believed in you by making them look smart for doing so.

But I think the message is bigger than that — or perhaps even the very opposite of what Dharmesh proposes.  This is what Jobs ultimately realized when he agreed to meet Larry Page.

Success is more than just making your believers look smart and closing the loop; it’s about continuing the cycle anew.  Success is enabling others to accomplish great things by believing in them and supporting them.

It reminds me of the final class of “High Tech Entrepreneurship” at Princeton, where Professor Ed Zschau delivered an emotional farewell rendition of Frank Sinatra’s My Way.  Scrawled on the chalkboard behind him was the following ethos:

“Leaving footprints with other people’s feet.”

April 15, 2017 / Michael Yaroshefsky

The AI Problem Nobody’s Talking About: The Interface

When IBM’s Deep Blue defeated Chess Grandmaster Garry Kasparov on May 11, 1997, Newsweek painted it as a historic loss for humanity.  They called Deep Blue “a supercomputer especially outfitted to whack the human race down a notch.”

Now almost two decades later, Kasparov narrates a more hopeful reflection in “Learning to Love Intelligent Machines” (WSJ paywall):

Machines that replace physical labor have allowed us to focus more on what makes us human: our minds. Intelligent machines will continue that process, taking over the more menial aspects of cognition and elevating our mental lives toward creativity, curiosity, beauty and joy. 

In contrast to bleaker prognostications about mass job destruction, I think Garry’s got it right.

Machine intelligence will complement how humans work — freeing us from menial, repetitive work — and allow humans to focus on the types of cognition we’re best at: creativity and social interactions.

However, while many entrepreneurs and investors are pouring resources into solving the data & algorithmic challenges of machine intelligence, most seem to be overlooking an even more critical problem:

How do we optimally pair machine & human intelligence?  

In a less famous but similarly significant freestyle chess match in 2005, two amateurs with three off-the-shelf HP laptops defeated grandmasters aided by supercomputers.  It was their ability to effectively interface with machine intelligence that allowed this dark-horse team to prevail over more capable humans allied with more powerful machines.  As Kasparov concluded in a 2010 piece:

Their skill at manipulating and “coaching” their computers to look very deeply into positions effectively counteracted the superior chess understanding of their grandmaster opponents and the greater computational power of other participants…

Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process.

You can have the most talented humans allied with the most powerful machine intelligence, but if they aren’t working well together, it doesn’t matter at all.

For machine intelligence: The process and interface matter more than human expertise or processing power

I find this challenge fascinating, because it requires the combination of three disciplines: operations research, psychology, and computer science. In 2012, my senior thesis at Princeton tackled exactly this problem.  I reviewed the published literature in each of these disciplines and defined principles that should guide the design of productivity applications.

I discovered psychology research on the science of attention, motivation, memory, and even visual perception that should influence how we design software experiences.  I applied these principles to build a simple todo list application.

After graduation, I became a member of Insight Venture Partners’ investment team.  While I became engrossed in the world of software investing, I couldn’t give up my interest in this human-machine interface challenge.  On nights and weekends I built a suite of browser-based technology to simplify my own daily workflow of looking for new investment opportunities.  The tools I built focused on improving process and automating what was best done by the computer.

This project became so critical to the firm that when I left for HBS and stopped maintaining it, a team was hired to recreate the parts of it they could figure out.  To my further surprise, friends and colleagues who had moved on to other firms were so impacted by the project that a number asked if I could build solutions for their new firms.  It suggested that this type of software — focused on process, guidance, and automation for knowledge work — had a clear market opportunity.

The first thing I looked for in my investments at Insight was massive TAM, and venture capital CRM software certainly didn’t meet that hurdle.  But taking on a similar problem at a much larger scale — for all knowledge work — clearly does.  And that became the charter for RocketVisor.

We’re Patiently Setting Up Our Chess Board

While we are solving critical process and user experience issues, we’re also setting up our chessboard to make moves when the underlying AI technology matures.  We’ll be building the UX layer that spans all SaaS applications, establishing deep customer relationships, understanding their business processes, and collecting a massive data training set.

The key for us will be knowing how and when to integrate components of intelligence.  As much as I want to believe in AI, I’m still disappointed in the so-called “intelligent bots” being built today.  The back-end technology just isn’t ready yet, and I really don’t see conversational UI panning out as the panacea for modern UX.  It’s just too early to make that move.

Bent Larsen, a Danish grandmaster renowned for his unusual style of chess, famously remarked:

Lack of patience is probably the most common reason for losing a game, or drawing games that should have been won. 

Image Source: Shyam Sankar (TED)

April 23, 2016 / Michael Yaroshefsky

A Way to Calculate and Compare Customer Concentration

Last year, I came up with a way to measure and compare customer concentration, which I didn’t think was particularly special at the time.  That is, until I came across the Gini coefficient for income inequality, which uses similar principles for measuring the wealth concentration of a given country.  Since the Gini coefficient has its own Wikipedia page, and I haven’t seen my method out there for analyzing businesses, I figured it was worth sharing in case others may find it helpful.

The pretty simple calculation produces a number between 0 (Perfectly Concentrated) and 1 (Evenly Distributed).  The example illustration below shows 5 hypothetical customer bases, ranging from heavily concentrated (left) to evenly distributed (right), as well as the resulting concentration factors calculated for each.

In the top row, each of the four blue columns represents a customer, with its height proportional to revenue.  The bottom row shows the cumulative revenue, summed from smallest to largest.  The green area represents the difference between the company’s customer distribution and an idealized perfectly-even customer distribution.


I stumbled on this method while working on a due diligence project last year.  We were considering an investment in a firm with a highly concentrated customer base: a few big accounts made up most of the revenue.  But it turned out the market it sold into was also highly concentrated: of all potential customers, a few big firms made up most of the market.  Fortunately, there was a public list of all of their potential customers, including each potential customer’s size.

The diligence question I wanted to answer was: How does the customer concentration of Company X compare to that of the market?

I had all of the raw data, but no algorithm or real way of comparing the data.  But how can you compare — or even measure — customer concentration?  One method is to determine the portion of revenue made up by the top 5, 10, or 25 customers.  But is there a better way?  Or at least a different lens to use?  I stumbled on one while manipulating the data to find some answers.

Consider a company with $400 of monthly recurring revenue (MRR) that has 4 customers:

  • Customer A: $40/mo
  • Customer B: $50/mo
  • Customer C: $60/mo
  • Customer D: $250/mo
  • Total MRR: $400/mo


From this picture, it’s pretty easy to see this is a concentrated customer base.  Customer D is $250/$400 = 62.5% of the revenue.  Now compare this with a perfectly even customer base, where each customer is $100/month.


It’s pretty easy to see side-by-side which of these companies is more or less concentrated. But when customer bases grow into the hundreds or even thousands, it becomes a lot harder to just “eyeball it.”  And that’s why it’s nice to use some simple analytics to derive a comparable concentration metric.  So let’s give it a shot.

Back to the original company, line up the customers in order from smallest to largest, and then add them up cumulatively like this.


Notice how it grows slowly with the small customers and then finally sprouts up with the last, big customer.  Compare this with the perfectly even company:


If you overlay these, you’ll notice something interesting:


They both climb to $400/mo, the total MRR for the company, but the more concentrated customer base grows more slowly, only shooting up towards the end once the larger customers are included.

Now let’s assume that instead of 4 customers, we had a really big number of customers.  Imagine there were so many customers that the columns were too tiny to even see anymore.  You’d have continuous curves like this:


If the blue area fills the entire bottom-right triangle (no green showing through), it would mean the customer base was perfectly even.  If the blue area bows out toward the bottom-right corner, it means there are a lot of small customers on the left and a few really big ones on the right — so the curve grows really slowly and then shoots up at the end.  Consider this example comparing two customer bases (Orange, Blue) against the perfectly even distribution (Green):


In the example above, the company in blue is more concentrated than the company in orange.  This lends itself to a pretty simple idea: we can compare customer concentration by determining how big the area under the company’s curve is relative to the area under the hypothetical perfectly even customer base’s curve.  Like this:



If the green Area B = 0, it means the customer base matches the even distribution, so the Customer Concentration Factor = Area A / (Area A + 0) = 1.

As the customer base gets more concentrated, the green area grows and the blue area shrinks, resulting in a number that approaches 0 as it gets more concentrated.

Now how do we calculate these areas? Well, our data is discrete (broken down into individual customer columns), not continuous like the curves above.  We can just assume each column has a width of 1 unit and calculate the areas, much like when calculating a Riemann sum in calculus.

We just need to make one adjustment, though, because we’re dealing with discrete data, so as not to double-count the last column.  The last column always equals the total revenue and does not add any information to our analysis regarding concentration.  Therefore we need to calculate this for N-1 customers, as follows (ignoring the gray area, and focusing on the area outlined in orange below).


Here’s the formula for calculating the coefficient.  With customers sorted from smallest to largest and C_k the cumulative revenue of the k smallest customers:

Concentration Factor = Area A / (Area A + Area B), where Area A = C_1 + C_2 + … + C_(N−1)

Note there’s a quick shortcut for calculating the denominator (Area A + Area B), using the fact that it’s a triangular number.  Each step of the perfectly even curve is T/N (total revenue T divided across N customers), so:

Area A + Area B = (T/N) × (1 + 2 + … + (N−1)) = (T/N) × N(N−1)/2 = T(N−1)/2

And here’s what it looks like in Excel for the above sample data:



So in the example above, the concentration factor is 0.467. Here are some more examples showing what it looks like for heavily concentrated and evenly distributed customers.
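If you’d rather not fire up Excel, the whole calculation fits in a few lines of code.  Here’s a minimal Python sketch of the method described above (the function name and structure are my own, not part of the attached file):

```python
def concentration_factor(revenues):
    """Customer concentration factor: 1.0 = perfectly even,
    approaching 0 as the customer base gets more concentrated."""
    r = sorted(revenues)   # line up customers from smallest to largest
    total = sum(r)
    n = len(r)
    # Area A: cumulative revenue sums for the first N-1 customers.
    # (The last column always equals total revenue and adds no
    # information about concentration, so it's excluded.)
    area_a = 0
    running = 0
    for amount in r[:-1]:
        running += amount
        area_a += running
    # Area A + Area B: area under a perfectly even distribution,
    # via the triangular-number shortcut (T/N) * N(N-1)/2 = T(N-1)/2.
    area_even = total * (n - 1) / 2
    return area_a / area_even

# The four-customer example from above:
print(round(concentration_factor([40, 50, 60, 250]), 3))  # → 0.467
print(concentration_factor([100, 100, 100, 100]))         # → 1.0
```

Point it at any column of revenue-per-customer figures; because the result is unitless, it compares directly across companies of different sizes.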


So that’s it!  If you have a table showing revenue by customer, you can use this pretty basic formula to calculate a customer concentration coefficient.  You can then use this as a standalone measure of concentration, or to compare concentrations among companies.  I don’t make any particular suggestions about what’s a good or bad coefficient, though I’d be interested in performing some benchmarking analysis to figure out where different types of companies fall.

The attached file below includes the visual demos shown above, as well as a second sheet for calculating the concentration factor for an arbitrarily large data set.  This is meant merely as an aid, and no warranties are made — so use it at your own risk.

Concentration Example _ Published


As is the case with any model or calculation that reduces a complex set of data to a simple number, beware that you lose a lot of information as you simplify.  (As we saw with the so-called “Formula that Killed Wall Street” — the Gaussian copula.)  It’s always important to visualize the data and apply a number of analytical tools before coming to a conclusion.

April 19, 2016 / Michael Yaroshefsky

Dr. Miklos Kiss or: How I Learned to Stop Worrying and Love Autonomous Driving (Part 2)

While we often hear about the obvious obstacles to autonomous cars — technology, regulation, insurance — the biggest obstacle may well be us.

Behind the wheel of an autonomous (or “piloted”) car, you’ll quickly grow frustrated by other drivers who seem intent on killing you: people who need to fill that safe gap you’re keeping ahead, people veering into your lane while texting or talking on the phone.  The other drivers assume you’ll understand their state (cruising contentedly, aggressively trying to catch a flight, driving drunk) and goals (keep a safe distance, find every possible gap through traffic, try not to wreck).  This misalignment of assumptions is what caused the most recent accident involving a Google self-driving car:

Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it…. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements.

The artificial intelligence that powers these systems is incredibly sophisticated at solving deterministic problems (find the lane and keep the vehicle in it).  But it falls short of our ability to predict probabilistic outcomes and constantly navigate fluid social negotiations with other drivers on the road.  The key difference is that AI lacks an innate theory of mind, or a sense of other people’s intentions and goals, which is something even toddlers have a strong capacity for.

Imagine this scenario: an aggressive driver is approaching fast in the lane to your right. Your car sees the other driver rapidly approaching your right blind spot and handily flashes the blind spot monitor.  But it’s not yet smart enough to assume that the driver is about to cut you off using that small gap between you and the car he’s rapidly approaching to your 1 o’clock.

A human might anticipate this aggressive driver making the maneuver and slow up to create a safer gap (or speed up to close it).  An autonomous car will suddenly see the other driver less than 10 feet from your front bumper (at roughly 70mph or 100 feet/second) and assume “car judgement day” has arrived — slamming on the brakes.  By the time your car uncovers its eyes and asks, “Is it over?” the aggressive driver has sped off, and there’s a train of cars behind you wondering why this jerk (you) just totally overreacted and slammed on his brakes.

Situations like this are common, particularly at the “edge” — potentially dangerous situations.  Part of the solution is improving the cars’ ability to understand us.  But part of the solution is our recognition that autonomous cars both see — and perceive — the world in a totally different way.  To safely share the road with autonomous cars,  we may have to slightly alter our own driving habits.

But how will we know which cars are being driven autonomously?   It might make sense during this interim phase to standardize some indicator when a car is being driven partly or wholly by AI.  Just like the “Student Driver” signs on cars driven by student drivers — but less dorky.  Perhaps this?


You knew the Hoff meant business when he drove around in KITT, his car that was also his witty sidekick.



April 14, 2016 / Michael Yaroshefsky

HBS Takeaways – Quick List 1

Takeaways Notebook

HBS turns the typical classroom model upside down.  At Princeton, professors started with a problem and guided the class towards a neat solution.  Each 50-minute class wrapped up in an orange bow, like episodes of The Wonder Years.

Here at Harvard Business School, the students talk most of the time. Our problems do not have neat, provable solutions.  We consider whether CloudFlare founders Matthew Prince and Michelle Zatlyn should be concerned that five of their 25 engineers just left the firm at once.  We debate the impact of chaebols, family-owned conglomerates, on South Korea’s economic development.  These classes get wrapped up in crimson-twined knots. It’s common to walk home each afternoon feeling like you now have even more questions than when you sat down that morning.

Instead of offering solutions, HBS leaves it up to you to take away whatever you want. I’ve been keeping a notebook of takeaways in my Field Notes notebooks (five-stars, would recommend).  I thought I’d share some of them here, in no particular order:


  • Shoot for Excellence, not Profit.  Profit eventually comes to Excellence in a circuitous way, so be there when it shows up.  -Attributed to Greg Glassman, Crossfit Founder.
  • Product-Market fit is important, but Product-Founder-Market fit is even more important.  -Attributed to Poshak Agrawal, Co-Founder Athena Education.
  • Always have something else in your pipeline. Even the most certain outcomes fall through.
  • Break down your product or solution into testable hypotheses, and design experiments to test each part, bit-by-bit.  And don’t fall victim to research bias, where you design the experiment for the outcome you want.  e.g. Don’t just ask your friends if they’ll buy your cool new gizmo/app.
  • Using “Beta” gives you a license to experiment and fail.  Use it liberally, when appropriate.
  • The more narrow and homogenous your target customer base, the better you can address their needs.
  • Can you cross the chasm to reach potential customers who aren’t the early adopters?


  • Give others jobs that you yourself would take.  -Attributed to Greg Glassman, Crossfit Founder.
  • If you’re sharing decision-making authority with a co-founder, determine in advance how you’ll work through a deadlock.
  • It should never feel like your team is doing you personal favors; everyone should be on the same mission.
  • Leaders at a company should set an example of healthy balance.  Take vacation, spend time with family.
  • Hire slow, fire fast.
  • Get rid of job titles in the early stages of a company.  Titles are earned, not given.
  • When building a team, think about each member’s skill set as a circle in a Venn diagram.  You want enough overlap so you can communicate, but not too much overlap — it’s wasteful and can result in conflict.


April 10, 2016 / Michael Yaroshefsky

Dr. Miklos Kiss or: How I Learned to Stop Worrying and Love Autonomous Driving (Part 1)

Cruising at a comfortably brisk speed on I-90 Westbound, just outside of Worcester, MA, I gaze over the steering wheel to find a swelling sea of red taillights coming to a complete, abrupt stop ahead.  *Expletive!*

But this time is different.  Instead of “brake and groan,” I “clench and pray.”  But I’m not praying to my usual deity this time.  I’m praying to some engineer in Bavaria, on whose competence I’m trusting the fancy front fascia of my new Mythos Black Audi A6.  Dr. Miklos Kiss, Audi’s head of predevelopment for driver assistance systems, is a good choice.


Driving is becoming more like commercial piloting: setting the computer and watching the gauges to make sure everything’s running OK.

I’ve been trained all my life that in a situation like this, I’m supposed to turn a potentially unsafe situation into a safe one.  Right foot, left pedal, brake!   It takes an incredible amount of willpower to deny this nearly-primal urge and hand my safety over to the Machine.  Regardless, my right foot hovers nervously over the brake pedal.

While the gap is closing, the car keeps galloping at full speed.  They wouldn’t release this feature if it didn’t work 99.999999% of the time, right?  I catch a judgmental glance from my fiancée in the right seat, so I redirect my thoughts to my mouth so she knows why I’m not doing anything about the impending situation ahead: “C’mon, Audi!” I repeat twice.  Her lips are silent, but her eyes tell me she’s not amused.

Finally, I hear some faint clicking down by the pedals, and the speedometer needle starts falling.  I feel the engine let up, and the brakes start to engage.   Slowly at first.  60-55-50. Then somewhat panicked for my personal tastes.  40-25-10.   Lisa and I hold our breath as the rear bumper in front of us approaches.  5-3-1-0.  Phew!

Hair-raising nerves give way to a flush of relief, and finally a blissful realization: technology just innovated away my least favorite part of driving.

This is Part 1 of a 5-Part Series. 

October 12, 2015 / Michael Yaroshefsky

HBS Reflections

I’ve never really considered myself the “reflective type,” but HBS makes a point of promoting reflection for the purpose of digesting lessons we learn in class.  Unlike accounting or finance principles that can generally be accepted at face value and applied by rote, lessons in leadership and team dynamics are not prescriptive.  They need to be filtered and adapted based on one’s own personal style.  While being a narcissistic hardass may have worked for Steve Jobs at Apple, it’s something I personally wouldn’t enjoy, nor do I think I could pull it off successfully.

Fortunately, there is no single correct leadership strategy, only a diverse set of parameters with a continuum of possible approaches.  And across this multidimensional landscape of leadership styles, for each individual in a given context, there exist local and global maxima of effectiveness.  (Clearly I take comfort in relating squishy concepts back to hard, logical math.)  The goal of reflection, therefore, is to become aware of these parameters and consider where my personal local and global maxima are.

Early in the semester we were urged to keep a “Leadership Journal.”  While I initially balked at the idea, I now feel like I’m learning far too many new concepts and frameworks each day to hope to remember.  For this reason, I’m endeavoring to start keeping a leadership journal… online — in hopes that I can look back on these posts for inspiration in the future and that they may be of value to anyone else who stumbles on them.

This leadership journal begins now, with this post reflecting on the merits of reflection.  Very meta.

November 14, 2012 / Michael Yaroshefsky

The Organic Music Movement

Every time I switch from satellite radio to CDs when I lose the signal in the Lincoln Tunnel, I realize how much more engrossing high-bitrate music is.  The sound of the CDs feels much fuller.  The surround sound amplifier really hits its stride, giving me a convincing soundstage (as much as you can get in a car).  For those few minutes where the satellites can’t see me, subtle details that are muffled or forgotten by XM’s compression come rushing back and remind me why I love music, and I ask myself why I’m not listening to more CDs.  Then I realize I’m listening to the same songs I did last week because I forgot to change the CD…

CDs still sound better than their modern replacements. (Photo credit: Wikipedia)

Streaming audio providers (XM, Pandora, Spotify, etc.) continue to make the CD obsolete by offering us variety and convenience, but in return we sacrifice audio quality.  So the question is: does anyone care?  If trends in other consumer verticals can give us any hints, then perhaps we’ll soon see a renaissance of high quality audio.

Background on Audio Compression

In audio, a song’s bitrate describes the amount of information per second that is recorded.  Generally, a higher bitrate means the audio will sound better.  Streaming audio services like Pandora and Spotify broadcast audio compressed using lossy algorithms in order to make the songs load faster.  In other words, to reduce the file size, they toss out the information you are least likely to miss.

While that may sound complex, we do something like this all the time when we abbreviate in text messages.  Instead of “I will be right back,” we write “brb.”  Your friend knows what that means, and you saved a lot of room.  Musical compression operates similarly by scrapping information that isn’t essential to getting the point across.  The ability to compress audio efficiently made way for innovations like the iPod and the iTunes music store, since now you could store thousands of songs in your pocket and download them quickly.
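To put rough numbers on the “thousands of songs in your pocket” claim, here’s a back-of-the-envelope sketch.  The 4-minute song length is an illustrative assumption; the 1,411 kbps figure is the standard bitrate of uncompressed CD audio (44,100 samples/sec × 16 bits × 2 channels), and 128 kbps is a common MP3 setting.

```python
def file_size_mb(bitrate_kbps, seconds):
    """Size of an audio stream: bitrate (kilobits/sec) x duration, converted to megabytes."""
    return bitrate_kbps * seconds / 8 / 1000

song = 4 * 60  # a typical 4-minute track, in seconds

cd = file_size_mb(1411, song)   # uncompressed CD audio
mp3 = file_size_mb(128, song)   # a typical lossy MP3

print(f"CD:  {cd:.1f} MB per song, ~{1000 / cd:.0f} songs per GB")
print(f"MP3: {mp3:.1f} MB per song, ~{1000 / mp3:.0f} songs per GB")
```

Compression at 128 kbps shrinks each song roughly eleven-fold — about 42 MB down to under 4 MB — which is the difference between a few dozen songs per gigabyte and a few hundred.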

Although not exactly how modern audio compression works, this graphic demonstrates the principle that reducing the amount of data per second can have a significant impact on audio quality. Notice how different the area in the third yellow circle is from the original.  (Photo credit: Sony)

This also made way for innovations in streaming music. By compressing the files enough, companies like Pandora can make sure your music plays without having to stop for loading or buffering — providing a continuous listening experience.  It also decreases their costs to store and transfer this data, meaning it becomes cheaper for consumers.
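The bandwidth side of that argument works the same way.  As a rough sketch (the bitrates below are illustrative tiers, not any particular service’s actual settings), lowering the bitrate directly lowers how much data each listener-hour costs to deliver:

```python
def hourly_data_mb(bitrate_kbps):
    # One hour of streaming: bitrate (kilobits/sec) x 3600 sec, in megabytes
    return bitrate_kbps * 3600 / 8 / 1000

for rate in (64, 128, 320):
    print(f"{rate:>3} kbps -> {hourly_data_mb(rate):.0f} MB per hour streamed")
```

A service streaming at 64 kbps moves a fifth of the data that a 320 kbps stream does, per listener, per hour — multiplied across millions of listeners, that’s a real cost difference.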

However, just because you are less likely to miss what was taken out by compression doesn’t mean you don’t notice that it’s missing.  This site lets you play a file at different compression rates to hear the difference. Try comparing the 64 kbps file to the 192 kbps file.  You can still understand what’s being said at the lowest audio quality, but the difference is unmistakable.  Although the differences between higher bitrates (say, 160 kbps and 320 kbps) are less obvious, they are still noticeable when using high-fidelity audio components.  In other words, if you’re listening to the music on your laptop’s speakers or Apple Earbuds, you might not notice the difference.  But you might on a good set of speakers or headphones.

Imagine walking through an art museum wearing the wrong prescription glasses. Sure, you’re experiencing the same art, but everything’s a little blurry. It’s not the way the artist intended for you to experience it, and that’s why compression is a problem: because it distorts the original work.

What the Organic Food Movement Suggests

Today, compressed music is everywhere: the iTunes store, Spotify, Pandora, XM satellite radio.  Even the audio component of YouTube videos is compressed to save space.  All of these innovations are bringing music and entertainment to our fingertips in ways that would be much more difficult (practically impossible) without compression.  However, compressed audio — by definition — contains less information than the original recording.

Pandora is doing to music what McDonald’s does to food.  They’re making music cheaper and more convenient by reducing its quality.

So what happens next?  As we are seeing with the demand for healthier food, consumers are increasingly spurning mass-produced, preservative-laden food for the real deal — and paying a premium for quality and taste.  Companies like Whole Foods and Chipotle are capitalizing on this shifting consumer interest.  Even major CPG brands are recognizing this trend, offering product alternatives to keep these customers from buying elsewhere: from Bisquick “Complete” pancake mix to Yoplait’s new entry into Greek yogurt.

Blue Bottle Coffee has emerged as a competitor to Starbucks in certain markets for coffee aficionados  (Photo credit: inky)

The coffee market is another great example.  Dunkin’ Donuts commoditized coffee, making it cheaper and more convenient.  Starbucks disrupted that model by offering a higher quality product in convenient locations, at a commensurately higher price.  Recently, upstarts promising even higher quality brews, like VC-backed Blue Bottle Coffee, are threatening the market.  Meanwhile, McDonald’s has launched the McCafe brand to bring their own coffee up to snuff.

What About Music?

The recent trend in expensive headphones (like Beats By Dr. Dre) seems to suggest that consumers are willing to open their wallets for a higher quality music experience.  High definition video sources (especially Blu-ray) similarly followed the adoption of HDTVs, so perhaps Monster Audio’s success with Beats suggests that higher-quality audio sources will become more prevalent.

In fact, it seems that the transition to higher bitrates is already underway.  Earlier this year, Spotify rolled out an “Extreme” quality audio setting to the iOS application, doubling the bitrate of its previous high-quality setting.  This suggests that the major players are unlikely to be caught flatfooted.

Nonetheless, there may still be room for a competitor to claim to be the Starbucks or Blue Bottle to Spotify’s Dunkin’ Donuts, offering premium-quality audio conveniently over the internet and/or to vehicles.  Perhaps XM will find ways to upgrade their channels to higher bitrates if consumers demand it.  Or maybe there will be a greater interest in live performances going forward, as hipsters consider anything prerecorded to be too mainstream.  Or better yet, maybe we’ll all hire minstrels to follow us around one day.

However it pans out, I am looking forward to being able to fully retire my CDs so I don’t have to be reminded of the Spice Girls album I still own every time I go looking for a new album.
