Steve Jobs at MIT Sloan School of Management (1992) [video] (youtube.com)
254 points by LiweiZ on May 14, 2018 | 76 comments


Listening to this talk, you get a sense of how much more intelligent he was than he may have gotten credit for. All the talk about his temper, his stealing of engineers' ideas, being an imperfect boss, etc., sorta gives the impression that he wasn't as smart and was more of a salesman, but holy shit, he was incredibly intelligent, and a real thinker.


Some of the smartest people I've met work in sales. You don't have to be an engineer to be smart.


Agreed. It's easy for engineers to assume that because they have a deep understanding of a complex topic, it is a good proxy for assessing general intelligence.

Such an assessment is a good proxy for a certain kind of intelligence only; intelligence itself exists on a fairly wide spectrum rather than being a binary trait.


The problem is that the average salesperson is pretty ignorant, even about the product he's supposed to sell and the target audience to sell to. Encountering a great salesperson is a rare occurrence in life; encountering a more or less average one is very common.


Same with programmers or anything else.


The problem is that the baseline STEM graduate looks on the baseline biz/marketing graduate with disdain, as if they took the easy route out.

'Proper' engineering (MECH/AERO/CIVIL/EEE) is full of memes about this mindset.

CS is just as bad.

Most decent engineers lose this mindset after freshman year, but many go on to see themselves as the MVP department of whatever business.

In some ways I think startup culture fights against this, because at small team scale it's really obvious how different skill sets are needed.


The absolute best in any field, including fast food cooking, are probably as intelligent as the best in any other field. However, the average really can be meaningfully different based on self-selection. HR, for example, really does not attract the best and brightest.

Sales is an outlier because compensation for the best can be really good. So, it's got a large group of very talented people and many far less so.


There's always the difficulty of being able to recognise that as a layman. Mediocre salespeople are often transparent to anybody who listens to them and is able to ask questions. A mediocre engineer "blinds with science" and is very hard to identify without some amount of specialist knowledge.


Of course; I never said it's not. Are people projecting here, or what is this about?


Classic programmer narcissism


Possibly just ignorant. Most people have never been, and will never be, in charge of large budgets.

A normal consumer with a normal job is only really going to encounter the lower end of the field.


Only today’s nitpicking culture could ever create an impression that Steve Jobs wasn’t intelligent. He basically created modern computing. The thought that he was just a snake oil huckster is lunacy.


> He basically created modern computing.

Whooaa, that's quite a stretch.

'Modern computing' was happening with or without Steve Jobs. He certainly was an exemplary visionary who put the right parts together at the right time.

But he did not create or define it. Technological progress has its own immutable momentum.


Perhaps a better expression is that he helped popularize computing.


I would again push back on that.

He did introduce a class of devices/products, but Apple and computational machines are not two concepts I associate.

The Windows OS has done more. My point being... yes, he's a genius, but in terms of marketing, business, and seeing where the hockey puck is going.

He excelled at selling to desires, but look at the 'real computers' Apple has produced in recent years.


I shared the same ignorant view until I saw with my own eyes what really happened. Everything done in the early 2000s was done by Apple first (and for the most part really well), as much as I hate to admit it. All Windows did was copy much of the functionality (and did a terrible job at it). If you go through the keynotes from Steve Jobs' return to Apple, 1999-2002, you can see WiFi coming out at least a year before anyone else, and high-quality, performant laptops with good battery life and WiFi before anyone else. Take a look at the OS X Aqua interface[1] (which hasn't changed much even to this day) coming out a year before even XP. And let's not forget the failure of Vista. It took almost a decade for something even close to come out on Windows.

Spend hours and hours going through the unedited version of history and judge for yourself. You can see the birth of many industries (YouTube - Mac streaming, Google Search - Sherlock) in those keynotes, and it's refreshing to see it done so well. I'm not saying Jobs succeeded at all these things, but he definitely saw them way, way before anyone else.

[1]https://youtu.be/auXc0tgdJSo?t=8m43s


Dude... that sounds a lot like a "no true Scotsman" line of argument. If by "real computers" you mean devices for programmers to train ML models, then yeah, Apple is pretty lacking. If by "real computers" you mean devices which run end-user facing software, well, the spectrum widens enough to show you why Apple is approaching a trillion-dollar market cap.

Put another way, a lot of the _reason_ that companies like Facebook and Google are doing any computation in the first place is the level of data and interaction that people generate on devices made by Apple. You are welcome to make a distinction between a Cray and a MacBook Pro, but it is kind of pedantic.


If sending emails and browsing Facebook is 'computing' then you're right.

If it means being able to create the 'end-user software' on these devices, then again you're right.

If these devices are little more than portable web browsers, is that computing?


Wrong, not a stretch.

It's an interpretation, on what all but the extremes on either side can agree is a subjective inquiry in the first place. Also, a bit of a futile and misguided one if used to minimize others' contributions in the field.

But for the same reason, it's folly to discount Jobs as not a, if not the, closest thing to a creator. This is true despite engineers' immutable (and incorrect) belief that actually creating product value + building working software = changing the world is inevitable; this video proves otherwise (back-room computing vs. actually empowering operations using computing) and reminds us that real artists ship.

Had Steve not been there, who knows what computing would have looked like from 2008-2018, but it's certainly not a stretch to posit that it'd look a whole lot different, which seems to be what gg99 was saying.


Are you conflating 'computing' and consumption of screen time?

Windows has done far, far more in terms of being a platform for computational work.


> He basically created modern computing.

Incorrect. Perhaps this was just your attempt at humor, but it couldn't possibly be further from the truth.

> The thought that he was just a snake oil huckster is lunacy.

That someone is intelligent does not prevent them from also being a snake oil huckster. As a matter of fact, intelligence will make you the best snake oil huckster.


I read his biography by Isaacson. He was a fascinating person: extremely smart, with a keen sense of design and a (self-taught) talent for sales. He was most definitely a boss I wouldn't have wanted to work for, though, because he was an extreme narcissist.

Being smart and being a terrible person don't exclude each other.


I don't know, I'd rather be an early Apple employee or part of the first iPhone team, witnessing a major new chapter of computing, than a comfortable developer in a comfortable setting.


True! And, this was change the world kind of stuff — it wasn’t being “mean” for the hell of it, it was being “mean” in the pursuit of a vision. It doesn’t excuse “being mean,” but it’s far different than having some jackass middle manager at some consulting firm yelling about TPS reports.

It’s like the drill sergeant phenomenon— harsh leadership can have a place when attempting to forge ahead in a difficult mission as it often motivates the team to accomplish more than would normally be considered comfortable (or even healthy.)

An asshole boss on a mission is far preferable (to me) to a “nice” boss driven by nothing other than maintaining the status quo.

Asshole bosses maintaining the status quo — now that genuinely sucks.


>True! And, this was change the world kind of stuff — it wasn’t being “mean” for the hell of it, it was being “mean” in the pursuit of a vision. It doesn’t excuse “being mean,” but it’s far different than having some jackass middle manager at some consulting firm yelling about TPS reports.

I don't think the meanness helped in creating the vision. They're two entirely separate things.

>An asshole boss on a mission is far preferable (to me) to a “nice” boss driven by nothing other than maintaining the status quo.

I'll agree with that, but still, my experience has been that the bosses who extracted the most work from their employees were the ones who could give praise where it was due. I think Apple made it in spite of Steve Jobs' character flaws, not because of them.


"When we were learning about manufacturing at Mac, we hired a Stanford Business School Professor at the time named Steven Wheelwright, and he did a neat thing. He drew on the board a little chart, first time I met him. He said, you can view all companies from a manufacturing perspective this way.

"You can say there's five stages-- one, two, three, four, five. They have all these things. And stage one is companies that view manufacturing as a necessary evil. They wish they didn't have to do it, but damn it, they do. And all the way up through stage five, which is companies that view manufacturing as a competitive opportunity for competitive advantage. We can get better time to market, and get new products out faster. We get lower costs. We get higher quality.

"And in general, you know, you can put the American flag here [puts it under 1], and put the Japanese flag here [puts it under 5] [Laughter] [Applause]. And that's changing, however.

"By the way, just going back to software for a minute, I often apply this scale to computer companies, and how they look at software. See, I think most computer companies are stage one. They wish software had never been invented. I put Compaq in that category. And IBM is maybe stage two, and things like that. And I think there's only three companies in here [pointing at 5] and that's us, Apple and Microsoft, in stage five. We start everything with the software and work back."

Wow. I think this is a great way to look at how companies approach ML/AI: you have Facebook, Google, MS, and Amazon trying to use it as a competitive advantage, whereas I'm sure there are some companies (medical?) that wish these systems had never been invented.

This also shows why you would want to be at companies that look at software as a competitive advantage: as an engineer you are the profit centre and not the cost centre.


Very good framework! Recursively applying it, given an individual's expertise and interest, we should all perhaps aim to be in Stage 5 companies (or verticals within companies) within our fields of expertise. That's where your work is most likely to be meaningful and hopefully, at the cutting edge.


Alan Kay, whom Steve Jobs quoted, said that if you're serious about software you should build your own hardware.

I think they're both important. It's also interesting to note that in this video he says hardware has no competitive edge. But that was 1992. Now it has flipped: software has less competitive edge than hardware.


Context matters.

Jobs was repeatedly referring to system software in this talk, a category distinct from application software.

System software + own hardware = competitive advantage

Application software + own hardware = ?*

*Most likely a world of pain and heavy losses.


Will keep it short -

Using the same analogy: when you look at Toyota, what's the first thing that comes to mind? I hope it's cars. So computer engineering is going to be a secondary endeavor there.

Internally, MS and Apple have a lot of units/departments with many computer engineers whom they treat as cost centres.


> “I think that without owning something over an extended period of time, like a few years, where someone has a chance to take responsibility for one’s recommendations, where one has to see one’s recommendations through all action stages and accumulate some scar tissue for the mistakes and pick one’s self up off the ground and dust one’s self off, one learns a fraction of what one can,” Jobs said. “You do get a broad cut at companies, but it’s very thin.”

> “You never get three-dimensional,” he said. “You might have a lot of pictures on your wall, you can say ‘Look, I’ve worked in bananas, I’ve worked in peaches, I’ve worked in grapes.’ But you never really taste it.”

His view on consulting is pretty interesting. Any consultants here who'd like to give the other perspective?

http://mitsloan.mit.edu/newsroom/articles/steve-jobs-talks-c...


I agree with his view on consultants from my own experience. I've never been a consultant, but in the last 10 years I have been a "job hopper", though only because companies refuse to give market-based raises and the best way to get a "raise" is to get a new job.


I've met consultants who have described it as sort of 'addictive' for people with ADD, mainly because they get to feel like they are in the middle of some big drama for a short time (usually too short to really own the problem), then the engagement is over and they go on to the next thing.


He started hiring consultants a few years later at NeXT. I was hired to code WebObjects projects for NeXT customers.

Steve branded us as 'WebObjects Experts'.


Could you write about your experiences being a consultant during that time? Are you still a consultant, how has it changed over the years?


I agree with his take as well.

The nature of consulting in my experience is such that a consultant will generally not have enough skin in the game, to use NNT's well-known phrase, to be able to deliver an out-sized outcome for the client, or to deliver an expensive but important life lesson for the consultant, if things go south.


I pulled out a section of this talk the other week that I thought was especially brilliant: his description of "Technological Windows of Opportunity" — when enough trends converge onto a radically new platform/experience, and the economic costs of the technology align so that it makes sense for a company to deploy it and become the widely adopted platform in that window.

https://www.youtube.com/watch?v=zJX476dDFVc

The Lisa was too early. So was the Newton. So were the NeXT Cube and NeXTStep... (which of course is now iOS)


A huge percentage of ideas aren't bad ideas; they just come at the wrong time: people aren't ready for them, complementary technologies aren't mature enough, they solve a problem that isn't a current market priority, etc.


You can find plenty of articles in magazines in the early 1990s talking about the handheld internet connected device (that nobody could succeed with until the iPhone).

The Mac itself was too early for wide adoption. The first Mac had a tiny memory footprint and no hard drive. CPU and GPU technology needed time to speed up and shrink in size and cost. Apple survived on Apple II sales for years after the Mac was introduced and iterated, and nearly suffocated from mismanagement until Jobs returned and the iMac launched — Windows 95 was right on time.

Here's an Apple video about conversational voice-based digital assistants... from 1987 https://www.youtube.com/watch?v=HGYFEI6uLy0

The Valley is littered with husks of companies that had the right vision before the technology was ready (e.g. https://en.wikipedia.org/wiki/General_Magic). NeXT was almost one of them.


The Mac wasn't just "too early"; it was overpriced.

The popular narrative that Apple struggled from 1984 until 1997 isn't true. Apple was doing fine between around 1987 and 1994. It was vying with HP for the #1 spot in computer sales in the early 90s.

Two things happened in 1995: Apple made a lot of low-end computers and not enough high-end computers, and Windows 95 was good enough.


It was overpriced because it was too early. The components to deliver the GUI experience weren't sufficiently commoditized yet.

In 1995, the entire product line was confused, and the then-CEO had opened up the platform to commodity cloners, robbing Apple of its vertical market and hardware margin.


The Mac had ridiculous margins compared to PCs by 1987 - especially for color Macs.

Jean-Louis Gassée (founder of Be) was the main Apple executive who went for short-term margins instead of going for market share.

I got my first Mac in 1992 after owning an Apple //e for years. For the price of my Mac LC II (a 16MHz 68030 with slow built-in graphics and a 512x384 12" monitor that wasn't compatible with most Mac games), I could have gotten a much faster 386, a sound card, and better graphics, much cheaper. Then Apple cheaped out even more and used a 16-bit bus instead of a 32-bit bus. And the Mac used 68030 processors that were used by other workstations, SCSI chips that were widely available, etc.


Yes, they managed the company terribly in the years after Jobs left (with the exception of the alignment with IBM and Motorola for the Power architecture and chip transition). Apple had something like 5 different CEOs in 8 years before Amelio decided to buy NeXT and bring Jobs back into the fold.


Sculley wasn't a bad CEO. He was a Steve Ballmer-style CEO: he knew how to profitably manage a company, but he had no vision. He brought Apple back from the brink of extinction in '86, led the transition to the PPC, and under him the PowerBooks defined what a modern laptop should be; between 1989 and 1992, Apple was at its height. Sculley would have been a better CEO than all of the CEOs that came between 1993 and the return of Jobs.

There are very few tech companies not led by their founders that successfully pivot into new areas and can transition when technology changes. Just like "only Nixon could go to China", only Jobs could make the famous deal with Microsoft in 1997.


But then there was Spindler and Amelio. Though Spindler did handle the 68k->PPC transition, and Amelio did pull the trigger on NeXT over Be and brought back Jobs.


There is some truth to that in my experience. Most of my friends coveted a Mac, but only a few could afford one. We settled for Windows 95.


I never watched a Jobs talk until now. I always thought he was an asshole based on what others had said. Shame on me. Now I've got a very different take: this guy was in a league of his own. He may not have been the best programmer, manager, or salesman in the various roles he held, but he was smart enough to know that failure paves the way for success. His way of conveying a complex message in a simple saying is priceless. It is a shame he is no longer with us.


You should listen to the commencement speech he made in 2005 at Stanford https://youtu.be/D1R-jKKp3NA


I'm amazed at his ability to deliver this type of presentation with such fluidity. Granted, this is most likely an amalgamation of his daily talks and thinking that he had internalized, but in order to formulate them, he must have spent a better part of his day just thinking.


One aspect which really struck me was his diametrical opposition to Amazon's famous "disagree and commit" core value. Very interesting to hear the counter argument.


Jobs' read and perception of where things were going is very impressive; hard to think of anyone today who could be this eloquent and persuasive...


Not sure I agree here, but he was a CEO who was simultaneously in this underdog position but also playing it really well. The refreshing thing about this is that he's relatively open about talking about his thoughts for the future and the strategy of his company. Most CEOs (including Jobs at a later stage) are much more apprehensive about talking openly. He turned his futurism into a marketing move, a bit like Elon Musk uses his vision, goals and trajectory as a marketing move.

That's basically the only reason I like to listen to the big VCs, because they're actually terribly open about their thoughts unlike virtually all CEOs. Because they also use their vision of the future as a marketing move. Someone mentioned Naval for example, that's no coincidence I think, angel/venture investors cluster quite high on my list of people talking openly about their perception of where things are going (regardless of whether it's a correct perception or not).


I was probably hard on Naval just now - IMO Jobs created what is today the most successful company on the planet. VCs are great at placing other people's bets and sounding convincing about it; not in the same league...



That's just someone who has tweeted 16.7k times; I can't imagine Jobs would waste his time doing that...


>eloquent and persuasive

Nothing to do with how they conducted their lives.


> waste his time on doing that...

Not a waste of time. Look at Elon: he spends a lot of his time engaging on Twitter.


He should spend more of his time building cars.


Elon has said that his tweet frequency should not be construed as time spent on the platform.


He must have better access to the space-time continuum than everyone else, then. I tend to put manic tweeters into a low-attention-span, hand-waving-for-attention group who only have a few hours left in their day to actually do anything else...


What he probably has access to is people who review his feed and replies and curate them for him. So he's not responding to tweets he's found himself, but rather to tweets that have been highlighted for him to respond to (if he so desires). Heck, his team might even include possible response ideas.

The fallacy is thinking Elon uses Twitter like any other person uses Twitter: sitting on his device, manually scrolling through it, and replying sequentially. He's probably outsourced much of this to his media teams or other people he trusts to provide relevant material to engage with.


I wonder in what directions and how much Jobs' thinking about management developed since 1992. He had been fired from Apple, partly because of his management practices. Certainly, he learned things in the 'wilderness', and learned things after he returned.


This would have been around the time he negotiated a gigantic deal with Disney for Toy Story at Pixar, after selling them Pixar's hardware and CAPS software for digital compositing — used to great effect in the 'Disney Renaissance' era (it was first deployed in this opening shot for The Rescuers Down Under (1990): https://www.youtube.com/watch?v=KjkdOAjtJ1k)


If only someone had written a biography about his life... I'm sure it would have spent a bunch of time looking into this crucial period!


"We won't see another operating environment for computers, because it's extremely hard to fund a professional sales force to educate users with an ASP of around $500"

It's amazing how well this insight/prediction holds after 26 years; the only other comparable OS/OE today is Linux, which didn't rely on that kind of funding. Makes me wonder if/how another leap in computing software will happen.


Well, iOS, right? or Android? or are those not "Computers" ?

If I had the money to bet, I'd be betting on AR as the next major paradigm, though. I think the idea of OS changes a lot when you get rid of the "slab of glass with a terminal" model. If computers are fundamentally ubiquitous, then what an OS does will need to be quite a bit different. I'm not sure it replaces a legacy computer, but I genuinely hope to have AR be a completely alternative option for most applications.

If I had to guess where the funding for that sales force will come from, I'd guess SoftBank and other mega-VC funds. If you do successfully create the next wave forward for computing, and you can provide the systems to enable it, the rewards are likely commensurate. I think this is why everyone is so interested in AR. Magic Leap is the darling, but Google, Apple, and Microsoft all agree that this is a critical turn, and are putting in the dollars to win.


> Well, iOS, right? or Android? or are those not "Computers" ?

iOS is NeXTStep, and Android is Linux, so in that way what Steve said holds true, given that Linux wasn't developed commercially.


I’d place my bet on driverless and personal assistants as the next OS platforms. VR will be part of a personal assistant OS.


I'd say that Android and the iPhone OS count (or maybe those didn't fall under the definition of computer); the difference here is that there was a new hardware interface driving the development of a new operating environment.


"The best code is the code you don't write"

This really rang true for me when I started developing with WebObjects (from NeXT). In the first week I was blown away by how little code I had to write and how much was done for me. I knew the theory of a "framework", and I think to this day many libraries are called "frameworks", but working with a real one is a game changer.
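
To make the distinction concrete, here's a minimal, hypothetical sketch in Java (not actual WebObjects code): a library is code you call, while a framework owns the control flow and calls you, so the app-specific part is the only code you write.

    import java.util.List;

    // The "framework" half: it owns the request loop...
    abstract class RequestHandler {
        // ...and calls back into your code for the app-specific part.
        protected abstract String render(String request);

        // The framework handles the boilerplate: iteration, output, error handling.
        public final void serve(List<String> requests) {
            for (String request : requests) {
                try {
                    System.out.println("Response: " + render(request));
                } catch (Exception e) {
                    System.out.println("Error handling " + request + ": " + e);
                }
            }
        }
    }

    // The "application" half is one method.
    public class HelloHandler extends RequestHandler {
        @Override
        protected String render(String request) {
            return "Hello, " + request + "!";
        }

        public static void main(String[] args) {
            new HelloHandler().serve(List.of("world", "WebObjects"));
        }
    }

WebObjects did this at a much larger scale (request routing, session management, database access), which is why so little application code was needed.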


So fascinating. 70 minutes, not a single slide.


Just like most university lectures in the country. I don't think that is good or bad in itself.


One thing that stood out to me is how much of what he talked about is still true and still playing out decades later. The fundamentals of technology businesses and the dynamics that occur really don’t change all that often.


Wait until you realize that they never change, and apply to all industries in slightly different flavors.


Interesting to think about why this reality didn't come to pass. Ben was clearly right about the market opportunity being huge, and I think it's timing. Without having thought too much about it, I think he underestimated or couldn't compete with Microsoft. Having cheap Wintel PCs everywhere made NeXT irrelevant.


The points Steve makes in the early parts of the session were quite visionary, but when he got to the hardware parts, I guess it's quite easy to say in retrospect what they got wrong.

What's interesting, however, is that he does discuss this from several angles (my interpretation / memory below), namely that a) the hardware angle was a way to get people to use the software, b) a purely software venture couldn't sustain funding for the level of marketing required to break into the market, c) software seemed to be moving slower than hardware (? - I guess that's the 80s / early 90s for you), and d) there was still a juicy hardware business that would be left on the table if it were a software-only venture. So it was a fairly well-researched and opinionated bet that just didn't pay off. I guess by tying the success of their software to their hardware, they ended up sealing its fate.



