I worked at Bell Labs Holmdel for precisely 10 days. It's a memory I am glad I have, because it was what persuaded me to never, ever work for a large corporation ever again, and very specifically, to never ever work in an interior, windowless office.
My assigned task: there was a constant in the C code that ran their telephone exchange hardware which controlled how many forwarding hops were allowed. I was tasked with changing it from 32 to 64. The allotted time for this task was 1 week.
While I appreciate the commitment to quality assurance/testing, the idea that I could have spent my life working in such an environment fills me with shudders.
My brief time there ended when I rolled my then-wife's car 3 times on the way down to the Outer Banks (NC), broke my arm and could no longer commute between Phila. and Holmdel. Lessons learned, for sure, and appreciated, but not necessarily in a good way.
Reminds me of this HN comment about working on the Oracle RDBMS: https://news.ycombinator.com/item?id=18442941
1 week is fascinating. Was it like – the missing piece was modern version control/CD? What kind of testing would need that? (We have configs at work where the system interactions are so unknowable and the financial implications of reduced efficiency so profound that we have to run multi-week A/B tests to change values) Was it some kind of pathological documentation culture?
AFAIR, there were two aspects to testing. The code change itself obviously only took tens of minutes, if that. First round of testing was just the build test, and that was fully automated but I think there were independent builds for multiple different hardware variations and so the total time for that was several days. Then there was actual use-case testing ... I wasn't involved in that at all, but was told it would also take several days of actual testing by a QA team.
1975: "One of our salaried PhD-level engineers designed this custom slide rule so that you guys can do cost estimates when speaking to customers on site."
2025: "We spent a bajillion dollars on a custom LLM chatbot so that you guys can get hallucinated product specs when speaking to customers on Zoom."
> Bell Labs’ One Year On Campus program, in which they paid new-grad employees to earn a master’s degree on the topic of Bell’s choosing
I wonder why companies don’t do this anymore. Is it something to do with the monopoly AT&T held, is it related to corporate tax structures, is it related to how easy it is to find PhD graduates who studied similar topics of interest, or is it something else entirely?
One common theme is that companies used to treat good employees as assets. Now they treat all employees as liabilities.
What changed? A lot. The underlying theme across all companies, to both employees and customers, has been "see how much abuse they'll take before they leave", which sadly has had marvelous results because the answer is... a lot. Notice that at least half of the largest companies by market cap have no actual support at all.
Add in that tuition has exploded, and it became cheaper and quicker just to import people than to train them.
> What changed? A lot. The underlying theme across all companies, to both employees and customers, has been "see how much abuse they'll take before they leave", which sadly has had marvelous results because the answer is... a lot. Notice that at least half of the largest companies by market cap have no actual support at all.
That's probably a direct consequence of the (Milton) Friedman doctrine taking hold (https://en.wikipedia.org/wiki/Friedman_doctrine), which has made capitalism much less friendly to customers and employees in order to maximize shareholder rewards.
For customers and employees, this kind of capitalism "optimizes" your situation until it's as bad as you can take without quitting. If you're genuinely happy, that's a glitch that's in the process of being worked out.
But even if companies decided to see how much abuse employees would take, that doesn't explain this. The point of the master's program wasn't to reward the employee; it was to make the employee more valuable for the company.
I think the issue is that companies no longer trust their ability to retain their more valuable employees. Why pay to make the employee more valuable for someone else?
It may be an overall increase in economic anxiety, even at the corporate level. At one time, AT&T could probably not imagine their status would ever really change; today, you'd be foolish to be running almost any company in the US economy and think that the company's position in 10 years is likely to be similar to the one it has today. Making longer-term investments in employee skills is almost certainly concomitant with the idea of tackling their income needs/desires/wants, and that looks intimidating when you don't have faith that your company will still be growing 5 years from now.
This is somewhat self-fulfilling, is it not? I fear my valuable employee may leave, so to protect myself in this situation I withhold resources I could have granted them. Said employees later realize they could do better elsewhere and leave.
It is in many cases. Short tenures often become normalized all the way around so companies tend to expect employees not to stick around too long and employees have a mindset that frequent job hunting is just what you do.
The Overton Window for what is acceptable company behaviour has slid very far toward "maximizing near-term profits at all costs". You see this in a lot of areas:
- Employee training and retention. Employers would rather aggressively churn the cheapest employees they can get, rather than cultivating experienced employees.
- Just-in-time inventory. Businesses want to have as little inventory as possible, which means any supply chain disruption causes ripples down the line.
- Advertising. Everything is stuffed full of ads now, including your smart appliances.
The reason companies get away with this is that all big companies have more or less colluded to behave the same way. The managerial class has said "we can have terrible customer service because _everyone_ has terrible customer service". Your product can be full of ads and break every 3 years because _every_ product is like that.
You might ask "why doesn't one competitor break the pattern and do a good job to gain market share?". The answer is that they do, and then they get bigger and more financialized. The capital required to do a good job requires them to take investment, and once the growth narrative has ended they're pressured to squeeze profits out of their customers.
> "I wonder why companies don’t do this anymore."
Because we're not in the gargantuan post-WW2 economic boom combined with the electronics-ization / mainframe computerization tech boom anymore. It's easy to treat employees generously when revenue growth just keeps happening almost on its own.
And STEM degrees, or even any undergraduate degree at all, were much less common than they are today. The article says "Over 130 people signed up for the One Year On Campus program in 1970.", which is a pittance in a corporation with over a million employees at its peak. Unsurprisingly, people whose specialties make them more difficult to replace get treated more favorably.
The answer to your question seems straight-forward: they still do.
1. Companies have tuition reimbursement for general employees, but it's usually nominal at $5k/year and part-time education is implied.
2. Executive MBAs and more selective executive seminars are covered by companies so the focus will be better. These aren't cheap (easily $125k/year) and might come with early exit penalties. I'd be surprised if the Bell Labs offer didn't have similar clawbacks.
3. Tuition in 1970 isn't what it is now! Wayyyy cheaper!
> I'd be surprised if the Bell Labs offer didn't have similar clawbacks.
Bellcore inherited the year on campus program from Bell Labs. There were no clawbacks and no requirement to even return to work for Bellcore after graduation.
Full-time residential Masters programs in engineering certainly have costs, but the actual tuition isn't necessarily that much.
As you say, executive MBA programs are costly and the assumption is almost certainly that a company is paying for it for select senior people being groomed for executive roles. They're often full-time but of relatively limited duration.
>I wonder why companies don’t do this anymore. Is it something to do with the monopoly AT&T held, is it related to corporate tax structures, is it related to how easy it is to find PhD graduates who studied similar topics of interest, or is it something else entirely?
Companies do a lot less to retain employees - fewer pensions, fewer accrued benefits and perks - so it's much less attractive to train when you are assuming attrition instead.
No way...
Even guys get parental leave, now!
I know too many people who've gotten fancy EMBAs paid for by the company as retention/networking tools, along with countless others (read: the majority) who don't use the annual tuition reimbursement available to all employees.
Even if the company is subsidizing it, earning a degree while working is a pretty substantial time commitment even as a part-time thing, though some people obviously do it.
I don't think companies pay for you to do an MS or a PhD full-time on the company's dime anymore. At that time AT&T was a monopoly and may have had money to burn on this. There also may not have been the expectations of hyper-growth from the stock market that exist today.
AT&T still pays for various MS courses (mostly MSCS, MS data science and cybersecurity) you can do on a part-time basis. It's quite easy to get the tuition reimbursement for it.
In general, nobody needs to pay someone to do a PhD, at least in a scientific or engineering field -- even in the US, the country where education generally costs the most -- typically PhD students get free tuition as well as a living stipend. Masters students, granted, typically don't get this benefit, but in the US that's typically a terminal degree; unlike in many countries, you don't do a Masters and then a PhD -- you do one or the other.
Aren't those stipends skewed toward Ph.D students intending to pursue an academic career? I seem to recall hearing rumors that admitting that one's intent was to go into industry after getting a Ph.D would be viewed unfavorably in grad school applications and by advisors. Perhaps it's only at R1 universities?
Depending on the program it's not that uncommon to get a Masters as a step on a path to earning a PhD.
True, I think technically I got one when I passed my prelims -- they do that so if people end up dropping out before completing their dissertation they have something at least. But what I mean is in the US dedicated Masters programs are largely set up for people in industry to advance their careers and the schools see the program as a way to raise funds.
The company I work for pays normal salaries for their PhD students.
That's pretty much standard in Germany in industrial research departments.
>The company I work for pays normal salaries for their PhD students.
Which is uncommon. BMW used to list their salaries for PhD students, fixed at 36,000 a year, which is the kind of wage you would expect for working at a grocery store.
This is very common in Europe. PhDs are sponsored by a given company.
This is still very common in Germany. Many companies, especially large ones, offer Master's/Bachelor's theses, which are supervised by a professor at a university, while the student is employed full time for the duration of the thesis.
Bell Labs is best known for inventing things like the solar cell and transistor, but that's a small part of their work. Bell Labs had a whole applied division dedicated to phone company science. This article digs into the details of what it was like to work at Bell Labs, but not the Bell Labs.
The Bell System was, in modern parlance, fully vertically integrated. They didn't own the mines that the copper for the conductors was extracted from, and they didn't cast the copper ingots, though they were interested in the metallurgy, because they drew their own wire. And cables. And transformers. And vacuum tubes. And so on. It was all in-house. They even treated their own telephone poles. So something like a practical survey of various types of preservative treatments for wood was in the remit of Bell Labs just like the physics of a vacuum tube. Almost everything in science was in the remit when they had such a broad mandate. The Bell System had elements that almost resembled a kind of state within a state. (Part of why it was killed off -- antitrust violations.)
"the" Bell Labs was effectively gone by the 1970's anyway. So the Bell Labs described in this article was "the" Bell Labs
That didn't stop the flow of Nobels.
For work done after 1970: optical tweezers, laser cooling, the quantum Hall effect, super-resolution microscopy, quantum dots (Nobel Prize 2023).
Not to forget Shor and Grover (1990s!)
Holmdel had the Penzias/Wilson antenna.
For me, this marked the end of Bell Labs (Murray Hill):
https://en.wikipedia.org/wiki/Sch%C3%B6n_scandal#Beginning_o...
Not sure what the bottleneck was: https://www.forbes.com/sites/josipamajic/2025/11/19/science-...
> There was a separate building in the area that did research in radio telescopes. This was an outgrowth of research that investigated some odd radio interference with communication, that turned out to be astronomical. I was never in that building.
Wonder if that's the detection of CMBR
Yeah the accidental discovery of the background radiation that they eventually traced to big bang predictions. It’s a good story about following through with your experimental results.
This honestly does not sound boring in the least. Statistical design of experiments is super interesting. You can tune your experiments to get the most useful information within your experimental budget. If you’ve ever run a physical real world experiment, you’ll understand how much time and expense is involved in doing it at a plant level. The ability to be economical here is so important!
Totally agree! DOE is way underrated. Once you’ve had to run real experiments (especially at scale), you really appreciate how much time and money a good experimental design can save. It’s one of those areas where a bit of math makes a huge practical difference.
I agree! Science is about experiments to verify hypotheses. Design of Experiments seems like a fundamental part of that. That's also why the quote below made me laugh.
> What if you don’t care about efficiency or causality?
"Yeah, what about if you don't care about money/time and are happy with finding a correlation only?!!?"
So fractional factorial design using orthogonal arrays / design matrices is the way to go? That’s interesting, but I’ll need help applying this.
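For a concrete (if toy) illustration, here's a minimal sketch in Python/NumPy -- my own hypothetical example, not anything from the article -- of a 2^(3-1) half-fraction design: three two-level factors studied in four runs instead of eight, with the third column generated as C = A*B so all columns stay mutually orthogonal.

    # Toy 2^(3-1) fractional factorial design (illustrative only).
    import numpy as np
    from itertools import product

    # Full factorial on the first two factors, coded as -1/+1.
    AB = np.array(list(product([-1, 1], repeat=2)))
    A, B = AB[:, 0], AB[:, 1]
    C = A * B  # generator: C is aliased with the A*B interaction

    design = np.column_stack([A, B, C])
    print(design)             # rows = runs, columns = coded levels of A, B, C
    print(design.T @ design)  # off-diagonals are 0: columns are orthogonal,
                              # so main-effect estimates don't contaminate
                              # one another

Aliasing is the trade-off: with only four runs you can't separate a main effect from the interaction it was generated from, which is where resolution and the proper textbook treatments come in.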
When I saw the title, I thought this could be about boring holes, but it was really using the word “boring” to talk about something interesting. And perhaps it’s about digging for oil metaphorically?
I suggest “Design and Analysis of Experiments” by Montgomery.
https://faculty.ksu.edu.sa/sites/default/files/douglas_c._mo...
She has some heritage there. If your near ancestors are academic it must be such a lift in terms of advice and connections. Especially ex Bell Labs and the TV patent.