Ironically, YouTube is now forced to support a browser that has terrible standards support, entirely of their own making: Cobalt[1].
YouTube on TVs is actually a web app that loads into a stripped down, custom webview. The YouTube team doesn't have the resources to implement many web APIs, so they implemented just what they needed.
The problem is that they can't reliably update Cobalt versions on TVs, they can't ask users to update, and they can't just break older TVs in the wild. So the YouTube on TV frontend (not YouTube TV the service) has to only use APIs they shipped like 10 years ago.
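To make that constraint concrete, here's a minimal sketch (my own illustration, not anything from the actual YouTube/Cobalt codebase) of the defensive style a frontend pinned to decade-old engines ends up written in: probe for newer APIs and keep an old-API fallback for everything.

```typescript
// Hypothetical sketch: fall back to XMLHttpRequest when fetch() isn't there,
// as on very old embedded engines. Not YouTube's actual code.
function getJson(url: string): Promise<unknown> {
  if (typeof fetch === "function") {
    // Newer engines: use the Fetch API.
    return fetch(url).then((res) => res.json());
  }
  // Older engines: XMLHttpRequest has existed for roughly two decades.
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open("GET", url);
    xhr.onload = () => {
      if (xhr.status >= 200 && xhr.status < 300) {
        resolve(JSON.parse(xhr.responseText));
      } else {
        reject(new Error("HTTP " + xhr.status));
      }
    };
    xhr.onerror = () => reject(new Error("network error"));
    xhr.send();
  });
}
```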
And because it takes so long for an old Cobalt version to go out of support, they don't invest in implementing new features because they wouldn't be usable anytime soon. 10 years ago I was in a meeting with them where they said they couldn't implement something because they wouldn't be able to use it for 5 years... They still haven't implemented it.
[1]: https://developers.google.com/youtube/cobalt
> 10 years ago I was in a meeting with them where they said they couldn't implement something because they wouldn't be able to use it for 5 years... They still haven't implemented it.
I call that trailer park logic:
They say: "Why go to college? That will take four years and I need a job now!"
Then four years later, while still in a dead end job: "Why go to college? That will take four years and I need a job now!"
Right, that's the point though - at some point, getting out of that circumstance is the best available move. Dropping everything and just walking away, somewhere different, is going to be better than staying where you're at and continuing to struggle. Bankruptcy, homelessness in big cities, addictions, abusive relationships - there are all sorts of contexts where people get stuck, and they feel compelled to stay and struggle and try to battle through whatever those challenges are. They feel like they have a duty to fight out the hardest, most impossible struggles, ones that nobody reasonable in the entire world would expect them to overcome.
Sometimes the best available move is physically escaping, just getting up and walking away and continuing until you find anything better than where you were at. For some weird reason, that move feels like giving up to people, until they actually do it and it works. This move is sometimes appropriate for jobs, relationships, addictions, violent circumstances, toxic social groups, politics, and so on.
If you've got next to nothing, then you have almost nothing to lose, and that can be a profound amount of freedom if it's seized. It's not always the right move, but sometimes the only winning move is not to play. Go find a better game.
That logic is also how established tech companies allow startups with disruptive technology to eat their lunch. Of course it can sometimes take decades for that disruptive innovation to appear.
It's not profound. It's literally just the age-old <group of people> <thing> combination, but with a couple extra words to seem high brow, and the group it casts shade upon is picked to confirm certain biases.
People who live in trailer parks are poorer and have lower educational attainment than other groups. It's not a bias to acknowledge this or to reference it.
It's also not unreasonable to believe that the two things are linked.
You could've called it hoodrat logic, their educational and financial success is on the same order.
Why not call it woman logic? They are famous for strictly using long term planning and cold logic to plan their lives to the point they even joke about it.
You could've subbed in just about any nationality.
But you chose trailer park because the point was to pick a group of people that a bunch of other educated white collar people (I think the trailer park people would use the term "coastal elites" for this group, lol) like feeling better than, hence why the other groups wouldn't do.
If you wanted to make it harmless you could've chosen any manner of public personality (politicians are gold mines for peddling short-sighted stuff, plenty of examples to choose from) to name it after.
> But you chose trailer park because the point was to pick a group of people that a bunch of other educated white collar people (I think the trailer park people would use the term "coastal elites" for this group, lol) like feeling better than, hence why the other groups wouldn't do.
I started calling it that when I lived in the trailer park.
> If you wanted to make it harmless
It's already harmless. The people who live in trailer parks don't need your protection, and acknowledging that most of them don't want to be there isn't hurting anyone at all.
The yt-dlp devs also talk about this openly, in GitHub issues and elsewhere. I think there have been multiple front page stories about how yt-dlp works here on HN over the last few years.
Even simple web apps can benefit from web platform improvements. JS, HTML, and CSS have all gotten significantly better in recent years.
But YouTube is also a very complex app. Yes it "just" exists to play videos, but the app is so much more than a video player. Browsing, searching, comments, chat, playlists, YT Live, subscriptions, profiles, ratings... there's a lot there.
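As a generic illustration of "JS has gotten significantly better" (my example, nothing to do with YouTube's code): features like optional chaining and nullish coalescing replace a lot of the defensive boilerplate older engines forced on you.

```typescript
// Generic comparison of the old defensive style vs. newer language features.
interface Video {
  stats?: { likes?: number };
}

// Old style, safe on ancient engines:
function likesOld(v: Video): number {
  return v && v.stats && typeof v.stats.likes === "number" ? v.stats.likes : 0;
}

// Newer style (optional chaining + nullish coalescing, ES2020):
function likesNew(v: Video): number {
  return v.stats?.likes ?? 0;
}

console.log(likesOld({}), likesNew({ stats: { likes: 3 } })); // 0 3
```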
<video> tag is probably the biggest change, but I still remember YT used SWF/FLV before then (and likely could still do today).
However, it's clear that the devs are mainly composed of trendchasing sheeple who have drunk the Goog-Aid and are addicted to newness and reinventing wheels to make them square... because they have to justify their existence.
> And because it takes so long for an old Cobalt version to go out of support, they don't invest in implementing new features
What new features?
The only "new features" Youtube implements is shoving shorts down your throat and taking five seconds to show video times on thumbnails despite the fact that the data is already there.
There's nothing Youtube requires from "new features" that can't be implemented in a browser tech from 15 years ago.
Also, Youtube the site doesn't have to deal with Cobalt-the-TV-app just like it doesn't have to deal with YouTube-the-mobile-app
No, SABR and UMP were implemented recently. That did come on the tail of dropping some older TV (presumably Cobalt) clients though.
Video encodings themselves are separate; the client always selects the most favorable one from the available set (e.g., VP9 over AV1 when hardware decode for AV1 is not present).
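For anyone curious what "the client selects the most favorable one" looks like in practice, here's a hedged sketch using the standard Media Capabilities API; the codec strings and thresholds are illustrative, not YouTube's actual logic.

```typescript
// Hypothetical codec picker: prefer AV1, then VP9, then H.264, but only when
// the device reports it can decode that format smoothly (usually hw decode).
async function pickCodec(): Promise<string> {
  const candidates = [
    'video/mp4; codecs="av01.0.05M.08"', // AV1
    'video/webm; codecs="vp9"',          // VP9
    'video/mp4; codecs="avc1.640028"',   // H.264 fallback
  ];
  for (const contentType of candidates) {
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: "file",
      video: { contentType, width: 1920, height: 1080, bitrate: 4_000_000, framerate: 30 },
    });
    if (info.supported && info.smooth && info.powerEfficient) {
      return contentType;
    }
  }
  return candidates[candidates.length - 1];
}
```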
See how you can't even explain what features Youtube wants to implement that they are so hamstrung by Cobalt.
Or why they are hamstrung by Cobalt at all, since it's by definition a TV app that is not expected to implement all the features of a desktop or a mobile app [1]
Instead, you just go for a personal attack.
So, who's insufferable here?
[1] BTW I literally work on one such app, and the number of features we cannot implement on such a constrained platform as a TV is now probably in the hundreds. Doesn't affect the site, or the desktop app, or the mobile apps.
Edit. BTW, if you think that ditching Cobalt (whose features Google literally directly controls) in favor of browsers running on TVs or gaming consoles will somehow give you great modern browsers with standards support, you know even less than nothing (if it's at all possible).
E.g. we still don't use CSS variables because browsers on a significant portion of TVs that are still in use don't support them.
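A minimal sketch of how an app like that can probe for custom-property support at startup and pick a stylesheet accordingly (the file names here are placeholders, not anything real):

```typescript
// Hypothetical: detect CSS custom property support, fall back to a static theme.
function supportsCssVariables(): boolean {
  return (
    typeof window !== "undefined" &&
    typeof window.CSS !== "undefined" &&
    typeof window.CSS.supports === "function" &&
    // Any declaration that uses var() works as a probe.
    window.CSS.supports("color", "var(--probe, red)")
  );
}

const href = supportsCssVariables() ? "theme-vars.css" : "theme-static.css"; // placeholder names
const link = document.createElement("link");
link.rel = "stylesheet";
link.href = href;
document.head.appendChild(link);
```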
It's a site that shows a bunch of text, a few images, and then loads and plays video. What features does it need on a TV that it's so hard to implement?
Edit: I know which ones, and they have very little to do with Cobalt, but with the fact that even high end TVs are often worse than a Raspberry Pi, and can stick around for a decade. But this is nothing ditching Cobalt would fix.
E.g. you can't run 4K video on some models that can technically show it: there's not enough CPU and RAM to run the browser, the video, the decoder, and the DRM at the same time, so the video stutters.
I worked on the front end of Bing (then Live Search) back in 2007, and even within Microsoft, IE6 was hated and rallied against, at least by any team doing web development.
I remember that the former GM of the Internet Explorer 5 and 6 team transferred to my org about a year after I joined. In his intro email, he included a sheepish apology for IE6, which I printed and kept on my office wall for the rest of my time at Bing, it was a prized possession. Man that browser caused so many nightmares.
(to clarify, the GM was a good and smart guy, the apology was a little tongue-in-cheek since IE6 was arguably the best browser upon its release - the problem was Microsoft effectively abandoned it and let it languish and stagnate for years while the web moved on without it, which turned it and the IE org into well-deserved pariahs)
Microsoft was in such a hurry to kill off IE6 that if you install a fresh copy of Windows Server 2022 with the latest updates, Event Viewer will throw an “Access denied!” error in your face at startup. That’s because the IE6 logs were unceremoniously ripped out, but the default Administrative view still lists them.
If IE6 did everything that a user wanted, why should he have to update? I totally disagree with this assumption that we should always have to be on the update treadmill, always changing the software we're using, and if we don't, then it's some kind of user failure showing "we cannot be trusted". Maybe if IE6 was so terrible, Microsoft shouldn't have released it in the first place. Don't blame the user.
The only good justification I can think of to update totally working software that I am happy with, is for mitigating security vulnerabilities. And even then, the choice should be on the user.
> If IE6 did everything that a user wanted, why should he have to update?
It often didn't, but the user was not always in control. Many business and educational environments held back, and their users were on locked-down machines (for good reasons) so they could not upgrade even if they wanted to.
This had a secondary effect: parents in households where the kids weren't in control of the tech were wary to upgrade in case it made them incompatible with work or things the kids needed for school.
> Maybe if IE6 was so terrible, Microsoft shouldn't have released it in the first place.
As has been mentioned a few times, upon release IE6 was the best browser commonly available by a number of metrics. Netscape was properly stagnating around then, Firefox was not yet a thing (even under its earlier names), Chrome was even further off, and other alternatives only captured a niche market. A lot changed between then and 2009, but IE6 didn't.
> this assumption that we should always have to be on the update treadmill
This wasn't the enshittification treadmill that we experience today. Newer browsers at the time were offering key benefits for performance and security, as well as significant useful features for designers that otherwise had to be inefficiently polyfilled or rejected if you needed to support older UAs.
> mitigating security vulnerabilities. And even then, the choice should be on the user.
No. As much as I disparage Windows for being the OS that can't be trusted not to randomly reboot if you leave it unattended for 12 hours, security updates are everybody's problem if you get infested with something that goes on to affect the wider network.
From the depths of my heart: thank you. Whatever you did to kill it, I claim it was justified self-defense. I have my scars from the Browser Wars, and the string "IE6" fills me with loathing to this day.
For my own part, I made sure my employer had plans to remove IE6 from our support list the day Google officially did the same in March 2010. The very next day, I started adding code to our site that complied with official standards and worked perfectly on every other browser, and removing all the compatibility hacks we'd deployed to make that pig render a screen correctly. It was incredibly liberating.
My first serious web programming job was creating a complicated web-app with lots of JavaScript that had to support IE-4/4.5/5 and Netscape Communicator.
I don't miss those days at all... A major client of my employer around 2000-2001 was "standardized" on Netscape 4.06. And I was expected to make stack diagrams and Gantt charts. I wrote an abstraction library just to draw boxes on the screen targeting the IE/NN 4x/5x differences, having to cover the screen in NN while "drawing" just to prevent the flickering effect causing someone a seizure. ILayer/IFrames, ugh... dynamic forms were horrible, having to mirror multiple forms into a composite hidden field form next to the submit button.
So many hacks... Not to mention the IE 5.0.0 select api bug, or the later uncatchable error in IE8's JSON parser... those were some rough years.
Well, JavaScript didn't have a ton of features back then to muddy up the waters. So that helped. And no frameworks kept things simple.
The most complex part was a dynamic query builder where you could pick columns and various kinds of filters. We could have gone to the server each time the user changed the query, but I found it a lot snappier to do it all with document.write().
For a while, JavaScript was shunned by a lot of web shops. Applets and Flash were the future! Then Google Maps came out and showed what you could really do, and JS became cool again.
The browser wars have not ended though... Chrome simply replaced IE6. We are in the exact same situation as before: the web is effectively owned by a single corporation.
Except Chrome itself is open-source, cross platform and runs almost everywhere, not to mention multiple forks and other browser implementations around it.
The IE6 age was much worse. IE6 only worked on Windows, so many sites did not work on Mac or Unix, which was a big hindrance for the adoption of non-Windows platforms.
While Chrome is big, Safari still has significant market share today.
Not really; the IE6 legacy was split in two, and Safari is the one that doesn't implement modern Web APIs, sometimes going with alternative implementations that benefit only them.
"Modern Web APIs" are just what Chrome is doing. They even usurped the standards bodies with their "WHATWG". It's what Microsoft wishes it had achieved back then.
Just give me WebGPU, proper PWAs and the FileSystem API mate, I don't care if what the users want was proposed by Google. Maybe the bluetooth and NFC APIs too, if that would not disrupt their App Store too much :)
I ran a web dev agency back in 2012 building websites for restaurants and SMEs. One of my partners was insistent that we had to support IE6 and also avoid using CSS3 and HTML5. Despite our own analytics showing less than 3% usage.
Netscape 4 was the bane of my existence, moreso than IE6 ever was, as an important client standardized internally on that forever so our entire platform had to be completely compatible with it. At least with IE you could do things in a user friendly way (perhaps at 2x the development and maintenance cost). Netscape 4 simply didn’t have the capability to do things we wanted to do experience-wise (like getting pushed content, I think?) without doing some extremely crazy and brittle workarounds at best (making it feel more like 5x the cost).
Also, IE4 was such a magnificent leap forward in the web that effectively enabled support for modern apps, which bought IE a ton of goodwill from me that didn’t wear off for a decade or so.
agreed*. You often hear this assumption today that Netscape was always the better browser and that people using IE were simply making a mistake. If anything they were just shit in different ways. For a while Netscape refused to implement CSS and wanted people to use their own JavaScript Style Sheets https://en.wikipedia.org/wiki/JavaScript_Style_Sheets technology which no-one did.
* Kind of, I was born in the 2000s
Well I lived through it, and you are absolutely right. Netscape 4 was terrible. Internet Explorer was much better and more standards-compliant in comparison. Netscape 4 was hated by web designers just like IE6 later came to be hated. The difference was that Netscape's market share dwindled pretty fast, while IE6 lived on for an eternity.
Exactly, and in my sad and unfortunate case I had to support it because of that one stubborn client, even though the rest of the world moved on. Actually, it might have just been their CEO accessing the site from his home machine or something?
Oh, what the heck, it’s been 20 years. Vertex Pharmaceuticals: shame on you. In the mid-2000s you had very poor taste in browsers ;-)
I've read this story before on a different site.
I was near the start of my career in 2009. I honestly think they are overstating the effect of those banners.
The significant shift IMO was when Windows 7 machines replaced the ageing XP machines. That is what I saw in the google analytics on the sites I was supporting at the time.
Yes. I am sure it did contribute, but they are overstating the effect of the banner. I honestly think Win 7 being a good OS and the Intel Macs actually being good is what nibbled away at legacy IE.
As an aside, IE7 was IMO worse in some ways than IE6. It had many of the same rendering bugs but was more subtle in how it failed.
I was really happy in (iirc) 2017 working on a greenfield project where, when we actually surveyed the users/customers, none were using legacy browsers at all... So we were able to ditch all the shims/shivs and drop Babel altogether. The ability to target current browsers that at least supported async functions went a long way towards bringing down payloads.
It took a bit of diligence, but while I was there the release app payload never got over 400kb of initial JS, which for a modern React+MUI app is pretty good. Having to yank moment.js and a couple other libraries a couple times was not the most fun set of conversations to have. Not to mention replacing massive charting libraries with plain SVG generation in React.
It bugs me to no end when developers don't seem to actually care about their craft at all.
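The "plain SVG generation in React" point is easy to picture with a sketch like this (a generic dependency-free sparkline, not the component from that project):

```tsx
// Hypothetical sparkline: map values to points and render a single polyline.
import React from "react";

type SparklineProps = { data: number[]; width?: number; height?: number };

export function Sparkline({ data, width = 200, height = 40 }: SparklineProps) {
  const min = Math.min(...data);
  const max = Math.max(...data);
  const range = max - min || 1; // avoid division by zero on flat data
  const step = width / Math.max(data.length - 1, 1);

  const points = data
    .map((v, i) => `${(i * step).toFixed(1)},${(height - ((v - min) / range) * height).toFixed(1)}`)
    .join(" ");

  return (
    <svg viewBox={`0 0 ${width} ${height}`} width={width} height={height}>
      <polyline points={points} fill="none" stroke="currentColor" strokeWidth={1.5} />
    </svg>
  );
}
```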
I spent the first few years of my career wrestling with Internet Explorer 6 compatibility while working in a marketing studio that was Internet-first and pioneered concepts like responsive web development (the precursor to native mobile experiences/layouts).
Internet Explorer 6 was an incredible waste of resources. I developed primarily on a Mac OS system at the time, which was somewhat progressive in the industry, but in order to verify that the functionality we had was working correctly on Internet Explorer 6 (which we observed still held greater than 50% of the market share) I had to keep a PC on my desk just for IE6 testing.
There were a number of hacks that we could incorporate into additional override style sheets, like conditional HTML comments that you could use to pull in IE6-only overrides, or weird asterisk-based selector patterns that would allow you to target it specifically.
We didn't necessarily prioritize feature parity with IE6, but the site had to load and render correctly and support the marketing of the property we were tasked with promoting. Once its adoption finally slowed, it was a great sigh of relief for the industry, and it made it feel like we could do anything we wanted to, because we had been making concessions to it for so long.
> Our most renegade web developer, an otherwise soft-spoken Croatian guy, insisted on checking in the code under his name, as a badge of personal honor, and the rest of us leveraged our OldTuber status to approve the code review.
Whoever this Croatian guy is, thank you! True hero of the internet.
As soon as that banner popped up on Youtube we were able to tell our customers the same thing.
IE6 was basically the canary in the coalmine. Holding the web back from constant change was a good thing as it let many more implementations be usable. Remember Opera and the various other browsers around at the time? Not long after IE6 died, they got crushed too by Google using change as a weapon against its competitors. The recent notable change of Google requiring JS even for its own search engine, and thereby shutting out simple and basic browsers like Lynx, should be extremely concerning for the future of the web.
> Google requiring JS even for its own search engine
This has been annoying me personally because sometimes I'll "bookmark" something I want to come back to like a movie or a video game by Googling it and leaving the tab open on my phone's browser.
But now, seemingly when those pages are suspended, Javascript isn't allowed to run in them, so all my Google search tab thumbnails are just a static screen telling me that I need to enable Javascript.
The reason Microsoft held the web back is that it saw it as a potential threat to its dominance as the main platform for applications.
And if you are actually concerned about the future of the web, instead of its past, I would be more concerned with Apple holding back development on Safari to make people focus on writing native apps for mobile. There are so many apps that in the past would have been websites, but end up being natively coded because that's the only way to get a good customer experience on mobile.
> Frustrated, one of the lawyers asked “Why did you have to put Chrome first?” Confused, I explained that we did not give any priority to Chrome. Our boss, in on the conspiracy with us, had thoughtfully recommended that we randomize the order of the browsers listed and then cookie the random seed for each visitor so that the UI would not jump around between pages, which we had done. As luck would have it, these two lawyers still used IE6 to access certain legacy systems and had both ended up with random seeds that placed Chrome in the first position. Their fear was that by showing preferential treatment to Chrome, we might prick the ears of European regulators already on the lookout for any anti-competitive behavior.
Wow those lawyers must've left the place many years ago huh!
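The randomize-then-cookie trick quoted above is straightforward to sketch; the cookie name, browser list, and PRNG here are my own stand-ins, not the article's actual code.

```typescript
// Hypothetical: pick a random seed once per visitor, persist it in a cookie,
// and derive the same browser order from it on every page load.
const BROWSERS = ["Chrome", "Firefox", "Internet Explorer 8", "Safari", "Opera"];

function getOrCreateSeed(): number {
  const match = document.cookie.match(/(?:^|; )upgradeSeed=(\d+)/);
  if (match) return parseInt(match[1], 10);
  const seed = Math.floor(Math.random() * 0xffffffff);
  document.cookie = `upgradeSeed=${seed}; path=/; max-age=${60 * 60 * 24 * 365}`;
  return seed;
}

// Small deterministic PRNG (mulberry32) so a given seed always yields the same order.
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = a;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function stableBrowserOrder(): string[] {
  const rand = mulberry32(getOrCreateSeed());
  const order = [...BROWSERS];
  // Fisher-Yates shuffle driven by the seeded PRNG.
  for (let i = order.length - 1; i > 0; i--) {
    const j = Math.floor(rand() * (i + 1));
    [order[i], order[j]] = [order[j], order[i]];
  }
  return order;
}
```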
I'll go one step further, because the company I used to work at built browser extensions. Google built ChromeFrame (https://www.chromium.org/developers/how-tos/chrome-frame-get...), a tool that would allow IE to load Chrome as an ActiveX component and transparently replace the rendering engine of IE.
But building the software wasn't enough, they used some scammy browser toolbar company (one of our competitors) to deploy this software silently and without any user intervention, all of a sudden millions of users overnight switched to chrome. It was deployed as a proxy botnet and Google knew full well what was happening. I sent a note to the humans at Firefox because we had a top 10 extension at the time and were in the midst of porting it to Chrome. They called their contacts and sure enough our suspicions were correct.
Google would later go on to buy that company because they were pushing so much traffic to Google's ad partners (Ad Meld being another acquisition).
We got screwed and were never able to recover from the run-around. I became friends with the folks on the Chromium team, and over beers in an SF dive bar we talked about how Google used a botnet to launch Chrome.
I loved IE6 as a user when it came out, and grew to hate it as a developer when the browser standards moved on, but a stubborn, large-enough user base percentage had not. I blame slow-moving IT departments that refused to touch their internal environments when all the Web 2.0 progress made things new and scary. A product my team was in charge of had to support IE6 and IE7 years after the rest of the world moved on because the IT admins at Walgreens straight out refused to update the machines that the pharmacists used at their stores.
The irony is that web standards didn’t move fast enough either, so the browser developers simply bypassed the standards body in favor of their own post-hoc ‘living standard’.
It wasn’t so much about the tempo of web standards; it was rather that W3C cared about the consensus of a far wider variety of entities, and browsers got fed up with being told what they should and shouldn’t do by people that had nothing to do with browsers—people that had interests in HTML, sure, but who were trying to pull it in directions that none of the browsers were interested in. And so, W3C having failed as a venue for browser HTML standardisation, they took it over.
To parody the situation: a consortium of bridge engineers is discussing building standards, but somehow they’ve been lumped together with every girl named Bridget and every young boy making toy bridges with blocks, and they all have voting rights, and the girls are insisting that bridges must sparkle, and the boys think every bridge should be able to support helicopters and diggers.
The risk of updating the machines to support IE9 might indeed be large, for not very obvious benefits. But what did they say about staying as is, and switching to Firefox or Chrome? Was it impossible due to use of some MS-only tech?
It's hard to remember the exact details all these years later... I doubt it was due to MS-only tech, but rather that IE6/7 were tested and approved and everything else was not. The incentives for IT teams are such that it's a lot easier to say no to something than to say yes and create a ton of work and liability for yourself.
My first startup we had to support IE 7 for a bit and then IE 8 until like 2017… I thought I had it bad then. I’m so glad I didn’t have to fight any older version
I think what did IE in was the security issues that IE, ActiveX, and Windows had in the mid-2000s. This, combined with IE 6's stagnation, gave Firefox an opening to compete and to challenge the IE 6 monopoly.
There was a sweet spot between roughly 2007 and sometime in the mid 2010s when web developers coded to standards instead of just the dominant browser, and where there was browser diversity: Firefox, Safari, Opera, Chrome, and IE 7+. It was a good time for the Web.
Chrome then became dominant, and unfortunately now we're in a "Best viewed in Chrome" era, and we're back in an era where some developers only code for the dominant browser.
I've used Firefox since version 1.5. I don't remember it being bad around version 3.0.
There was a long period during which Firefox was somewhat slow, and I remember the SpiderMonkey team releasing the famous "Are we fast yet?" website, which was saying "no" throughout this period.
It was horrendously bad. If you had more than 10 tabs open it was painful. I was using it at the time on a relatively high spec machine for the time (Core 2 Quad) and 8GB of ram.
I was so happy when Chrome came out. Over the years I've tried going back to Firefox and I've gone back to either Chrome, Ungoogled Chromium or Brave.
This happened enough that I think it was partially fixed in 3.5.
At the time 3.0 came out I was still in University, and every other student I spoke to had the same experience. People also experienced it at my first place of work in 2009.
This problem was ultimately fixed with 4.0 when they did something different with how multi-tab worked.
Firefox had 30% market share in 2010, I would hardly call that "niche", especially for a browser that didn't come bundled with your operating system or have the marketing power of the world's default search engine.
It also had an outsized impact on the web because it was popular with developers for doing web development.
> Firefox had 30% market share in 2010, I would hardly call that "niche", especially for a browser that didn't come bundled with your operating system or have the marketing power of the world's default search engine.
In 2010 Firefox 3.5 had an estimated global market share of 15-20%. BTW nobody thought any of the stats were accurate at the time as they were easily skewed. Some counters would report 30%, but it was an estimate and not a reliable one.
On the sites I was building, which were mostly travel sites, e-commerce and later gambling, Firefox was maybe 5-10%. I am also not in the US. I just didn't see anything like what was reported in site stats for the things I was working on.
Later on it was IE and Chrome, and Firefox was still at maybe 10%. I really cared about compatibility and web standards at the time and made every effort to make sure that the sites worked. So I knew it wasn't a case of "the site doesn't work in this browser".
> It also had an outsized impact on the web because it was popular with developers for doing web development.
So people that used the net heavily used Firefox and people that didn't tended to use IE. So on some sites Firefox usage would be far higher than it would otherwise be.
e.g. People using IE might only use the internet for online shopping, checking mails, so visiting an online shop, booking a flight etc. Whereas many Firefox users would be using Social media, Blogs, Forums or YouTube more heavily. So you will see two completely different pictures depending on what your site's audience would be.
I was in perf engineering at the time. we would switch between a handful of string concatenation methods every browser release. it wasn't much about real performance, but just shifting trade-offs in the jit. but google PR team was very good at running in front of the changes and pointing their overly optimized way to magazines. so they would run an array concat test that was much faster while being much slower in plus sign concat, but they often left that out. anyway, everyone drank the coolaid. 100% of the v8 performance over spider monkey was not attaching debuggers and dev tools. and sadly, mozilla had to follow. nowadays we are mostly back to square one (still some niceties from dalvik missing).
true performance improvement came much later than that.
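For anyone who never ran one of these, the benchmark shape being argued about was roughly this (a hypothetical sketch, not any vendor's actual suite); which side wins depended heavily on the engine and the year.

```typescript
// Hypothetical micro-benchmark: plus-sign concatenation vs. Array.join.
function timeIt(label: string, fn: () => string): void {
  const start = performance.now();
  const result = fn();
  const elapsed = performance.now() - start;
  console.log(`${label}: ${elapsed.toFixed(1)} ms (length ${result.length})`);
}

const N = 200_000;

timeIt("plus-sign concat", () => {
  let s = "";
  for (let i = 0; i < N; i++) s += "chunk" + i;
  return s;
});

timeIt("array join", () => {
  const parts: string[] = [];
  for (let i = 0; i < N; i++) parts.push("chunk" + i);
  return parts.join("");
});
```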
No I am not. I remember this clearly and all my friends were complaining about it before chrome was released. I just checked the dates. Firefox 3 was released a whole year before Chrome.
I really don't appreciate it when people tell me that I have been swayed by some big company, when my friends and I were complaining about it before we even knew that Google had a browser.
Firefox used to just completely lock up. Wouldn't load a tab. Chrome didn't with the same number of tabs. I am not talking about JS perf speed or anything like that, I am talking about the browser just not locking up when using more than few tabs.
I do occasionally think Safari is the new IE though -- not in terms of terribleness but just in terms of holding back the web by being the slowest to implement big new features.
I wouldn't care about Safari at all if Apple allowed any other browser engine on iOS. The fact that they don't allow other browsers to use their own browser engine is a fucking travesty, and it's part of the reason Apple is being sued by the DOJ.
WebGL took forever on mobile. WebGPU still is only partially supported on desktop. Memory64 isn't available yet at all. That's just my short list of things I care about, but any time I look at the boxes in caniuse safari is always the red one.
I don't think I've seen a feature that was shipped in a browser for years, then removed, and then shipped again.
I'd love to know why. The only explanations I've seen seem to be this: https://x.com/xeenon/status/652573047623323648 "The implementation of Shared Web Workers was imposing undesirable constraints on the engine. It never gained any adoption." and this: https://bugs.webkit.org/show_bug.cgi?id=149850#c5 "This feature was originally removed temporarily during multiprocess bring-up, and actual usage on the web has been pretty low. We're willing to reconsider if there is significant demand."
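Runtime checks for the features mentioned in this subthread are one-liners, which is part of why the red caniuse boxes sting; a hedged sketch (WebGPU, SharedWorker, and WebGL2 as examples):

```typescript
// Hypothetical per-visitor feature probes for the capabilities discussed above.
const hasWebGpu = typeof navigator !== "undefined" && "gpu" in navigator;
const hasSharedWorker = typeof SharedWorker !== "undefined";
const hasWebGl2 = (() => {
  try {
    return document.createElement("canvas").getContext("webgl2") !== null;
  } catch {
    return false;
  }
})();

console.log({ hasWebGpu, hasSharedWorker, hasWebGl2 });
```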
> Chrome dominance is a result of Google wanting to control the web and its dominance in the ads and search areas.
Chrome was simply better than Firefox, Internet Explorer, and Legacy Edge. I am not a Windows admin, but Chrome also offered an MSI package whereas Firefox didn't bother until years later. So it was easy for IT to roll out Chrome as part of the standard corpo install image, which was not an option with Firefox.
As for web development, it offered better JS debugging and had a decent phone emulator built in, and this was back in 2014. I am sure I could also debug phones with some reverse proxy shenanigans via Fiddler and some open source tooling whose name I've forgotten now.
You can see here https://www.w3counter.com/trends that Safari, Chrome, and IE8 all had bumps around that time, and looking at the IE-only chart it seems like the boost in IE8 might have actually slowed IE's overall decline a bit.
But the trend for IE started before this and continued after it.
It was worse than IE not adopting standards. It was a capricious browser, would crash and misbehave for arbitrary reasons, and had an almost perverse implementation of web rendering.
People try to equate it to Safari now but that's just not comparable. Safari will render something badly or not support a CSS decorator that you'd really like to use, but it will rarely crash, go into an infinite URL-fetching loop, or arbitrarily fail to recognize random HTML tags.
IE didn’t fight anything; it merely existed. There was no constant barrage of features that you ‘had to’ make use of to ‘keep up with the times’. Microsoft correctly decided that the Web was done in ~1999. They even had ‘Electron’ in the form of HTAs, except it wasn’t remotely as bad.
I don't miss IE 4/5/6 [etc subversion hell]. Supporting these tripled the time it took to build any site for the WWW. Pick any W3C standard: some chance it worked - on one of the browsers but not others. Some chance each had entirely incompatible workarounds. Documentation? Good luck. What did exist tended to deny that HTML standards other than their own existed, so it didn't even give any clue how to solve this. Had to support them, never enjoyed it. There was nothing fun or rewarding about supporting any of them.
IE6 was a great browser. It was superior in every way (at the time it was released). So good that no other browser was needed. And that's when it started to become garbage.
Hahaha, I love that social proof worked. The Docs guys thought you guys had approval, and the YouTube managers thought you were following through with a bigger initiative from across the company, started by Docs. Lucky break!
Man, I love these tales of people doing the right thing cutting through the red tape.
This is why small scrappy (at the time the YouTube eng team was small) companies get shit done and big companies with process and controls take forever.
The rogues take responsibility, think carefully, act carefully.
The problem is that pretty much all small scrappy companies grow up to be large behemoths where process and controls take over. The way workflows are created while being small and scrappy doesn't lend itself well to having more than one dev working on something with no guidance for how the devs are to move forward. One dev wants to take 3 left turns, another dev wants to take a simple right turn. After that, you start having meetings to lay out code and how to handle merges, and the next thing you know you have processes and controls.
Ironically, YouTube is now forced to support a browser that has terrible standards support, entirely of their own making: Cobalt[1].
YouTube on TVs is actually a web app that loads into a stripped down, custom webview. The YouTube team doesn't have the resources to implement many web APIs, so they implemented just what they needed.
The problem is that they can't reliably update Cobalt versions on TVs, they can't ask users to update, and they can't just break older TVs in the wild. So the YouTube on TV frontend (not YouTube TV the service) has to only use APIs they shipped like 10 years ago.
And because it takes so long for an old Cobalt version to go out of support, they don't invest in implementing new features because they wouldn't be usable anytime soon. 10 years ago I was in a meeting with them where they said they couldn't implement something because they wouldn't be able to use it for 5 years... They still haven't implemented it.
[1]: https://developers.google.com/youtube/cobalt
> 10 years ago I was in a meeting with them where they said they couldn't implement something because they wouldn't be able to use it for 5 years... They still haven't implemented it.
I call that trailer park logic:
They say: "Why go to college? That will take four years and I need a job now!"
Then four years later, while still in a dead end job: "Why go to college? That will take four years and I need a job now!"
It's a trap, but that doesn't mean it's escapable if you do need a job now.
It's the type of trap that only works if you agree to allow it to work. You might call it Sunk Opportunity Cost Assumption, mostly fits.
The trap will work if you are near destitute.
Right, that's the point though - at some point, getting out of that circumstance is the best available move, meaning walking away - dropping everything and just walking away, somewhere different, is going to be better than staying where you're at and continuing to struggle. Bankruptcy, homelessness in big cities, addictions, abusive relationships, there are all sorts of contexts where people get stuck, and they feel compelled to stay and struggle and try to battle through whatever those challenges are. They feel like they have a duty to battle out the hardest, most impossible struggles where nobody reasonable in the entire world would expect them to have to overcome.
Sometimes the best available move is physically escaping, just getting up and walking away and continuing until you find anything better than where you were at. For some weird reason, that move feels like giving up to people, until they actually do it and it works. This move is sometimes appropriate for jobs, relationships, addictions, violent circumstances, toxic social groups, politics, and so on.
If you've got next to nothing, then you have almost nothing to lose, and that can be a profound amount of freedom if it's seized. It's not always the right move, but sometimes the only winning move is not to play. Go find a better game.
> They say: "Why go to college? That will take four years and I need a job now!"
This is more like: "Why implement it? That won't be seen for five years and I need a promotion now!"
That logic is also how established tech companies allow startups with disruptive technology to eat their lunch. Of course it can sometimes take decades for that disruptive innovation to appear.
Going to college usually means you work the same job after college, just with a lot more debt.
So profound. I'm keeping it to use later :)
It's not profound. It's literally just the age old <group of of people> <thing> combination but with a couple extra words to seem high brow and the group it casts shade upon is picked to confirm certain biases.
People who live in trailer parks are poorer and have lower educational attainment than other groups. It's not a bias to acknowledge this or to reference it.
It's also not unreasonable to believe that the two things are linked.
You could've called it hoodrat logic, their educational and financial success is on the same order.
Why not call it woman logic? They are famous for strictly using long term planning and cold logic to plan their lives to the point they even joke about it.
You could've subbed in just about any nationality.
But you chose trailer park because the point was to pick a group of people that a bunch of other educated white collar people (I think the trailer park people would use the term "coastal elites" for this group, lol) like feeling better than, hence why the other groups wouldn't do.
If you wanted to make if harmless you could've chose any manner of public personality (politicians are gold mines for peddling short sighted stuff, plenty of examples to choose from) to name it after.
> But you chose trailer park because the point was to pick a group of people that a bunch of other educated white collar people (I think the trailer park people would use the term "coastal elites" for this group, lol) like feeling better than, hence why the other groups wouldn't do.
I started calling it that when I lived in the trailer park.
> If you wanted to make if harmless
It's already harmless. The people who live in trailer parks don't need your protection, and acknowledging that most of them don't want to be there isn't hurting anyone at all.
And thanks to these old endpoints that can’t be changed yt-dlp is able to function
I'd be a bit more careful with making statements like that here. "The walls have ears."
It's not like Youtube's engineers can't just. Download yt-dlp and see how it works
It's not proprietary
The yt-dlp devs also talk about this openly, in GitHub issues and elsewhere. I think there have been multiple front page stories about how yt-dlp works here on HN over the last few years.
When a video is loaded on a Cobalt browser, why can't they redirect to something like youtube.com/cobalt/player/123456
This way they could keep an old html/css/js implementation running alongside the upgraded one.
I thought the whole thing was a different app at youtube.com/tv
They do. But the Cobalt version is still used on new TVs and to view new videos with new features, so it can't just be a time capsule.
Then everybody would just try to use that instead of the now common frontend.
>The YouTube team doesn't have the resources to implement many web APIs
Annual revenue is a few dozen billions.
So what's the problem, here?
> So what's the problem, here?
Minimizing developer pain is not a business objective.
The irony, when you consider who pushes that "many web APIs"...
Make no mistake, "standards" really mean "what Google wants" these days.
YouTube was perfectly usable 15 years ago, on the machines and software of the time.
I'll stop at the imminent conclusion that having Cobalt is a good thing for various reasons.
But the use case is just to serve videos right? I know that new things will not come. But YouTube is almost the same in these 10 years I think.
Even simple web apps can benefit from web platform improvements. JS, HTML, and CSS have all gotten significantly better in recent years.
But YouTube is also a very complex app. Yes it "just" exists to play videos, but the app is so much more than a video player. Browsing, searching, comments, chat, playlists, YT Live, subscriptions, profiles, ratings... there's a lot there.
And which of those things that people could build already in mid-90s require some nebulous unspecified "new broswer features"?
Perhaps they could start with just cutting down their bloated 100x-duplicated 4MB CSS file?
<video> tag is probably the biggest change, but I still remember YT used SWF/FLV before then (and likely could still do today).
However, it's clear that the devs are mainly composed of trendchasing sheeple who have drunk the Goog-Aid and are addicted to newness and reinventing wheels to make them square... because they have to justify their existence.
YT2009 and WarpStream (Protoweb) prove the old YouTube can still work today. The new one is just a cat and mouse game of diminishing returns.
> And because it takes so long for an old Cobalt version to go out of support, they don't invest in implementing new features
What new features?
The only "new features" Youtube implements is shoving shorts down your throat and taking five seconds to show video times on thumbnails despite the fact that the data is already there.
There's nothing Youtube requires from "new features" that can't be implemented in a browser tech from 15 years ago.
Also, Youtube the site doesn't have to deal with Cobalt-the-TV-app just like it doesn't have to deal with YouTube-the-mobile-app
New codecs at least, I would assume?
No, SABR and UMP were implemented recently. That did come on the tail of dropping some older TV (presumably Cobalt) clients though.
Video encodings themselves are separate the client always selects the most favorable one from the available set (e. G., vp9 over av1 when hw decode for av1 is not present)
[flagged]
See how you can't even explain what features Youtube wants to implement that they are so hamstrung by Cobalt.
Or why they are hamstrung by Cobalt at all, since it's by definition a TV app that is not expected to implement all the features of a desktop or a mobile app [1]
Instead, you just go for a personal attack.
So, who's insufferable here?
[1] BTW I literally work on one such app, and the number of features we cannot implement on such a constrained platform as a TV is now probably in the hundreds. Doesn't affect the site, or the desktop app, or the mobile apps.
Edit. BTW, if you think that ditching Cobalt (whose features Google literally directly controls) in favor of browsers running on TVs or gaming consoles will somehow give you great modern browsers with standards support, you know even less than nothing (if it's at all possible).
E.g. we still don't use CSS variables because browsers on a significant portion of TVs that are still in use don't support them.
[flagged]
why can't they just serve different sites based on identifying the browser/app, and if it's old you get less features?
That's exactly what they do. The hard part is making it work with fewer features
And those fewer features are?
It's a site that shows a bunch of text, a few images, and then loads and plays video. What features does it need on a TV that it's so hard to implement?
Edit: I know which ones, and they have very little to do with Cobalt, but with the fact that even high end TVs are often worse than a Raspberry Pi, and can stick around for a decade. But this is nothing ditching Cobalt would fix.
E.g. you can't run 4k video on some models that can technically show it because there's not enough CPU and RAM to run the browser, the video, the decoder, and the DRM at the same time, the video stutters
I worked on the front end of Bing (then Live Search) back in 2007, and even within Microsoft, IE6 was hated and rallied against, at least by any team doing web development.
I remember that the former GM of the Internet Explorer 5 and 6 team transferred to my org about a year after I joined. In his intro email, he included a sheepish apology for IE6, which I printed and kept on my office wall for the rest of my time at Bing, it was a prized possession. Man that browser caused so many nightmares.
(to clarify, the GM was a good and smart guy, the apology was a little tongue-in-cheek since IE6 was arguably the best browser upon its release - the problem was Microsoft effectively abandoned it and let it languish and stagnate for years while the web moved on without it, which turned it and the IE org into well-deserved pariahs)
Microsoft was in such a hurry to kill off IE6 that if install a fresh copy of Windows Server 2022 with the latest updates, then Event Viewer will throw an “Access denied!” error in your face at startup. That’s because the IE6 logs were unceremoniously ripped out, but the default Administrative view still contains it in its list.
Automatic updates get a bad rap on HN; but it's not like Microsoft wasn't happily giving away Internet Explorer 7 and 8 to any computer listening.
It took Microsoft over 5 years to release Internet Explorer 7. That’s what allowed the web to ossify around it.
For comparison, Internet Explorer 6 came 2.5 years after 5 and so did 8 after 7.
Internet Explorer 7 had been on the market for three years, and Internet Explorer 8 for three months, when this story took place.
The lesson of IE6 is that people cannot be trusted to handle updating themselves.
If IE6 did everything that a user wanted, why should he have to update? I totally disagree with this assumption that we should always have to be on the update treadmill, always changing the software we're using, and if we don't, then it's some kind of user failure showing "we cannot be trusted". Maybe if IE6 was so terrible, Microsoft shouldn't have released it in the first place. Don't blame the user.
The only good justification I can think of to update totally working software that I am happy with, is for mitigating security vulnerabilities. And even then, the choice should be on the user.
> If IE6 did everything that a user wanted, why should he have to update?
It often didn't, but the user was not always in control. Many business and educational environments held back and their users were in locked down machines (for good reasons) so could not upgrade if they wanted to.
This had a secondary effect: parents in households where the kids weren't in control of the tech were wary to upgrade in case it made them incompatible with work or things the kids needed for school.
> Maybe if IE6 was so terrible, Microsoft shouldn't have released it in the first place.
As has been mentioned a few times, upon release IE6 was the best browser commonly available by a number of metrics. Netscape was properly stagnating around then, Firefox was not yet a thing (even under is earlier names), chrome was even further off, and other alternatives only captured a niche market. A lot changed between then and 2009 but IE6 didn't.
> this assumption that we should always have to be on the update treadmill
This wasn't the enshitification treadmill that we experience today. Newer browsers at the time were offering key benefits for performance and security as well as significant useful features for designers that had to be inefficiently polyfilled or rejected if you needed to support older UAs.
> mitigating security vulnerabilities. And even then, the choice should be on the user.
No. As much as I disparage Windows for being the OS that can't be trusted not to randomly reboot if you leave it unattended for 12 hours, security updates are everybody's problem if you get infested with something that goes on to affect the wider network.
You chose IE6 to make this argument? Your argument defeats itself if you have any context at all
So. Are you still using MS-DOS? Or Commodore Basic?
Why evolve software at all?
From the depths of my heart: thank you. Whatever you did to kill it, I claim it was justified self-defense. I have my scars from the Browser Wars, and the string "IE6" fills me with loathing to this day.
For my own part, I made sure my employer had plans to remove IE6 from our support list the day Google officially did the same in March 2010. The very next day, I started adding code to our site that complied with official standards and worked perfectly on every other browser, and removing all the compatibility hacks we'd deployed to make that pig render a screen correctly. It was incredibly liberating.
My first serious web programming job was creating a complicated web-app with lots of JavaScript that had to support IE-4/4.5/5 and Netscape Communicator.
FWIW that app is still running to this day: https://resultview.q2labsolutions.com/resultview/logon/logon...
Vanilla JavaScript just works. Marvel at the circa 2001 Login button!
I don't miss those days at all... A major client of my employer around 2000-2001 was "standardized" on Netscape 4.06. And I was expected to make stack diagrams and Gantt charts. I wrote an abstraction library just to draw boxes on the screen targeting the IE/NN 4x/5x differences, having to cover the screen in NN while "drawing" just to prevent the flickering effect causing someone a seizure. ILayer/IFrames, ugh... dynamic forms were horrible, having to mirror multiple forms into a composite hidden field form next to the submit button.
So many hacks... Not to mention the IE 5.0.0 select api bug, or the later uncatchable error in IE8's JSON parser... those were some rough years.
Hey, nice job if it's still running! That was quite the exercise back in the day, wasn't it?
Well, JavaScript didn't have a ton of features back then to muddy up the waters. So that helped. And no frameworks kept things simple.
The most complex part was a dynamic query builder where you could pick columns and various kinds of filters. We could have gone to the server each time the user changed the query, but I found it a lot snappier to do it all with document.write().
For a while, JavaScript was shunned by a lot of web shops. Applets and Flash were the future! Then Google Maps came out and showed what you could really do, and JS became cool again.
The browser wars have not ended though... Chrome simply replaced IE6. We are in the exact same situation as before: the web is effectively owned by a single corporation.
I am old enough to remember banks and many sites shovelling windows to your mouth because they only work in IE. No, Chrome is not the same.
At work I have to use web applications that literally do not work in any non-Chrome browser on a daily basis. Nothing has changed.
Except Chrome itself is open-source, cross platform and runs almost everywhere, not to mention multiple forks and other browser implementations around it.
I am also old enough. It’s very similar, lots of things don’t work on Firefox on Safari and mandate Chrome.
The base OS is far less relevant than it used to be, so it’s far less relevant that Chrome is merely an inner OS.
The IE6 age was much worse. IE6 only worked on Windows so many sites did not work on Mac or Unix which was a big hiderence for the adoption of non-windows platforms.
While Chrome is big, Safari still has significant market share today.
Not really, IE6 legacy was split in 2, Safari is the one that don't implement modern Web APIs, some times going with alternative implementations that benefits only them.
"Modern Web APIs" are just what Chrome is doing. They even usurped the standards bodies with their "WHATWG". It's what Microsoft wishes it had achieved back then.
Just give me WebGPU, proper PWAs and the FileSystem API mate, I don't care if what the users want was proposed by Google. Maybe the bluetooth and NFC APIs too, if that would not disrupt their App Store too much :)
I ran a web dev agency back in 2012 building websites for restaurants and SMEs. One of my partners was insistent that we had to support IE6 and also avoid using CSS3 and HTML5. Despite our own analytics showing less than 3% usage.
It was the worst two years of my life.
Netscape 4 was the bane of my existence, moreso than IE6 ever was, as an important client standardized internally on that forever so our entire platform had to be completely compatible with it. At least with IE you could do things in a user friendly way (perhaps at 2x the development and maintenance cost). Netscape 4 simply didn’t have the capability to do things we wanted to do experience-wise (like getting pushed content, I think?) without doing some extremely crazy and brittle workarounds at best (making it feel more like 5x the cost).
Also, IE4 was such a magnificent leap forward in the web that effectively enabled support for modern apps, which bought IE a ton of goodwill from me that didn’t wear off for a decade or so.
agreed*. You often hear this assumption today that Netscape was always the better browser and that people using IE were simply making a mistake. If anything they were just shit in different ways. For a while Netscape refused to implement CSS and wanted people to use their own JavaScript Style Sheets https://en.wikipedia.org/wiki/JavaScript_Style_Sheets technology which no-one did.
* Kind of, I was born in the 2000s
Well I lived through it, and you are absolutely right. Netscape 4 was terrible. Internet Explorer was much better and more standards compliant in comparison. Netscape 4 was hated by web designers just like IE6 later came to be hated. The difference was that Netscapes marked share dwindled pretty fast, while IE6 lived on for an eternity.
Exactly, and in my sad and unfortunate case I had to support it because of that one stubborn client, even though the rest of the world moved on. Actually, it might have just been their CEO accessing the site from his home machine or something?
Oh, what the heck, it’s been 20 years. Vertex Pharmaceuticals: shame on you. In the mid-2000s you had very poor taste in browsers ;-)
> For a while Netscape refused to implement CSS and wanted people to use their own JavaScript Style Sheets https://en.wikipedia.org/wiki/JavaScript_Style_Sheets technology
Man, literally every time the web platform had to choose between the IE way and the Netscape way they made the wrong choice huh.
I hung on to Netscape 4 until the first versions of Mozilla
I've read this story before on a different site. I was near the start of my career in 2009. I honestly think they are overstating the effect of those banners.
The significant shift IMO was when Windows 7 machines replaced the ageing XP machines. That is what I saw in the google analytics on the sites I was supporting at the time.
Indeed, their own graph shows IE7 dropping in usage share by very similar amounts at the same point in time, without a banner.
Yes. I am sure it did contribute, but they are overstating the effect of the banner. I honestly think Win 7 being a good OS and the Intel Macs actually being good is what led to nibbling away of legacy IE.
As an aside, IE7 was IMO worse in some ways than IE6. It had many of the same rendering bugs but was more subtle in how it failed.
I was really happy in (IIRC) 2017 working on a greenfield project where, when we actually surveyed the users/customers, none were using legacy browsers at all... So we were able to ditch all the shims/shivs and drop Babel altogether. The ability to target current browsers that at least supported async functions went a long way towards bringing down payloads.
It took a bit of diligence, but while I was there the release app payload never got over 400kb of initial JS, which for a modern React+MUI app is pretty good. Having to yank moment.js and a couple of other libraries a couple of times was not the most fun set of conversations to have. Not to mention replacing massive charting libraries with plain SVG generation in React.
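For the curious, the "plain SVG instead of a charting library" idea is roughly the shape sketched below; this is a minimal example of my own (component name, props, and sizes are all made up), not the actual code from that project. A simple line chart is just a polyline whose points are your data scaled into the viewBox:

    import React from "react";

    // Minimal sketch: a dependency-free line chart rendered as plain SVG in React.
    // Component name, props, and dimensions are hypothetical.
    type LineChartProps = { data: number[]; width?: number; height?: number };

    export function LineChart({ data, width = 320, height = 120 }: LineChartProps) {
      if (data.length < 2) return null;

      const max = Math.max(...data);
      const min = Math.min(...data);
      const range = max - min || 1;

      // Scale each value into SVG coordinates (SVG's y axis grows downward).
      const points = data
        .map((value, i) => {
          const x = (i / (data.length - 1)) * width;
          const y = height - ((value - min) / range) * height;
          return `${x},${y}`;
        })
        .join(" ");

      return (
        <svg viewBox={`0 0 ${width} ${height}`} width={width} height={height}>
          <polyline points={points} fill="none" stroke="currentColor" strokeWidth={2} />
        </svg>
      );
    }

    // Usage: <LineChart data={[3, 7, 4, 9, 6]} />

You give up axes, tooltips and animations, but for a simple trend line or sparkline that trade is often worth the payload savings.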
It bugs me to no end when developers don't seem to actually care about their craft at all.
I spent the first few years of my career wrestling with Internet Explorer 6 compatibility while working in a marketing studio that was Internet-first and pioneered concepts like responsive web development (the precursor to native mobile experiences/layouts).
Internet Explorer 6 was an incredible waste of resources. I developed primarily on a Mac OS system at the time, which was somewhat progressive in the industry, but in order to verify the functionality we had was working correctly on Internet Explorer 6 (which we still had observed was greater than 50% of the market share) I had to keep a PC on my desk just for IE6 testing.
There were a number of hacks we could incorporate into additional override style sheets, like conditional HTML comments that let you pull in IE6-only overrides, or weird asterisk-based selector patterns that would allow you to target it specifically.
We didn't necessarily prioritize feature parity with IE6, but the site had to load and render correctly and support the cause of marketing the property that we were tasked to do. Once the adoption of it finally slowed, it was a great sigh of relief to the industry, and it made it feel like we could do anything we wanted to because we had been making concessions to it for so long.
> Our most renegade web developer, an otherwise soft-spoken Croatian guy, insisted on checking in the code under his name, as a badge of personal honor, and the rest of us leveraged our OldTuber status to approve the code review.
Whoever this Croatian guy is, thank you! True hero of the internet.
As soon as that banner popped up on Youtube we were able to tell our customers the same thing.
IE6 was basically the canary in the coalmine. Holding the web back from constant change was a good thing as it let many more implementations be usable. Remember Opera and the various other browsers around at the time? Not long after IE6 died, they got crushed too by Google using change as a weapon against its competitors. The recent notable change of Google requiring JS even for its own search engine, and thereby shutting out simple and basic browsers like Lynx, should be extremely concerning for the future of the web.
> Google requiring JS even for its own search engine
This has been annoying me personally because sometimes I'll "bookmark" something I want to come back to like a movie or a video game by Googling it and leaving the tab open on my phone's browser.
But now, seemingly when those pages are suspended, Javascript isn't allowed to run in them, so all my Google search tab thumbnails are just a static screen telling me that I need to enable Javascript.
The reason Microsoft held the web back is because it saw it as a potential threat to its dominance as the main platform for applications.
And if you are actually concerned about the future of the web, instead of its past, I would be more concerned with Apple holding back development on Safari to make people focus on writing native apps for mobile. There are so many apps that in the past would be websites, but end up being natively coded because that's the only way to get a good customer experience on mobile.
You're worried about Safari in the browser space? It's the only browser that's holding back a near 100% monopoly from Google at the moment.
If Apple gets forced to allow other browser engines on iOS, it's game over. Google wins the web forever.
> And if you are actually concerned about the future of the web
> I would be more concerned with Apple holding back development on Safari
I think they are more concerned with the future of the web due to Google and so am I.
The HTML-only web has been dead for years. Documents may be safe, but publishing websites that work without JS is war with no winners.
Related:
A conspiracy to kill IE6 (2019) - https://news.ycombinator.com/item?id=39294406 - Feb 2024 (106 comments)
A Conspiracy to Kill IE6 - https://news.ycombinator.com/item?id=38210439 - Nov 2023 (1 comment)
A Conspiracy to Kill IE6 (2019) - https://news.ycombinator.com/item?id=28725293 - Oct 2021 (80 comments)
A Conspiracy to Kill IE6 at YouTube - https://news.ycombinator.com/item?id=28655890 - Sept 2021 (2 comments)
A Conspiracy to Kill IE6 - https://news.ycombinator.com/item?id=19798678 - May 2019 (363 comments)
Back in 2010, my startup offered front-end engineer prospective hires a major perk: we don’t care about IE6 compatibility.
Amazing read! One detail jumped out at me:
> Frustrated, one of the lawyers asked “Why did you have to put Chrome first?” Confused, I explained that we did not give any priority to Chrome. Our boss, in on the conspiracy with us, had thoughtfully recommended that we randomize the order of the browsers listed and then cookie the random seed for each visitor so that the UI would not jump around between pages, which we had done. As luck would have it, these two lawyers still used IE6 to access certain legacy systems and had both ended up with random seeds that placed Chrome in the first position. Their fear was that by showing preferential treatment to Chrome, we might prick the ears of European regulators already on the lookout for any anti-competitive behavior.
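Mechanically, that kind of "randomize once, then keep the order stable per visitor" setup could look something like this; the sketch below is my own (the cookie name, PRNG, and browser list are assumptions, not what YouTube actually shipped):

    // Sketch only: seed a tiny PRNG once per visitor, persist the seed in a cookie,
    // and reuse it so the browser list renders in the same order on every page.
    const BROWSERS = ["Chrome", "Firefox", "Internet Explorer 8", "Opera"];

    // mulberry32: a small deterministic PRNG; same seed, same sequence.
    function mulberry32(seed: number): () => number {
      return () => {
        seed = (seed + 0x6d2b79f5) | 0;
        let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
        t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
        return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
      };
    }

    function getOrCreateSeed(): number {
      const match = document.cookie.match(/(?:^|; )browser_seed=(\d+)/);
      if (match) return Number(match[1]);
      const seed = Math.floor(Math.random() * 2147483647); // chosen once per visitor
      document.cookie = `browser_seed=${seed}; path=/; max-age=${60 * 60 * 24 * 365}`;
      return seed;
    }

    // Fisher-Yates shuffle driven by the seeded PRNG, so the order is stable per visitor.
    function shuffledBrowsers(): string[] {
      const rand = mulberry32(getOrCreateSeed());
      const list = [...BROWSERS];
      for (let i = list.length - 1; i > 0; i--) {
        const j = Math.floor(rand() * (i + 1));
        [list[i], list[j]] = [list[j], list[i]];
      }
      return list;
    }

Which also explains the lawyers' bad luck: a visitor whose seed happens to sort Chrome first will see Chrome first on every page, every time.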
Wow those lawyers must've left the place many years ago huh!
Don't need em now! When you're small, cooperate, when you're big, take over. Google is big now
A small group of people took a chance, and it turned into a movement and changed internet history. I bet this could become a solid documentary.
Great article!
I still remember the time when people cherished the arrival of IE5.5 and IE6 later. They were once the best browsers.
I'll go one step further, because the company I used to work at built browser extensions. Google built ChromeFrame (https://www.chromium.org/developers/how-tos/chrome-frame-get...), a tool that allowed IE to load Chrome as an ActiveX component and transparently replace the rendering engine of IE.
But building the software wasn't enough; they used some scammy browser toolbar company (one of our competitors) to deploy this software silently and without any user intervention, and all of a sudden millions of users switched to Chrome overnight. It was deployed as a proxy botnet, and Google knew full well what was happening. I sent a note to the humans at Firefox because we had a top-10 extension at the time and were in the midst of porting it to Chrome. They called their contacts and, sure enough, our suspicions were correct.
Google would later go on to buy that company because they were pushing so much traffic to Google's ad partners (AdMeld being another acquisition).
We got screwed and were never able to recover from the run-around. I became friends with the folks on the Chromium team, and over beers in an SF dive bar we talked about how Google used a botnet to launch Chrome.
The web was a much better place when it had to support IE6.
I loved IE6 as a user when it came out, and grew to hate it as a developer when the browser standards moved on, but a stubborn, large-enough user base percentage had not. I blame slow-moving IT departments that refused to touch their internal environments when all the Web 2.0 progress made things new and scary. A product my team was in charge of had to support IE6 and IE7 years after the rest of the world moved on because the IT admins at Walgreens straight out refused to update the machines that the pharmacists used at their stores.
The irony is that web standards didn’t move fast enough either, so the browser developers simply bypassed the standards body in favor of their own post-hoc ‘living standard’.
It wasn’t so much about the tempo of web standards; it was rather that W3C cared about the consensus of a far wider variety of entities, and browsers got fed up with being told what they should and shouldn’t do by people that were nothing to do with browsers—people that had interests in HTML, sure, but who were trying to pull it in directions that none of the browsers were interested in. And so, W3C having failed as a venue for browser HTML standardisation, they took it over.
To parody the situation: a consortium of bridge engineers is discussing building standards, but somehow they’ve been lumped together with every girl named Bridget and every young boy making toy bridges with blocks, and they all have voting rights, and the girls are insisting that bridges must sparkle, and the boys think every bridge should be able to support helicopters and diggers.
The risk of updating the machines to support IE9 might indeed be large, for not very obvious benefits. But what did they say about staying as is, and switching to Firefox or Chrome? Was it impossible due to use of some MS-only tech?
It's hard to remember the exact details all these years later... I doubt it was due to MS-only tech, but rather that IE6/7 were tested and approved and everything else was not. The incentives for IT teams are such that it's a lot easier to say no to something than yes, and create a ton of work and liability.
Better to ask for forgiveness than permission.
My first startup we had to support IE 7 for a bit and then IE 8 until like 2017… I thought I had it bad then. I’m so glad I didn’t have to fight any older version
Is it really something to be proud of? Somehow as a result of IE hate we ended up with a Chrome-dominated world.
I think what did IE in was the security issues that IE, ActiveX, and Windows had in the mid-2000s. This, combined with IE 6's stagnation, gave Firefox an opening to compete and to challenge the IE 6 monopoly.
There was a sweet spot between roughly 2007 and sometime in the mid 2010s when web developers coded to standards instead of just the dominant browser, and where there was browser diversity: Firefox, Safari, Opera, Chrome, and IE 7+. It was a good time for the Web.
Chrome then became dominant, and unfortunately now we're in a "Best viewed in Chrome" era, and we're back in an era where some developers only code for the dominant browser.
It was Chrome, Win 7 and smartphones that killed IE. Firefox was extremely niche and was a downright bad browser when 3.0 was released.
Chrome when it came out was much faster than Firefox. It was lighter and worked better.
Also macs had moved to Intel chips a few years before and were actually pretty decent so a lot of people were moving to them.
All of this chipped away at XP, and the few people still running XP machines were diehard XP fans or corps that were dragging their heels upgrading.
I've used Firefox since version 1.5. I don't remember it being bad around version 3.0.
There was a long period during which Firefox was somewhat slow, and I remember the SpiderMonkey team releasing the famous "Are we fast yet?" website, which kept saying no during this period.
It was horrendously bad. If you had more than 10 tabs open it was painful, and I was using it on a relatively high-spec machine for the time (Core 2 Quad, 8GB of RAM).
I was so happy when Chrome came out. Over the years I've tried going back to Firefox and I've gone back to either Chrome, Ungoogled Chromium or Brave.
I was using it on way lower spec machines than that (including a Pentium II machine with 256M of RAM), Firefox handled 10 tabs well at the time.
On Linux though, and blocking ads.
This happened enough that I think it was partially fixed in 3.5.
At the time 3.0 came out I was still in university, and every other student I spoke to had the same experience. People also experienced it at my first place of work in 2009.
This problem was ultimately fixed with 4.0 when they did something different with how multi-tab worked.
Firefox had 30% market share in 2010, I would hardly call that "niche", especially for a browser that didn't come bundled with your operating system or have the marketing power of the world's default search engine.
It also had an outsized impact on the web because it was popular with developers for doing web development.
> Firefox had 30% market share in 2010, I would hardly call that "niche", especially for a browser that didn't come bundled with your operating system or have the marketing power of the world's default search engine.
In 2010 Firefox 3.5 had an estimated global market share of 15-20%. BTW nobody thought any of the stats were accurate at the time, as they were easily skewed. Some counters would report 30%, but it was an estimate and not a reliable one.
On the sites I was building, which were mostly travel sites, e-commerce, and later gambling, Firefox was maybe 5-10%. I am also not in the US. I just didn't see anything like what was reported in site stats for the things I was working on.
Later on it was IE and Chrome, and Firefox was still at maybe 10%. I really cared about compatibility and web standards at the time and made every effort to make sure that the sites worked. So I knew it wasn't a case of "the site doesn't work in this browser".
> It also had an outsized impact on the web because it was popular with developers for doing web development.
So people that used the net heavily used Firefox and people that didn't tended to use IE. So on some sites Firefox usage would be far higher than it would otherwise be.
e.g. People using IE might only use the internet for online shopping, checking mails, so visiting an online shop, booking a flight etc. Whereas many Firefox users would be using Social media, Blogs, Forums or YouTube more heavily. So you will see two completely different pictures depending on what your site's audience would be.
That is why the statistics can be misleading.
You're repeating Google marketing.
I was in perf engineering at the time. We would switch between a handful of string concatenation methods every browser release. It wasn't much about real performance, but just shifting trade-offs in the JIT. But Google's PR team was very good at running in front of the changes and pointing magazines at whichever path they had over-optimized, so they would show an array-concat test where they were much faster, while being much slower at plus-sign concat, but they often left that part out. Anyway, everyone drank the Kool-Aid. 100% of V8's performance advantage over SpiderMonkey was from not attaching debuggers and dev tools, and sadly, Mozilla had to follow. Nowadays we are mostly back to square one (still some niceties from Dalvik missing).
True performance improvements came much later than that.
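For anyone who wasn't around for it, the micro-benchmark genre being described looked roughly like this (a sketch of my own; which variant won genuinely flipped between engine releases, so treat neither side as "the fast one"):

    // Sketch of the classic "+= vs Array.join" string-building micro-benchmark.
    // Which approach wins depends entirely on the engine and release being tested.
    const ITERATIONS = 100_000;
    const CHUNK = "x";

    function concatWithPlus(): string {
      let out = "";
      for (let i = 0; i < ITERATIONS; i++) out += CHUNK;
      return out;
    }

    function concatWithJoin(): string {
      const parts: string[] = [];
      for (let i = 0; i < ITERATIONS; i++) parts.push(CHUNK);
      return parts.join("");
    }

    function time(label: string, fn: () => string): void {
      const start = Date.now();
      fn();
      console.log(`${label}: ${Date.now() - start}ms`);
    }

    time("plus-sign concat", concatWithPlus);
    time("array join", concatWithJoin);

Publishing only the variant your engine happens to win is exactly the selective reporting being described above.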
> you're repeating google marketing.
No I am not. I remember this clearly and all my friends were complaining about it before chrome was released. I just checked the dates. Firefox 3 was released a whole year before Chrome.
I really don't appreciate it when people tell me that I have been swayed by some big company, when my friends and I were complaining about it before we even knew that Google had a browser.
Firefox used to just completely lock up. Wouldn't load a tab. Chrome didn't with the same number of tabs. I am not talking about JS perf speed or anything like that; I am talking about the browser simply not locking up when using more than a few tabs.
IE was so much worse than Chrome will ever be.
I do occasionally think Safari is the new IE though -- not in terms of terribleness but just in terms of holding back the web by being the slowest to implement big new features.
I wouldn't care about Safari at all if Apple allowed any other browser engine on iOS. The fact that they don't allow other browsers to use their own browser engine is a fucking travesty, and it's part of the reason Apple is being sued by the DOJ.
https://www.justice.gov/archives/opa/media/1344546/dl
> being the slowest to implement big new features.
You mean Chrome-only non-standards that Mozilla usually opposes, too
WebGL took forever on mobile. WebGPU is still only partially supported on desktop. Memory64 isn't available yet at all. That's just my short list of things I care about, but any time I look at the boxes on caniuse, Safari is always the red one.
SharedWorker was implemented in Safari years after Chrome/Firefox
I'm sure there are many such examples.
Shared Workers is weird. Safari released them in 2010, one month after Chrome. And then... removed them in 2013.
Firefox implemented them in 2014, Edge in 2020, and Safari re-introduced them in 2022.
There was a proposal to remove the spec entirely in 2015: https://github.com/whatwg/html/issues/315
I don't think I've seen another feature that was shipped in a browser for years, then removed, and then shipped again.
I'd love to know why. The only explanations I've seen seem to be these: https://x.com/xeenon/status/652573047623323648 "The implementation of Shared Web Workers was imposing undesirable constraints on the engine. It never gained any adoption." and this: https://bugs.webkit.org/show_bug.cgi?id=149850#c5 "This feature was originally removed temporarily during multiprocess bring-up, and actual usage on the web has been pretty low. We're willing to reconsider if there is significant demand."
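For anyone who never used it: a SharedWorker is a single worker instance that every same-origin tab can connect to over a MessagePort, which is the cross-tab coordination piece a regular dedicated Worker can't give you. A minimal sketch (file names are mine; this isn't tied to any particular engine's implementation):

    // shared-worker.ts -- one instance serves every tab that connects (sketch).
    const connectedPorts: MessagePort[] = [];

    // "connect" fires once per tab that does `new SharedWorker(...)` for this script.
    self.addEventListener("connect", (event) => {
      const port = (event as MessageEvent).ports[0];
      connectedPorts.push(port);
      port.onmessage = (msg) => {
        // Relay whatever one tab sends to every connected tab.
        for (const p of connectedPorts) p.postMessage(msg.data);
      };
    });

    // page.ts -- run in each tab; assigning onmessage implicitly starts the port.
    const worker = new SharedWorker("shared-worker.js");
    worker.port.onmessage = (e) => console.log("from another tab:", e.data);
    worker.port.postMessage("hello from this tab");

Low adoption makes some sense when you can approximate this with localStorage "storage" events or, later, BroadcastChannel, but it is still odd to ship, unship, and reship a spec'd feature.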
Maybe I'm misremembering, but I reckon it was something about shared memory vulnerabilities.
As someone involved in web dev during the IE5/6/7 days, the short answer is yes.
The longer answer is yes, absolutely.
I don't think the chrome dominance is a result of the IE hate.
Firefox briefly dominated the web in between.
Chrome dominance is a result of Google wanting to control the web and its dominance in the ads and search areas.
> Chrome dominance is a result of Google wanting to control the web and its dominance in the ads and search areas.
Chrome was simply better than Firefox, Internet Explorer, Legacy Edge. I am not a Windows Admin, but Chrome also offered an MSI package whereas Firefox didn't bother until years later. So it was easy for IT to roll out Chrome as part of the standard corpo install image and not an option with Firefox.
As for web development, it offered better JS debugging and had a decent phone emulator built in, and this was back in 2014. I am sure I could also debug phones with some reverse-proxy shenanigans via Fiddler and some open-source tooling whose name I forget now.
You can see here https://www.w3counter.com/trends that Safari, Chrome, and IE8 all had bumps around that time, and looking at the IE-only chart it seems like the boost in IE8 might have actually slowed IE's overall decline a bit.
But the trend for IE started before this and continued after it.
Chrome sucks but it’s miles better than IE ever was.
If you read the article, one of the buttons on the bar prompted people to upgrade to the latest version of IE.
The hate was not against IE, but against a popular tool that fought against shared standards.
It was worse than IE not adopting standards. It was a capricious browser, would crash and misbehave for arbitrary reasons, and had an almost perverse implementation of web rendering.
People try to equate it to Safari now but that's just not comparable. Safari will render something badly or not support a CSS decorator that you'd really like to use, but it will rarely crash, go into an infinite URL-fetching loop, or arbitrarily fail to recognize random HTML tags.
IE didn’t fight anything; it merely existed. There was no constant barrage of features that you ‘had to’ make use of to ‘keep up with the times’. Microsoft correctly decided that the Web was done in ~1999. They even had ‘Electron’ in the form of HTAs, except it wasn’t remotely as bad.
I don't miss IE 4/5/6 [etc. subversion hell]. Supporting these tripled the time it took to build any site for the WWW. Pick any W3C standard: some chance it worked, on one of the browsers but not the others. Some chance each had entirely incompatible workarounds. Documentation? Good luck. What did exist tended to deny that HTML standards other than their own existed, so they didn't even give any clue how to solve this. Had to support them, never enjoyed it. There was nothing fun or rewarding about supporting any of them.
Now the YouTube layout is bloated and has been hitting diminishing returns for a while.
The writing is on the wall.
Maybe we need to hire construction teams to break into peoples' houses and change them every 5 seconds.
IE6 was a great browser. It was superior in every way (at the time it was released). So good that no other browser was needed. And that's when it started to become garbage.
One person's conspiracy is another person's public service. Thank you for helping to kill IE6
Hahaha, I love that social proof worked. The Docs guys thought you guys had approval, and the YouTube managers thought you were following through on a bigger initiative from across the company, started by Docs. Lucky break!
Man, I love these tales of people doing the right thing cutting through the red tape.
This was a delightful read. You have done the world a service there, truly!
This is why small scrappy (at the time the YouTube eng team was small) companies get shit done and big companies with process and controls take forever.
The rogues take responsibility, think carefully, act carefully.
The problem is that pretty much all small scrappy companies grow up to be large behemoths that end up with the same processes and controls. The way workflows are created while being small and scrappy doesn't lend itself well to having more than one dev working on something with no guidance for how the devs are to move forward. One dev wants to take 3 left turns, another dev wants to take a simple right turn. After that, you start having meetings to lay out code and how to handle merges, and the next thing you know you have processes and controls.
Can we have a conspiracy to kill IPv4 next?
But only if we also kill NATs along the way; otherwise it wouldn't have a tangible effect.