It’s kinda understandable, but you should reasonably assume that Firefox did it first and anything Chrome does after the fact followed suit, unless Google thinks it will help them keep a tighter grip on the market, like Manifest V3.
I mean, Firefox had profiles for years before Chrome did. Then they added Multi-Account Containers, which Chrome still doesn’t have. And now they have a new, distinct feature that they’ve given the same name, “profiles,” even though it’s only slightly different from the already existing profiles feature.
Projectors are simply better than OLEDs for movie watching, and no amount of TV brain can change this.
The cult of OLED has convinced people that one single spec—perfect blacks—matters more than everything else. But cinema isn’t about looking into a glowing rectangle with black pixels. It’s about immersion, scale, and light that feels like the real world. If you care about movies as movies, projection isn’t just better—it’s the only serious choice.
Start with size. Movies were never meant to be watched at 65 or 77 inches. A decent 4K laser projector can give you 120 to 150 inches without breaking a sweat. That’s an image that swallows your entire field of vision and changes the way you perceive the film. Wide shots become landscapes. Close-ups feel intimate. Action feels overwhelming in the best way.
OLEDs top out at 97 inches in the consumer market, and anything beyond that is microLED wall territory—Samsung’s “The Wall” runs over $200,000 for 146 inches. Meanwhile, a 4K laser projector with a proper screen costs $5,000–8,000 and achieves the same or greater size. The scale difference alone makes OLED feel like a toy by comparison.
And then there’s light. OLED is emissive. Each pixel is a miniature flashlight shooting photons directly into your eyes. Yes, it produces perfect blacks, but it also produces eye strain. Your pupils are constantly constricting and dilating to deal with rapid HDR changes.
Projectors use reflected light, which is how our eyes evolved to see the world. A laser beam hits a screen, scatters, and bounces back softly and naturally. That’s why projection looks filmic instead of hyper-real. It’s why a three-hour epic is comfortable on a projector and fatiguing on OLED. When movies are mastered for theatrical presentation, they’re mastered on projectors. That reflected light is the reference, not a self-glowing slab.
OLED fans will argue that projectors can’t work in anything but a dark room. That used to be true when we were talking about bulb-based machines and matte white screens. But today’s high-end laser projectors have solved this.
Laser phosphor and RGB models push 3,000–5,000+ lumens without losing color accuracy, and when paired with an ALR (ambient light rejecting) screen, they can produce a bright, contrast-rich image in a normal living room with windows or lamps on. ALR screens are designed to selectively reflect light from the projector while absorbing or deflecting ambient light from other angles. The result is a 120–150 inch image that’s still crisp and cinematic even in spaces where an OLED would otherwise seem like the only option. That completely destroys the “dark room only” myth.
Aspect ratio flexibility is another win for projectors. Most movies aren’t shot in 16:9, but OLEDs are stuck with that rectangle. Watching Lawrence of Arabia or Dune on OLED means black bars eating up screen space. On a projection setup with masking, you can run constant image height—CinemaScope films expand to fill the frame completely. No bars. No compromises. It looks exactly as intended, and the image dominates the room the way it’s supposed to.
Motion also reads differently. OLEDs are sample-and-hold displays, which smear fast movement unless you enable black-frame insertion. That drops brightness and introduces flicker. Projectors have a different cadence. They mimic the way film frames roll in theatres, delivering movement that feels cinematic instead of soap-opera smooth. The difference isn’t subtle once you’ve seen both.
HDR is another place where OLED flexes numbers but misses the point. Yes, OLEDs can spike to 1,000–1,500 nits. But theatrical reference brightness is 48 nits (14 foot-lamberts). Movies aren’t designed to blind you with specular highlights. They’re designed to stretch across a massive screen with consistent luminance. A sunset on OLED is a bright pixel cluster. On a projector, it’s an expanse of color that fills twelve feet of wall. It feels expansive rather than harsh.
Color reproduction is where modern projectors push into true cinema territory. RGB laser light engines often exceed DCI-P3 and reach into Rec.2020. That means you’re seeing color closer to what filmmakers master for. On a massive screen, that richness envelops you. Reds feel deep, blues glow, and alien landscapes look otherworldly. On OLED, even with good calibration, you’re still looking at colors confined to a small rectangle.
Projectors also offer flexibility OLED cannot match. One projector can be scaled down to 90 inches for casual TV or pushed to 150 inches for movie night. Move houses? Resize the screen. Change aspect ratios? Use masking. Upgrade later? Swap the projector but keep the screen. An OLED is a glowing slab of fixed size. If you want bigger, you replace the whole thing.
And then there’s cost. You can get a cheap 1080p projector for under $100—something that instantly delivers a big-screen experience for next to nothing. Midrange models with decent HDR and brightness run a few hundred. A top-tier 4K ultra short throw laser with an ALR screen and HDR support might be $5,000–8,000. Compare that to an OLED panel at 97 inches, which costs around $30,000, or a microLED wall at 146 inches for over $200,000. The value proposition isn’t close. Projection scales from cheap-and-cheerful to reference-grade cinema, while OLED scales from expensive to absurd.
And this is exactly why people still go to the movies. It’s not because OLEDs don’t look great—they do. It’s because cinema is projection. It’s immersive size, reflected light, and framing designed for the big screen. For $15, you can get that at your local theatre. For $100, you can get it in your living room with an entry-level projector. For $5,000, you can have glorious 4K HDR laser projection beamed at a 150-inch canvas. OLED can’t touch that.
So let’s be clear: OLED has one advantage, and that’s pixel-level blackness. But movies aren’t about spec sheets. Movies are about experience. And in every way that matters—size, immersion, comfort, color, flexibility, cost, and alignment with the theatrical standard—projectors leave OLED in the dust.
OLED gives you perfect black bars. Projectors give you cinema.
It’s two or three years old. Maybe you just have lower standards than I do, but just because it puts a 150” image on the wall doesn’t mean it’s actually going to be watchable at that size. And just because a projector advertises something doesn’t mean that’s gonna be the case.
Like I said, I have a $600 projector (might have been $700, hard to remember; it’s a BenQ that’s stored away), and it’s not doing 150”. It says it can do 100”, but that’s generous as well. It would be best at something like 80”. And then you’re back to just competing with OLEDs. Sorry, but that part of the argument was just disingenuous.
LTT has many videos about trying to use projectors. You would think they’d use them more if projectors actually were that much better. They have spent tens of thousands of dollars on projectors, and yet they still use regular TVs for almost everything.
And then there’s cost. You can get a cheap 1080p projector for under $100—something that instantly delivers a big-screen experience for next to nothing.
So no, you’re not getting big-screen performance without spending thousands of dollars. If you’re at the point of buying an OLED that costs $6,000, then you might want to look into projectors. Until then, lol no.
I have a $600 projector, and lol no, it’s not going to compete with even a $100 TV of any quality at being visible during daylight hours. It’s hardly visible even next to a campfire.
I think it comes down to what you're trying to teach. Every language has pros and cons. Let's cover three things that beginners absolutely need to know.
General language design
Build systems
Package management
Ideally you choose the simplest language that accomplishes all of these things.
Let's start with build systems. Many languages have good build systems, many have very bad build systems.
Some examples of good build systems:
Rust
Elixir
Ruby
Java (this is probably the most contentious point I will make)
Some examples of bad build systems:
Python
JavaScript (this is probably the second most contentious point I will make)
C
Scala
What makes some systems 'good' and some 'bad' is the amount of effort required to get even the simplest project running. With Rust, Elixir, and Ruby (whose tools are all either based on or inspired by Ruby's Bundler), it's a single command to get started. You get one file to manage, a single command to build, and a single command to release.
With the others it's either piecemeal or multi-step. With JS you're building your own commands, depending on which package manager you've chosen and which build tool you're using. You have to understand package management in general, your package manager, your build tool, how that build tool works, and how it interacts with every other part of your system. There's no one right way. Same with Python: even though it claims the Zen of Python, there are over 15 build tools you can use (and that's just the popular ones). C has CMake and Make and maybe another one I can't remember. Actually, if all you wanted to learn was build tooling, then C is probably a great choice; same for language design. But trying to learn both at the same time is just not ideal for someone already overwhelmed by how different programming is from 'real life'. Scala's build tooling is just bad: sbt pulls from the same Maven repository ecosystem, but because it's essentially its own layer on top of that, it's just not enjoyable to use. There are plenty of other examples here; I'm just giving the ones that popped to the top of my mind.
Now, package management. Most languages have fine package management. There are different ways of managing these 'artifactories', 'repositories', and so on.
All JVM languages pretty much work alike, so you get a very consistent base with that. Almost any library written for Java will work for any other JVM language: Kotlin, Clojure, Scala, JRuby, Jython, etc.
Ruby, Python, and JavaScript have very similar approaches to package management, but JavaScript has some pain points around node_modules depending on which package manager CLI you use. This is the most annoying thing about JS and something a bunch of newbies have no business worrying about.
Rust's package management is very good, but it has to deal with linking native libraries, so it can be confusing for anyone who doesn't already understand what's happening.
General language design:
What are you trying to teach? Are you trying to teach object oriented programming? Are you trying to teach enterprise related architecture and design? Are you trying to get someone ready for a real life job in the "real world"? Every one of these can have a different answer, not just based on what industry that person eventually wants to go into, but on what the course is trying to teach, what the student is trying to get out of it, etc.
I have biases, just like everyone. I'll list off my favorite languages in order (this means languages I like even if I don't ever program in them):
Kotlin
Ruby
Elixir
Rust
C#
I would never use Kotlin to teach an entry level 101 course, because I think the prerequisite to using Kotlin is to understand Java.
Elixir is the same way: you really need to understand Erlang first. Rust isn't really on that list either, because getting a program to compile is nigh impossible for a beginner; they wouldn't understand any of it.
I think scripting languages are really well suited to teaching a beginner 101 course, which really leaves only a few popular options: Ruby, Python, and JavaScript. Ruby and Python come installed by default on almost every Linux distribution and on Macs (I think macOS might not include them anymore as of very recently; can't remember). JavaScript is a great language for programmers to learn; I just don't think it gets out of your way enough to actually teach with. Ruby is my choice, and this is how easy it is for a beginner to get started (after install):
gem install bundler # explain what bundler is
mkdir mynewproject
cd mynewproject
bundler init # working gemfile displaying package management
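For reference, that last step drops a minimal Gemfile in the project directory. Recent Bundler versions generate something roughly like this (the exact template varies by Bundler version):

```ruby
# frozen_string_literal: true

# Every gem this project depends on gets declared here;
# `bundle install` then resolves and locks the versions.
source "https://rubygems.org"

# gem "rails"
```

One file, one obvious place to add dependencies, which is exactly the simplicity I want when teaching package management.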
I've actually been helping a friend with his Python studies, and they just avoid the concept of build tools entirely. Essentially it's just individual files: make a file, then run python blah.py. This is fine, but then you start teaching that whitespace matters, and students are confused about why there isn't any sort of delimitation between blocks. The friend I'm helping got confused because he couldn't tell when whitespace mattered and when it didn't. Also, I think teaching list comprehensions (which aren't used in 99% of languages) is just a waste of time. Teach 'normal' functional patterns like .each instead.
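The .each point fits in a couple of lines of Ruby (the names array here is just made up for illustration):

```ruby
names = ["ada", "grace", "alan"]

# .each hands every element to the block in turn: "for each name, do this".
# The same shape carries over to .map, .select, etc., and to similar
# methods in most other languages.
names.each do |name|
  puts name.capitalize
end

# .map is the same idea but builds a new array from the block's results:
capitalized = names.map { |name| name.capitalize }
# capitalized == ["Ada", "Grace", "Alan"]
```

Compare that with the Python comprehension `[name.capitalize() for name in names]`: it does the same thing, but the inside-out syntax is unique to a handful of languages, so the mental model doesn't transfer.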
I wouldn't stick with Ruby past teaching language structure and a few simple scripts, though. Don't try to build anything large in it. Switch to C# or Java for that, where the student can start to learn the importance of types and of managing many modules and libraries.
But of course all of this is just my opinion. I learned Ruby and Python in the same semester, from two different teachers in different courses, when I was in school, and Ruby was vastly easier for everyone in my class to learn. I think Ruby is great for teaching the basics, including build and package management and even creating and debugging libraries. It will fall short in many ways for more advanced stuff, though (same with Python).
This result reaffirms that Python is a great language for those early in their career.
This is a really bad conclusion to come to. In my experience (teaching others), Python is one of the worst starting languages because it differs so much from other languages. I’m helping teach someone right now, and whitespace being significant has poisoned their mind against all whitespace. They think it matters in function parameter lists, before and after quotes, etc. This knowledge isn’t transferable, hence the large portion of people who only use Python (as indicated in the survey).
But if they are like most medium to large businesses, this is an incredible waste of cloud compute expense (which also maps to environmental harm via spent energy).
I mean… come on. If you’re pulling that card, then stop using Python altogether. You said it yourself (talking to the author here): 50% of Python use is ML, the largest user of energy on the planet.
The productive delta between those using it and those who avoid it is simply too great (estimated at about 30% greater productivity with AI).
Hasn’t this been proven wrong, like, numerous times now? There’s actually a productivity decrease, since you’re constantly correcting the AI. It’s essentially like getting an intern: all your energy is spent helping the intern.
I’m not going to comment on the venv/uv stuff…
I understand I’m in a Python community complaining about a post about Python, and I’m sure to get downvoted, but I’ve literally taught or tutored numerous people trying to learn Python, and every time my conclusion is that the world needs to stop using Python so much, especially for newbies.
"After the United Kingdom demanded that Apple create a backdoor that would allow government officials globally to spy on encrypted data, Apple decided to simply turn off encryption services in the UK rather than risk exposing its customers to snooping.
Apple had previously allowed end-to-end encryption of data on UK devices through its Advanced Data Protection (ADP) tool, but that ended Friday, a spokesperson said in a lengthy statement.
"Apple can no longer offer Advanced Data Protection (ADP) in the United Kingdom to new users and current UK users will eventually need to disable this security feature," Apple said."