- 3 Posts
- 1.57K Comments
I like typing yay and getting updates.
towerful@programming.dev to Television@piefed.social • Shogun Season 2 With Hiroyuki Sanada Now in Production for 7 Months in Vancouver
3 · 2 days ago
BES is amazing. Absolutely fantastic.
towerful@programming.dev to Technology@lemmy.world • SpaceX is seeking FCC approval to launch 1M satellites into space; SpaceX claims the fleet will orbit the Earth and use the sun to power AI data centers (English)
33 · 2 days ago
Scott Manley has a video on this: https://youtu.be/DCto6UkBJoI
My takeaway is that it isn’t unfeasible. We already have satellites that do a couple of kilowatts, so a cluster of them might make sense. In isolation, it makes sense.
But there is launch cost, and the fact that de-orbiting/de-commissioning is a write-off, and the fact that preferred orbits (lots of sun) will very quickly become unavailable.
So there is kind of a window where you get the preferred orbit, your efficiency is good enough, and your launch costs are low enough.
But it’s junk.
It’s literally investing in junk.
There is no way this is a legitimate investment.
It has a finite life, regardless of how you stretch your tech. At some point, it can’t stay in orbit.
It’s AI. There is no way humans are in a position to lock in 4 years of hardware.
It’s satellites. There are so many factors outside of our control (beyond getting the launch and orbit right) that there is a massive failure rate.
It’s rockets. They are controlled explosives with one shot to get it right. Again, a massive failure rate.
It just doesn’t make sense.
It’s feasible, and I’m sure humanity would learn a lot. But AI is not a good use of kilowatts of power in space. AI is not a good use of the finite resources of Earth to launch satellites (never mind a million of them?!). AI is not a good reason to pollute the “good” bits of LEO.
towerful@programming.dev to Atheism@lemmy.world • The Texas Board of Education Plans To Force Kids To Read the Bible in Public Schools (English)
20 · 2 days ago
I’ve experienced reciting the Pledge of Allegiance and the Lord’s Prayer.
It’s all indoctrination
towerful@programming.dev to Technology@lemmy.world • The TV industry finally concedes that the future may not be in 8K (English)
26 · 2 days ago
Yeh, do 60fps, 30-bit color… and I guess HDR?
Do things that people can actually appreciate.
And do them in the way that utilises the new tech. 60fps looks completely different from 24fps… Work with that, it’s a new media format. Express your talent
towerful@programming.dev to Kubernetes@programming.dev • Ingress NGINX: Statement from the Kubernetes Steering and Security Response Committees
2 · 4 days ago
The retirement of Ingress NGINX was announced for March 2026, after years of public warnings that the project was in dire need of contributors and maintainers. There will be no more releases for bug fixes, security patches, or any updates of any kind after the project is retired.
This cannot be ignored, brushed off, or left until the last minute to address. We cannot overstate the severity of this situation or the importance of beginning migration to alternatives like Gateway API or one of the many third-party Ingress controllers immediately.

I know it’s literally the first paragraph, but I thought it worth commenting for those that only read the title & comments.
towerful@programming.dev to pics@lemmy.world • Computer wiring tunnel, abandoned coal power plant. Photo by Bryan Buckley.
4 · 4 days ago
I’d take each of your metrics and multiply it by 10, then multiply it by another 10 for everything you haven’t thought about, then probably double it for redundancy.
Because “fire temp” is meaningless in isolation. You need to know the temperature is evenly distributed (so multiple temperature probes). You need to know the temperature inside and the temperature outside (so you know your furnace isn’t literally melting). You need to know it’s not building pressure. You need to know it’s burning as cleanly as possible: gas inflow and outflow, clarity of gas in and out, temperature of gas in and out, and the status of the various gas delivery systems (fans with motor current/voltage/rpm/temp, filters, louvres, valves, pressures, flow rates). And you need to know ash is being removed correctly: that the ash grates, shakers or whatever are working, that the ash is cooling correctly, that it’s being transported away, etc.
The gas out will likely go through some heat recovery stages, so you need to know gas flow through those and water flow through those. Then it will likely be scrubbed of harmful chemicals, so you need to know pressures, flow rates etc for all that.
And every motor will have voltage/current/rpm/temperature measurements. Every valve will have a commanded position and an actual position. Every pipe will have pressure and temperature sensors.

The multiple fire-temperature probes would then be condensed into a pertinent value and a “good” or “fault” condition for the front panel display.
The multiple air inlets would be condensed into pertinent information and a good/fault condition.
Pipes of a process will have temperature/pressure good/fault conditions (maybe a low/good/over?).

And in the old days, before microprocessors and serial communications, it would have been a local-to-sensors control/indicator panel with every reading, then a feed back to the control room where it would be “summarised”. So hundreds of signals from each local control/indicator panel.
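That condensing step is easy to picture in code. A hypothetical sketch, where the probe names, thresholds, and fault rules are all made up for illustration and not from any real plant:

```javascript
// Hypothetical sketch: condense several fire-temperature probes into one
// pertinent value plus a good/fault condition for the front panel display.
// Probe names and thresholds are illustrative only.
function condenseProbes(readings, { min, max, maxSpread }) {
  const values = Object.values(readings);
  const avg = values.reduce((a, b) => a + b, 0) / values.length;
  const spread = Math.max(...values) - Math.min(...values);
  // Fault if any probe is out of range, or if the probes disagree too much
  // (uneven temperature distribution, or a failed sensor).
  const fault = values.some((v) => v < min || v > max) || spread > maxSpread;
  return { value: Math.round(avg), status: fault ? "FAULT" : "GOOD" };
}

const fireTemps = { probeA: 842, probeB: 851, probeC: 847 };
console.log(condenseProbes(fireTemps, { min: 700, max: 950, maxSpread: 30 }));
// → { value: 847, status: 'GOOD' }
```

Multiply that by every subsystem (air inlets, pipes, motors, valves) and you get the hundreds of condensed signals per local panel.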
Imagine if the control room commanded a certain condition, but it wasn’t being achieved because a valve was stuck or because some local control over-rode it.
How would the control room operators know where to start? Just guess?
When you see a dangerous condition building, you do whatever is needed to get it under control, and you can’t do that blind.
You need to know why.
towerful@programming.dev to Selfhosted@lemmy.world • Getting worn out with all these docker images and CLI hosted apps (English)
41 · 4 days ago
I love CLI and config files, so I can write some scripts to automate it all.
It documents itself.
Whenever I have to do GUI stuff I always forget a step or do things out of order or something.
towerful@programming.dev to Programming@programming.dev • 10 things I learned from burning myself out with AI coding agents
1 · 12 days ago
Spoiler alert: it’s just yaml.
towerful@programming.dev to Programming@programming.dev • Majority of CEOs report zero payoff from AI splurge
141 · 13 days ago
The bubble is propped up by governments.
They don’t need “as good as an employee but faster”. They just need “faster”, so they can process gathered data on an enormous scale to filter out the “potentially good” from the “not worth looking at”.
Then they use employees to actually assess the “potentially good” data.

Seriously, intelligence agencies don’t need “good AI”, they just need “good enough AI”.
And they have that already.
towerful@programming.dev to World News@lemmy.world • Trump threatens 200% tariffs on French wine and champagne after Macron refuses to join 'board of peace' (English)
4 · 13 days ago
What the fuck is a french fry? You mean Freedom Fries?
towerful@programming.dev to Technology@beehaw.org • Spotify’s 3rd price hike in 2.5 years hints at potential new normal
2 · 15 days ago
No idea. Haven’t started digging into it yet.
towerful@programming.dev to Ukraine@sopuli.xyz • Ukraine forces killed 27 Russians for every soldier lost at Kupiansk
142 · 17 days ago
A 27:1 K/D would get you accused of cheating in video games.
Because this stat isn’t really tracked, hyped, or published, I’m going by DDG’s AI assist, which suggests the US K/D in Iraq was 44:1:
The U.S. military suffered approximately 4,492 deaths and around 32,292 wounded during the Iraq War, while estimates suggest around 200,000 Iraqi civilians were killed. This results in a rough kill-to-death ratio of about 44:1, favoring U.S. forces, though this does not account for all combatants and the complexities of the conflict.
Considering that Ukraine isn’t killing civilians… Classic AI bullshit uselessness.
If I killed 27 enemy aggressors while defending my country, I would die happy. I don’t ever want to be in that position, I don’t think anyone should ever be in that position. But that is an achievement, under the circumstances, to be proud of
towerful@programming.dev to Technology@beehaw.org • Spotify’s 3rd price hike in 2.5 years hints at potential new normal
9 · 17 days ago
TIDAL’s continued awesomeness suggests suitable alternatives.
Spotify pays Joe Rogan how much? And pays artists how little?
TIDAL does music.
I changed a few years ago, and all I miss are the integrations.
I’m lucky that I have decent speakers & dac on my desktop, and decent IEMs. So I can listen to music where I want.
But I can’t buy a “tidal speaker” in the way I could buy a “Spotify speaker”.
But I’m arrogantly confident enough to waste some money solving this with Home Assistant, some RPis/NUCs, and some speakers. I feel I don’t need (and actually don’t want) a vendor-locked “just works” solution, and I’m happy rolling my own.
towerful@programming.dev to politics on db0@lemmy.dbzer0.com • Trump Suggests U.S. ‘Shouldn’t Even Have’ November Midterms
7 · 18 days ago
And so it begins.
towerful@programming.dev to Linux@programming.dev • 𝙛𝙧𝙚𝙚 Linux video editing 𝘽𝙀𝙏𝙏𝙀𝙍 than paid: kdenlive - BreadOnPenguins
4 · 20 days ago
Yeh, either use proxy editing (where you work with low-res versions until export), or try a more suitable intermediate codec.
I presume you are editing h.264 or something else with “temporal compression”. Essentially, there are a few full frames every second, and the other frames are stored as changes from the previous frame. That massively reduces file size, but makes random access expensive as hell.

Something like ProRes or DNxHD (I’m sure there are more) stores every frame in full, so decoding doesn’t require loading the last full frame and applying the changes to get the current frame.
You will end up with massive files (compared to h.264 etc), but they should run a lot better for editing.
And they are visually lossless (intra-frame, only lightly compressed), so you convert the source footage once and then just work away.

Really high-res projects will combine both of these: proxy editing with intermediate codecs.
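As a concrete sketch, the transcode to an intermediate codec can be done with ffmpeg before importing into the editor. The flags below are real ffmpeg options (DNxHR HQ video with uncompressed PCM audio), but the file names are placeholders; treat it as a starting point, not a definitive recipe:

```javascript
// Hypothetical sketch: build an ffmpeg command line that transcodes a
// temporally-compressed h.264 source into an intra-frame DNxHR file.
// Run the printed command yourself; "source.mp4"/"edit.mov" are placeholders.
function intermediateArgs(input, output) {
  return [
    "-i", input,
    "-c:v", "dnxhd",          // ffmpeg's DNxHD/DNxHR encoder
    "-profile:v", "dnxhr_hq", // every frame stored whole: cheap random access
    "-c:a", "pcm_s16le",      // uncompressed PCM audio
    output,
  ];
}

console.log(["ffmpeg", ...intermediateArgs("source.mp4", "edit.mov")].join(" "));
// → ffmpeg -i source.mp4 -c:v dnxhd -profile:v dnxhr_hq -c:a pcm_s16le edit.mov
```

For ProRes you’d swap the video codec options for ffmpeg’s prores encoder; the shape of the command is the same.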
towerful@programming.dev to United Kingdom@feddit.uk • Keir Starmer abandons plans for compulsory digital ID (English)
27 · 20 days ago
I’d rather they u-turned on shitty ideas than waffle-stomp them through.
How the fuck the OSA seemed to just drift through is astounding
towerful@programming.dev to JavaScript@programming.dev • Stop turning everything into arrays (and do less work instead)
2 · 21 days ago
2·21 days agoI think it’s the
take(10)that makes it performant.
I guess things likeany()will also short-circuit having to process the entire array through a chain of operators before acting on a subset of the final processed array.
If that makes sense…But if you are filtering then mapping an array, an iterator won’t help. You need to go through every element of the source array (to filter) then every element of the filtered array (to map) anyway. There is no opportunity to short circuit, so an iterator won’t help.
Edit: and I bet there is overhead keeping iterators active/alive and context switching as each new iteration gets requested, instead of batch processing an array then moving on to the next batch process.
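To make the short-circuiting concrete, here’s a minimal sketch using hand-rolled generators (rather than the built-in iterator helpers, so it runs on any modern runtime); the counters just show how many elements the filter predicate actually touches:

```javascript
// Why take(10) makes a lazy chain cheap: the eager array chain visits
// every element, while the lazy generator chain stops once it has 10.
const data = Array.from({ length: 1_000_000 }, (_, i) => i);

// Eager: filter and map each allocate a full intermediate array.
let eagerVisits = 0;
const eagerResult = data
  .filter((n) => { eagerVisits++; return n % 2 === 0; })
  .map((n) => n * n)
  .slice(0, 10);

// Lazy: hand-rolled generators, pulled one value at a time.
function* filterLazy(iter, pred) {
  for (const v of iter) if (pred(v)) yield v;
}
function* mapLazy(iter, fn) {
  for (const v of iter) yield fn(v);
}
function* take(iter, n) {
  if (n <= 0) return;
  for (const v of iter) {
    yield v;
    if (--n === 0) return; // stop pulling from upstream entirely
  }
}

let lazyVisits = 0;
const lazyResult = [
  ...take(
    mapLazy(
      filterLazy(data, (n) => { lazyVisits++; return n % 2 === 0; }),
      (n) => n * n
    ),
    10
  ),
];

console.log(eagerVisits, lazyVisits); // 1000000 19
console.log(lazyResult); // [ 0, 4, 16, 36, 64, 100, 144, 196, 256, 324 ]
```

Same ten results, but the lazy chain only ever looked at 19 source elements, and it never allocated the two ~500k-element intermediate arrays the eager chain builds before slicing.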

Even if the snowman was built on the road, the driver is at fault for driving carelessly and not paying attention.
Nobody else was hurt. Nobody else’s property was damaged. There is no one to be held liable.
This guy drove into a snowman, regardless of where it was.
A static object that only moves in Christmas music.
If it was a snowbank, same deal.
If it was a parked car, same deal.
If it was a fallen telephone/power pole, same deal.
If it was a pile of cinderblocks that fell off the back of a truck, same deal.
The guy either wasn’t paying attention or was being an asshole.
Either way, that’s careless driving. The asshole is at fault.