

What do you mean you don’t wait until five different manual interventions are required before you update?


Training is a continuous expenditure. We’re nearly ten years into this craze and we’re still continuously pumping out new models. Whether they’re trained from scratch or not is immaterial. Both processes still consume energy. If you want to justify the claim that training cost is negligible, you would have to show that this cost is actually going down over time and that it’s going down sufficiently quickly.


It doesn’t look like that energy consumption blog post accounts for the cost of training the model. Otherwise, it would be telling us how many queries/sessions are assumed to run over the course of a model’s lifetime.

Those 20 years of eating directly serve our primary evolutionary goal that is the continued existence of human beings.
Most of us also have the goal of enjoying our time here. Food also contributes towards that.
If it’s a liar that lies every time or most of the time, then yeah, don’t bother.
why […] am I asking it questions?
I can’t actually think of any specific scenario where something is unimportant enough not to matter but important enough that you’d ask. What I was originally thinking of were actually scenarios where I planned to verify the information later, but in my head I mistakenly conflated that with not verifying it at all.
Obviously, don’t rely on them to read important emails for you. But so many things don’t need additional checking. We’ve all done at least a decade of schooling. We all know basic math, science, and history. When we forget things, all it takes is a small reminder to get it back. Our brains are capable of recognizing whether we’ve seen something before or not. We’re also capable of reasoning to determine whether something we read is consistent with everything else we know.
So many other things are also so unimportant that it doesn’t matter at all if you’re wrong. For example, some actor looks familiar, it lies to you about what film they were in, and you believe it. Is your life any worse off for it?
Do you not cross-reference multiple archived news articles and seek out past attendees to remind yourself of what Britney Spears wore at her last concert? smh
I like to keep to the same routine when possible. Birthdays and holidays interrupt that. No good. I can’t do much about holidays, but since my birthday is supposed to be my day, I can demand this from everyone around me.


I’m pretty sure OP is making chili to eat, not for a competition.


Multiple different accounts and their posts are all formatted exactly the same, so this does seem to be the case.
This is probably something that the instance admins can change.
Edit: It seems like they’re looking into it.
They never claimed that it was the whole thing. Only that it was part of it.


Our local Costco has these. They’re the previous day’s chicken and are sold at a discount.


TIL oil barrels are held together by giant Phillips head screws.


How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
We have the term AGI because we sometimes want to communicate something more specific, and AI is too broad of a term.
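To make “predict what might come next” concrete, here’s a toy sketch of next-token prediction using simple bigram counts. This is only an illustration of the objective; actual generative pre-training optimizes the same kind of prediction with a neural network over enormous corpora, not a count table.

```python
# Toy sketch of "predict what comes next": a bigram count model.
# Real pre-training uses a neural net and gradient descent, but the
# training signal is the same idea: which token tends to follow which.
from collections import Counter, defaultdict

def train_bigram(text):
    """Count how often each token follows each other token."""
    counts = defaultdict(Counter)
    tokens = text.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most frequently observed next token, or None."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

model = train_bigram("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # "cat" follows "the" most often here
```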


Definitely. For the average home cook, that convenience is much more valuable than making your knives extra sharp.


Do they have the means to deliver from the rear? I would think that it’s more convenient for the business too if they could do that.


Do the diamond stones need any kind of maintenance? I’ve read that you need to regularly flatten the surface of your typical whetstone.


Yeah, that much I agree with.
You know what else takes far less energy than training a single model? One query. Yet you argue that queries are the main contributor to energy consumption. Why is that? Because there’s a very high volume of them, which drives up the total. At the end of the day, it’s this total energy consumption that matters, not the cost of doing something once. Apply the same logic to training: look at the total energy spent across all training runs, not just the cost of one.
We’re talking about AI here because that’s the topic of this thread. I’ve never seen anyone say that it’s the only problem worth addressing. Plus, if you want to compare the energy usage of ads (or anything else) to that of AI, you would first need to know how much energy AI is actually using.