I mean if the goal was to discourage union membership, then I can understand why they did that. Obviously that backfired…
They are technically correct in that it’s the developers’ fault for tying themselves to a proprietary game engine.
On the other hand, Godot was nowhere near mature when the Slay the Spire devs most likely started development. They would be dumb to use Unity for their next game 🤷
I’d guess that companies that failed to turn a profit when money was cheap are most likely doomed. Not all of the hype companies are like that, though. Some could be barely profitable, but shareholder pressure might push them toward heavier monetization practices.
I found steamdb.info. According to their data, Godot seems to be growing steadily.
In this case, does packaging mean packaging the silicon die into a processor or SoC that can then be used? Or does it mean the assembly of the end product, such as a phone or laptop?
In either case it seems like a moot point to claim this is a major long-term issue. Shouldn’t assembly lines for such products be much easier to build than a chip fab?
Also, the fact that the Arizona fab only produces a small fraction of TSMC’s total output is kind of obvious. There are a lot of chip fabs, so US encouragement of domestic production has to be an ongoing effort.
Seems like Hollywood. Dangling career opportunities as a reward for consenting to unwanted advances, etc.
Yes. OSRS reviews on steam are still listed as very positive.
That’s a job for the parents though, isn’t it? And for early teenagers, people seem to forget what a positive influence the internet can have on their lives. E.g., many IT workers started fiddling around with computers when they were quite young.
Obviously that has to be reflected in the price of the product. Presumably even more so with storage.
Also, there might be a use case where cost is paramount and the drive would see very limited writes.
I’ve got a personal anecdote that’s not quite the same: I’ve bought a bunch of flash chips from China to use with retro games. Those are often salvaged, but they are also cheap and readily available. It doesn’t matter if the chips can’t take many write cycles when you only flash them a couple of times.
Should the delays and subsequent cost overruns then simply be attributed to increased regulatory complexity or to corporate greed?
Looking at the list of reactors in France, most of the builds during the last millennium were completed in roughly 10 years. Then there was a gap, and the new one is taking far longer than its predecessors.
The same thing has happened in many other countries, including Finland, where we first got 4 reactors in 6–10 years each, and then, after a 25-year gap, the next reactor was a clusterfuck that took almost 20 years to build.
Both of these reactors are of the same design, and the issues are at least partially attributed to the company having forgotten how to manage such large projects during the years-long gap in construction.
Competent nuclear engineers and technicians have retired without being able to pass on their know-how, and cutting-edge nuclear-related industries have disappeared or been converted.
This same fear has been enough to fund the SLS and Ariane programs: basically, to avoid losing a capability in case it’s needed later on. For some reason it doesn’t seem to apply to nuclear. And now people are complaining that building new reactors is expensive, arguably at least partially because the supply chains no longer exist at the same scale as before.
Most likely you won’t even notice some of the changes. Reasonably believable cars can already be added to films in post, so there’s no reason humans couldn’t be. This might be driven not only by AI but by more general tech developments in VFX and the like.
That’s the point of having energy abundance. When electricity costs are low enough, it wouldn’t matter if the source of synthetic fuel weren’t the most energy-efficient one.
But the naysayers will argue that your problem is not novel and a solution can be trivially deduced from the training data. Right?
I really dislike the simplified word-predictor explanation given for how LLMs work. It makes the model seem like a lookup table, while ignoring the nuances of what makes it work so well.
The same guy posted this on X. Apparently a Chinese university has replicated at least the diamagnetism claimed in the paper.
Shouldn’t a different algorithm that adds some sort of separate logic check be able to help tremendously?
Maybe, but it might not be that simple. The issue is that one would have to design that logic in a manner that can be verified by a human. At that point the logic would be quite specific to a single task and not generally useful, so the benefit of the AI is almost nil.
I guess this is a joke, but regardless: the current climate is quite different from having a 3 km thick ice sheet on the ground. This summer we were nearing 30°C/86°F on some days.
I would be very interested to know why the trend has moved away from building reactors on time and within a reasonable budget. It seems that most projects after the turn of the millennium haven’t been cost effective.
Why did we manage to build reactors well before but not now?
Doesn’t matter. Bad news at the time was enough to scare people for the next 30 years.
Your comparison between phones and VR/AR is reasonable, but a bit different: when Windows phones were discontinued, Microsoft had pretty much lost the phone OS race. Also, the Windows phones sucked. I’ve used them…
IMO Microsoft gave VR/AR a fair chance. They might have been early, but if we are, e.g., a full decade away from must-buy VR, then it might not be worth waiting.