I wish TLoU 1 gave you the option to sacrifice Ellie. Have an alternate ending where they find a vaccine and everyone lives happily ever after (except Ellie).
Yep. The best people will leave first because they have options. It’s called the Dead Sea effect.
Lights using an 18650 seem to be all the rage these days, at crazy cheap prices, but they all use some UI with clicks, holds, etc.
I have an Olight Seeker Pro 4 and it’s pretty simple to use. The on/off button rotates and controls the intensity. You do have to either hold it for a few seconds to turn it on or rotate the button 90° and then click, but that’s unavoidable with these kinds of flashlights.
These lights are very small and yet very powerful. That means you can easily pocket them, but because they are so powerful they also get very hot. You don’t want a flashlight like this to accidentally turn on while in your pocket. If you look at these lights, the head is almost always ribbed; it’s basically a heatsink. Even then, when you run them at full strength, they usually throttle themselves down after a few minutes to prevent overheating.
They are also very different organizations with very different goals.
NASA is focused on science: they are trying to learn as much as possible about our solar system and the universe.
SpaceX, by contrast, is focused on engineering. They aren’t trying to find life on Mars; they are trying to build the ferry service to it.
When NASA built rockets back in the ’60s, space flight was a science problem. We needed to figure out if it was even possible. Can we even get a capsule into space? Can humans survive in zero gravity? Nowadays space flight is an engineering problem. We know it’s possible, we know the math, but can we actually build those things?
I mean, you have to explicitly give permission before apps can access the camera.
How can an app turn on the camera without your consent?
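For what it’s worth, here’s roughly what that consent gate looks like in code. This is a minimal Kotlin sketch of the Android runtime permission flow (the class name and request code are made up for illustration; iOS gates camera access behind an equivalent system prompt):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class CameraPromptActivity : AppCompatActivity() {
    private val cameraRequestCode = 42 // arbitrary request code for this sketch

    fun openCameraIfAllowed() {
        val granted = ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) ==
            PackageManager.PERMISSION_GRANTED
        if (granted) {
            // Safe to start a camera session here.
        } else {
            // The OS shows the consent dialog; the app can't touch the camera until the user agrees.
            ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA), cameraRequestCode)
        }
    }
}
```

The point is that the prompt comes from the OS, not from the app, so there’s no way for the app to silently skip it.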
Meh, they are kinda disappointing. Even more tasteless than a normal Big Mac.
But they aren’t playing with their food. They’re playing with yours.
Trump Watches are intended as collectible items for individual enjoyment only, not for investment purposes. The images shown are for illustration purposes only and may not be an exact representation of the product.
So they are absolute trash.
It’s probably his handlers telling him to refuse.
Fat binaries contain both ARM and x86 code, but I was referring to Rosetta, which is used for x86-only binaries.
Rosetta translates x86 to ARM, both ahead of time and just in time. It translates to normal ARM code; the only dependency on an Apple-specific custom ARM extension is that the M-series processors have a special mode that implements x86-like strong memory ordering. This means Rosetta doesn’t have to figure out where to place memory barriers, which allows for much better performance.
So when running translated code Apple Silicon is basically an ARM CPU with an x86 memory model.
It’s transpiling the x86 code to ARM on the fly. I honestly would have thought it wasn’t possible.
Apple’s been doing it for years. They try to do ahead-of-time translation wherever they can, but they also do it on the fly for things like JITed code.
By now this has to qualify as elder abuse.
I can imagine wanting to learn a newer, more modern language than python.
And yet, I’ve never run into RAM problems on iPhones, either as a user or as a developer. On iOS an app can use almost all of the RAM if it needs to, as long as it is running in the foreground. Android, by contrast, is much stingier with RAM, especially for Java/Kotlin apps. There are hard limits on how much RAM an app can actually use, and it’s a small fraction of the total amount. The actual limit is set by the manufacturer and differs per device; Android itself only guarantees a minimum of 16MB per app.
The reason is probably that Android is much more lenient about letting stuff run in the background, so it needs to limit per-app memory usage.
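If you want to see the limit your particular device grants, you can query it. A minimal Kotlin sketch (the helper function name is mine):

```kotlin
import android.app.ActivityManager
import android.content.Context

// Logs the per-app Java heap limits Android grants on this device.
fun logHeapLimits(context: Context) {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    // Standard per-app limit in MB, chosen by the device manufacturer.
    val standard = am.memoryClass
    // Bigger limit, available only if the app opts in with android:largeHeap="true" in its manifest.
    val large = am.largeMemoryClass
    println("standard heap limit: $standard MB, largeHeap limit: $large MB")
}
```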
Those apps also use more RAM than an equivalent iOS app, simply because they run on a garbage-collected runtime. With a GC there is a trade-off between performance and memory usage. A GC always wastes some memory, because memory isn’t freed immediately once it’s no longer in use; it’s only freed when the GC runs. If you run the GC very often you waste little RAM at the cost of performance (all the CPU cycles the GC burns); if you run it at large intervals you waste a lot of RAM (because you let a lot of ‘garbage’ accumulate before cleaning it up). In general, to achieve performance similar to non-GC’d code you need to tune it to use about 4 times as much RAM. The actual overhead depends on how Google tuned the GC in ART combined with the behavior of specific apps.
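You can see that accumulation effect in a toy Kotlin/JVM sketch (exact numbers will vary with the runtime and GC settings):

```kotlin
fun main() {
    val rt = Runtime.getRuntime()
    fun usedMb() = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024)

    println("before allocations: ${usedMb()} MB")

    // Allocate lots of objects that become garbage immediately. They stay resident
    // on the heap until the collector decides to run.
    repeat(200_000) {
        ByteArray(256) // instantly unreachable once this iteration ends
    }
    println("after allocations (garbage still resident): ${usedMb()} MB")

    // System.gc() is only a hint, but on most runtimes it triggers a collection here.
    System.gc()
    println("after a collection: ${usedMb()} MB")
}
```

Running the collector more aggressively would keep that middle number low, but you’d pay for it in CPU time, which is exactly the trade-off.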
Note that this only applies to apps running in ART; many system components like the web browser are written in C++ and don’t suffer from this inefficiency. But it does mean Android uses more RAM than iOS while at the same time giving apps less RAM to actually use.
It basically comes down to different architectural choices made by Google and Apple.
It’s not hard to target the older models; with iOS it’s mostly just a few small tweaks.
It depends what you are doing. Targeting the iPhone 7’s GPU can be quite a PITA.
Upgrade your dinosaur of a phone.
Doesn’t matter either way because everyone uses WhatsApp anyway.
RCS will never be able to compete with either because it’s a GSMA standard. Apple or Meta can think of a cool new feature, add it to their client and roll it out to all their users with the next update.
If they want to add a new feature to RCS, the GSMA (an organization with over 1,500 members) has to form a committee, which can then talk about its conflicting interests for a few years before writing down a new version of the standard. Then dozens of clients and servers at hundreds of different operators need to be upgraded before everyone can use the new feature. Due to this bullshit, RCS will never be able to keep up.
Why always the knee-jerk anti-Apple reaction, even when they do something good?
FYI: Apple isn’t telling anyone they invented this. In fact, they didn’t even tell anyone about this feature and declined to comment after it was discovered and people started asking questions.