• 16 Posts
  • 77 Comments
Joined 9 months ago
Cake day: September 29th, 2023

  • The issue isn’t emissions, it’s costs. Sadly we don’t live in a dream world, and everything has a cost.

    Even turning excess production into hydrogen has costs (transport, storage, infrastructure…).

    The current tech (not taking into consideration the new tech currently in testing) being highly inefficient creates many cost issues.

    Less efficient means more power has to be used to get the same amount of hydrogen, reducing the gains from the electricity surplus.

    The storage being inefficient means higher running costs and more space used…

    The transport being inefficient also increases the running costs, and the emissions too if the transport uses fossil fuel. If it uses hydrogen, well, it increases the running cost even more: that expensive produced hydrogen gets burned on transport…

    The electricity production from hydrogen being inefficient increases the amount of hydrogen used to get the same amount of energy, which then increases the costs because more of that expensive hydrogen has to be used.

    So taking all this into account, being “clean” doesn’t necessarily make it viable compared to other storage or energy production tech.

    The costs have to be taken into account because resources don’t appear magically.

    Mining uranium has a cost. Buying it from abroad has a cost, paying people to maintain all that has a cost…
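    The compounding losses described above can be put into rough numbers. A minimal sketch, where every efficiency figure is an assumption for illustration (not a claim about any specific plant or electrolyser):

```python
# Rough round-trip sketch for hydrogen storage. All efficiency numbers
# below are illustrative assumptions, not measured values.
def usable_energy(surplus_kwh, electrolysis_eff=0.7, storage_eff=0.9, fuel_cell_eff=0.6):
    """Energy recovered after converting surplus electricity to hydrogen and back."""
    return surplus_kwh * electrolysis_eff * storage_eff * fuel_cell_eff

recovered = usable_energy(100.0)  # start with 100 kWh of surplus electricity
print(f"Recovered: {recovered:.1f} kWh of the original 100 kWh")
```

    With those assumed efficiencies, over 60% of the surplus is lost before it comes back as electricity, which is exactly why each inefficient step multiplies the cost.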



  • Well, the issue with renewable power like wind and solar is that it’s not stable.

    Having a battery in order to store the energy and release it when the demand is higher than production is one part of the solution.

    But what happens when there wasn’t enough solar and wind to replenish the batteries, and the batteries can’t cover the demand? Power shortages, which are pretty bad to get.

    One of the solutions to this is natural gas, for a simple reason: it’s very fast to start or stop generating power. It’s also not very expensive, at least when there isn’t a war… The CO2-equivalent emissions aren’t as high as coal’s either.

    Nuclear power, on the other hand, is very hard to stop. Having a surplus of power on the grid is also very bad. Some of it could be used to recharge the batteries, but there would be some loss at some point.
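    The battery-plus-gas-backup logic above can be sketched as a toy dispatch loop. All numbers (capacity, generation, demand) are made up for illustration:

```python
# Toy grid dispatch: charge the battery with surplus renewables, drain it
# on deficit, and fall back to fast-start gas only when the battery runs dry.
# Capacity and time-step values are illustrative assumptions.
def dispatch(generation, demand, battery_kwh, capacity_kwh=50.0):
    """Return (new battery level, gas backup needed) for one time step."""
    surplus = generation - demand
    if surplus >= 0:
        # Charge with the surplus; anything beyond capacity is curtailed/lost.
        return min(battery_kwh + surplus, capacity_kwh), 0.0
    deficit = -surplus
    from_battery = min(battery_kwh, deficit)
    # Whatever the battery can't cover must come from gas plants.
    return battery_kwh - from_battery, deficit - from_battery

level, gas = 20.0, 0.0
for gen, dem in [(60, 40), (10, 50), (0, 45)]:  # sunny, cloudy, night
    level, needed = dispatch(gen, dem, level)
    gas += needed
print(level, gas)
```

    The last step shows the shortage case: once the battery is empty, every remaining kWh of demand has to come from the dispatchable backup.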






  • One of your questions doesn’t seem to be that based?

    “shits on Linux gamers”: are you talking about the store not being available on Linux? Meh, I already have Heroic, which is better.

    Their Easy Anti-Cheat is available through Proton though; it’s on the game dev to choose whether to enable it or not (and I understand why they don’t do it for Fortnite: the Linux market is pretty small, but also because the game is so huge that hackers wouldn’t hesitate a bit to switch to Linux in order to hack with custom kernels).


  • Nah, the game itself is utter trash, not just the bugs. Let’s look at 3 very hyped games:

    • Redfall: the game had game-breaking bugs and performance issues at launch. The gameplay was bad. No one played it.

    • Gollum: the game had game-breaking bugs and performance issues at launch. The gameplay was destroyed by bugs. The studio closed its gaming branch.

    • Starfield: very hyped, bought by a lot of people, but the game looks like a 2010–15 game with a few small 2023 enhancements…

    Redfall and Gollum were failures. High-budget failures. They most likely laid off people.

    Starfield: Microsoft laid off people at the start of the year https://www.polygon.com/23561210/microsoft-layoffs-xbox-bethesda-halo-infinite-343-industries for who knows why. The game got delayed, and then it came out to very mixed reception due to bad exploration gameplay and no love put into population design (crowd characters look like 2010, or even worse).

    All 3 of these games were very hyped, with a high price, but none of their failures have anything to do with gamers’ “fault” or “opinion”. It’s entirely the studios’ fault for not delivering something good.







  • Unreal Engine is pretty bad for open maps. It generates a lot of CPU usage when changing zones. And heavy textures and other heavy elements don’t enhance the experience.

    As for the VRAM, I’m not sure what your question is about.

    VRAM is special RAM (much higher bandwidth, but slightly higher latency than CPU RAM; it also supports some special extras) mounted on the board of the graphics card.

    It is necessary because it stores textures and other game elements the graphics card needs to run the game (shadow info, …). These elements are loaded into VRAM from the comparatively very slow drive (even NVMe 5.0 SSDs are extremely slow compared to VRAM or RAM) so the GPU can process whatever it has to do. Background tasks, windows, and the desktop also use VRAM so that app windows and the desktop can be displayed, so the total amount available to the game can vary.

    If there isn’t enough VRAM, multiple things can happen (I’m talking about textures, but VRAM holds other things too):

    • Resizable BAR (or SAM on AMD) is not enabled: the GPU will not be able to load all the textures, so it will either have missing textures or lag a lot due to texture swapping. Textures can also take a long time to load instead of being completely missing, depending on the game’s optimisation, due to swapping with previous textures. The game can even crash.

    • Resizable BAR is enabled: with this PCIe configuration it is possible for the GPU to access system memory. So in some cases, textures may spill into system memory (CPU RAM), which isn’t great either, because system memory has much higher latency to the GPU (it has to go through the CPU, the PCIe slot, …) and much lower bandwidth. And so it generates lots of lag.

    If a game is well optimized, the lower the settings, the lower the VRAM usage. Some games, however, don’t have such great optimisation. VRAM usage mostly depends on texture quality and resolution (increasing texture quality uses a negligible amount of extra GPU power, but increases VRAM usage).

    There is also a baseline the devs may set for optimisation. The less VRAM there is, the less data the textures can have available to use, so more and more compromises have to be made, with less and less quality. So fixing a baseline quality based on the most common VRAM capacity is not that bad, though it does cause issues for people with less available.
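    To get a feel for why texture quality dominates VRAM usage, here is a back-of-the-envelope size estimate. The formula is the standard width × height × bytes-per-pixel calculation; the 4 KiB-texture example and the uncompressed RGBA assumption are illustrative (real engines use block compression like BCn, which shrinks this several times over):

```python
# Back-of-the-envelope texture VRAM estimate. Assumes uncompressed RGBA
# (4 bytes per pixel); real games use block-compressed formats, so actual
# sizes are smaller. A full mip chain adds roughly 1/3 on top of the base.
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one texture, in bytes."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4096x4096 RGBA texture with mips:
size_mib = texture_bytes(4096, 4096) / (1024 * 1024)
print(f"{size_mib:.1f} MiB per texture")
```

    At roughly 85 MiB per uncompressed 4K texture, it only takes a scene’s worth of high-quality textures to fill a smaller VRAM budget, which is where the swapping and stutter described above start.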



  • A big issue with recent games is VRAM usage (the GPU has VRAM). If you don’t have enough VRAM, the game will stutter. The moment there isn’t enough VRAM, even by a tiny bit, the game will stutter.

    Another issue is RAM and CPU utilisation, which in some games is pretty extreme.

    Another issue can be very heavy graphics with badly optimized lower settings.

    Some games also have transition stutter where you change zones. The game tries to load the new zone and unload the previous one, but that uses CPU power and requires a fast SSD, depending on the size of what has to be loaded.
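    The usual mitigation for that transition stutter is streaming assets on a background thread so the render loop never blocks on disk I/O. A minimal sketch of the idea, with made-up asset names and a `time.sleep` standing in for slow loads:

```python
# Sketch of background asset streaming: the "render loop" queues load
# requests and keeps going, while a worker thread does the slow I/O.
# Asset names and the sleep-based fake I/O are illustrative only.
import queue
import threading
import time

load_requests = queue.Queue()
loaded_assets = []

def streaming_worker():
    """Drain load requests in the background so frames never block on I/O."""
    while True:
        asset = load_requests.get()
        if asset is None:       # sentinel value: shut the worker down
            break
        time.sleep(0.01)        # stand-in for a slow disk read
        loaded_assets.append(asset)

worker = threading.Thread(target=streaming_worker)
worker.start()
for name in ["zone2_terrain", "zone2_textures", "zone2_props"]:
    load_requests.put(name)     # request and return immediately; no frame stall
load_requests.put(None)
worker.join()
print(loaded_assets)
```

    The stutter shows up when a game does the equivalent of the worker’s `time.sleep` directly on the render thread instead.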