What do hydrogen, quantum computers and satellites have in common? 🫧💻🛰️

They're all key technologies to reach climate neutrality by 2050.

Our #StrategicForesight report identifies 10 areas for action to maximise synergies between the green and digital transitions in the EU.

@EU_Commission These claims go entirely against the consensus in the field.

The projected growth in AI, blockchain and IoT will lead to a massive rise in emissions, not a reduction. And contrary to your claim, none of these technologies is essential for reducing emissions.

Quantum computing is unlikely to be mainstream by 2050 and currently shows no promise of energy efficiency; there are much more promising compute technologies. Space-based services cause emissions in the upper atmosphere, which warm those layers and make global warming worse.

Please check with experts before posting things like this. (fwiw, I am an expert in low-carbon and sustainable computing, so the green & digital transition is my area.)


@wim_v12e @EU_Commission These proposed 'solutions' seem like more tech solutionism to me. There are no points on reducing emissions from current activities either.
Do you have references to publications for the claim that this is the consensus? I'd be curious to read up on it.

@frox I was deliberately a bit vague there by not specifying "the field". There is no such consensus in the wider computing science community, because the AI, blockchain and IoT researchers want to keep on doing their thing. But in the low-carbon/sustainable computing field, there is a consensus that there is no tech fix. The best paper to my mind is this one: sciencedirect.com/science/arti
It looks at AI, IoT and blockchain.

Quantum computing is not covered but it is a red herring because nobody really sees it as a viable low-carbon technology in the next two decades.

The issue with space industry and global warming is discussed here: aip.scitation.org/doi/abs/10.1

hello I am student who has completed a thesis in AI which makes me biased and cherrypicking 

@wim_v12e @frox Wanna chip in as an incredibly disillusioned compsci AI person: language models like GPT-3 are highly centralized research computation efforts that throw grant money at compute power nobody else can access. And that's just for the specific problem of generating some convincing text. We are only just beginning to understand how much computational resource is required to make this kind of effort even *feasible*, much less what average power draw we can expect.

There are probably better approaches to natural language generation, but I'd suspect a computer scientist would have to actually engage with the knowledge base of linguistics as well (this may be a bit snarky, so please forgive me). While computer science graduates can generally tell you what a regular grammar is, there is zero chance they've read Chomsky's "On Language" or expressed any interest in actually tackling the problems of that field to build better AI. Unfortunately this general syndrome, the belief that computers can do the abstract thinking for us and have, in fact, already obsoleted every other field of science, has permeated many a student exposed to the hubris of an alumni speaker who ran git checkout on hotdog-not-hotdog.


@thufie @frox Thanks for sharing your insights!

@frox In terms of how much consensus there is on this: I was reviewing a draft whitepaper on this topic (what they call the "Twin Transition", green + digital) by the organisation grouping twenty-one of Europe's most distinguished research-intensive universities in sixteen countries, and they echo the views in that article (and, fwiw, in my article wimvanderbauwhede.github.io/ar).

@wim_v12e Does performance per Watt really not increase exponentially anymore? My main computer is now 3x as fast (when clocked down to consume the same 30W of energy) as my previous home server was. Processors in notebooks have reduced their consumption and are much better at clocking down when not needed. @frox

@ArneBab @frox Good point, thank you, I need to clarify that. It is still exponential, but less than linear on a log-log graph, because the time required to double performance per Watt gets longer and longer. The projection is that it will saturate within the next two decades. I will amend that in the article. It does not really matter, as the projected growth in computation is higher than the projected rate of improvement in performance per Watt.
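To make that last point concrete, here is a small sketch with purely illustrative numbers (the growth rates and doubling times below are assumptions for the example, not figures from the thread): if demand for computation grows at a fixed exponential rate while the doubling time of performance per Watt keeps stretching out, net energy use still rises.

```python
# Illustrative sketch; all growth rates and doubling times are assumed,
# not sourced. Energy use per year = compute demand / efficiency.

def energy_trajectory(years, demand_growth, initial_doubling, doubling_stretch):
    """Relative energy use per year, starting from 1.0."""
    demand = 1.0       # relative amount of computation demanded
    efficiency = 1.0   # relative performance per Watt
    doubling_time = initial_doubling
    trajectory = []
    for _ in range(years):
        trajectory.append(demand / efficiency)
        demand *= 1 + demand_growth
        efficiency *= 2 ** (1 / doubling_time)
        doubling_time += doubling_stretch  # efficiency gains saturate over time
    return trajectory

# Assumed: compute demand +25%/year; perf/Watt doubling time
# stretching from 2.5 years towards 7 years over two decades.
traj = energy_trajectory(20, demand_growth=0.25,
                         initial_doubling=2.5, doubling_stretch=0.25)
print(f"relative energy use after 20 years: {traj[-1]:.1f}x")
```

With these (assumed) numbers, energy use dips slightly at first while efficiency gains outpace demand, then climbs well above the starting point once the doubling time has stretched out.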

@wim_v12e Thank you! Did you take into account that electricity production might have much lower CO₂ emissions if we manage to get to renewable energy? @frox

@ArneBab @frox Yes of course, but all the projections I have found indicate that this will not happen in the next two decades (if we can get to 30% by 2040 it will be a success), and that is the time we have to reduce emissions in order to keep warming below 2 °C.
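A back-of-the-envelope check of why a 30% renewable share does not solve the problem on its own (the growth rate below is an assumption for illustration, only the 30%-by-2040 figure comes from the thread): if compute energy use keeps growing, the fossil-powered fraction in 2040 can still exceed today's total.

```python
# Back-of-envelope; the 6%/year net energy growth is an assumed,
# purely illustrative figure.
compute_energy_today = 1.0           # normalised to today's level
annual_energy_growth = 0.06          # assumed net growth of compute energy use
years = 18                           # roughly now until 2040
renewable_share_2040 = 0.30          # the 30% figure mentioned above

energy_2040 = compute_energy_today * (1 + annual_energy_growth) ** years
fossil_energy_2040 = energy_2040 * (1 - renewable_share_2040)
print(f"fossil-powered compute energy in 2040: {fossil_energy_2040:.2f}x today's total")
```

Under these assumptions the fossil-powered portion alone roughly doubles relative to all of today's compute energy, despite the renewable share.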
