
Permacomputing

uuu
Saw it on Hacker News and thought I would share:

Permacomputing is a radically sustainable approach to computing inspired by permaculture. The term was originally coined and described by Viznut in July 2020.

Permacomputing asks the question whether we can rethink computing in the same way as permaculture rethinks agriculture. Is there even a place for high technology (such as computing) in a world where human civilizations contribute to the well-being of the biosphere rather than destroy it? Permacomputing wants to imagine such a place and take steps towards it. It is therefore both utopian and practical.

Permaculture
Permaculture is the science and practice of creating semi-permanent ecosystems of nature. The resilience of any such ecosystem is equal to its diversity and interconnectedness. Permaculture design is a system of assembling conceptual, material and strategic components in a pattern which functions to benefit life in all its forms. It seeks to provide a sustainable and secure place for living things on this earth.

Computing
A technology that depends on a wasteful use of finite resources can hardly be permanent. This is why a radical reduction of that wastefulness is a major concern to us: maximize the hardware lifespans, minimize the energy use. And this is not just about a set of technical problems to be fixed – the attitudes also need a radical turn.

Understandability is a form of beauty, virtual does not mean immaterial and doing things with less is not a return to the past.

from the Permacomputing Wiki: https://permacomputing.net/

which is a fairly new site.

What do you think about this idea? Do you have any technological practices that you consider compatible with permacomputing?
 
It seems like nonsense feel-good moralising rhetoric.

Chips and power are in short supply. One could simply say that technological progress will be slower in the medium term, especially if China takes Taiwan. Instead, they come up with permacomputing.
 
What do you think about this idea?
I can't see that it's practical. Computing technology is evolving at an incredible pace - we've become used to that, but it would be bewildering to people of long ago. I don't mean just the technology itself, but the speed of sustained innovation. It isn't just the hardware that's evolving, but the software too. Even if everything were modularised, it's hard to imagine the hardware in first-generation cell phones being used in modern smartphones, for example. They used analogue telephone technology, were the size of a brick, and couldn't do anything but voice communication. That wasn't all that long ago.

I do wonder if more could be done to ensure the parts are recyclable, though, if the industry were forced into it. The car industry is going electric because of that sort of pressure, and that wasn't thought feasible until quite recently.
 
Interesting concept. I've been thinking a lot about sustainable computing since Beeple sold an NFT for $69 million. NFTs and crypto are not sustainable. There has to be a way to merge these kinds of ideas, even if permacomputing seems like an unrealistic concept right now.
 
I think we are already moving in this direction; it's just a matter of a few breakthroughs in technology and of those being disseminated throughout the field of computing.
Granted, these breakthroughs could happen next year or in a hundred years - hard to say.
 
Like any ideology, I think that permacomputing makes some good points, and that as long as you treat it as an overall compass rather than following every principle religiously it could have some merit. For example, @John K mentions the incompatibility between modern cell phones and the earliest cell phones of 25–30 years ago, and while it would be impossible to use one of those first-generation phones, there are many people using phones that are 5–10 years old with no issue.

I also think the major advances in technology have, at least in the past 5 years or so, occurred mainly at the software level. Moore's law (the observation that transistor counts double roughly every two years) has slowed down, but new code optimization tools like LLVM have gotten really, really good, as has the ability of code to take advantage of not-so-new hardware features like SIMD. If you are comfortable with a bit of tinkering, a mainstream laptop from 2010–12 can be perfectly usable today if you remove bloatware (and ideally install a Linux distro).
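To make the SIMD point concrete, here is a minimal sketch (my own illustration, not anything from the permacomputing wiki) of the kind of loop that clang, the LLVM-based C compiler, will auto-vectorize at -O3. SSE2 dates to 2001 and AVX to 2011, so even that 2010–12 laptop already has the SIMD units this speedup relies on:

/* saxpy.c - a loop LLVM will auto-vectorize.
 * Build: clang -O3 -march=native -Rpass=loop-vectorize -c saxpy.c
 * The -Rpass flag makes clang report each loop it vectorized. */
#include <stddef.h>

void saxpy(float a, const float *x, float *restrict y, size_t n)
{
    /* Independent iterations with no aliasing (restrict lets the
     * compiler prove it), so clang can emit SIMD instructions that
     * process 4-8 floats per step instead of one. */
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

The point is that nothing in the source mentions SIMD at all: a modern compiler extracts the parallelism from old hardware for free, which is exactly the software-side gain I mean.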

But we might be on the cusp of new hardware revolutions that will require everyone to buy new hardware again, with things like quantum computing and specialized processor architectures for neural networks. So I'm not sure.