I remember booting up Debian into an X11 session on a laptop with only 8 MB of RAM.
(This would have been circa 2000, and I think I had to try a few different distros before finding one that worked. Also I don't think I did anything with it beyond Xterm and Xeyes.)
Ran Linux on an 8 MB 486 in the 90s. X ran in 256-color mode and twm or mwm were the window managers. It was so hard to use, though. You had to set up modeline settings for your monitor in a text file and could theoretically damage it with the wrong inputs. Programming X? Fuggedaboutit - I came from Turbo/Borland MS-DOS land where everything was neatly documented and designed with clear examples to make programming easy. I was lucky to get an X program to even compile. Hard to find books back then, pre-Amazon. The xv image viewer was probably the only thing I used X for. Actually used the machine most of the time in the text-mode terminals via the Alt-function keys, with lynx as a browser (before JavaScript... gopher was becoming obsolete at that point, though FTP was still popular) and a random assortment of svgalib programs for any graphical stuff. Still, there was something magical about seeing that black-and-white check pattern come up and the little X mouse cursor appear... like there were... possibilities.
By then that would already have been something of an anachronism. 8 MiB of RAM was workable (though only barely with X11) in the early nineties; by the late nineties 64 MiB or more was common.
My first PC had 16 MB of RAM, and it obviously became too slow to be usable later on. I remember having to wait around a minute for Fallout to load a level, which you had to do fairly frequently.
I remember buying a bulky external 2 MB RAM expansion (I think I later bought another 2 MB) for my Amiga 500, which was already running a full desktop OS on 512 KB of 'Chip memory'; I mostly used the extra RAM as a tmpfs-style RAM disk to speed up loading. That was the beginning to mid 90s, I guess. But running NetBSD on the Amiga meant that even then you needed 16 MB of RAM, a CPU with an MMU, and an HDD (my friend across the street did that with his A1200, I think I remember). You would only do it if you wanted networking beyond BBSes, I guess.
I don't know how resolution maps to RAM in X11, but I assume at least one byte per pixel. Based on that assumption, there's no chance you'd even be able to power a 4K monitor with 8 MB of RAM, let alone the rest of the system.
Correct, 4k is very modern by these standards. But then I'm old, so perhaps it's all about perspective.
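For anyone who wants to check that intuition, here's a rough back-of-the-envelope sketch (a single uncompressed framebuffer and nothing else; real setups add more):

```python
# Rough framebuffer sizes, assuming one uncompressed buffer and nothing else.
def framebuffer_mib(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 * 1024)

# 90s-era modes: an 8-bit (256-colour) screen fits in a few hundred KiB.
print(f"800x600  @ 1 B/px: {framebuffer_mib(800, 600, 1):.2f} MiB")     # ~0.46 MiB
print(f"1024x768 @ 1 B/px: {framebuffer_mib(1024, 768, 1):.2f} MiB")    # ~0.75 MiB

# A 4K panel at even 1 byte per pixel already eats nearly the whole 8 MB machine.
print(f"3840x2160 @ 1 B/px: {framebuffer_mib(3840, 2160, 1):.2f} MiB")  # ~7.9 MiB
print(f"3840x2160 @ 4 B/px: {framebuffer_mib(3840, 2160, 4):.2f} MiB")  # ~31.6 MiB
```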
Back in the days when computers had 8 MB of RAM to handle all that MS-DOS and Windows 3.1 goodness, we were still in VGA [0] and SVGA [1] territory, and the graphics cards (sorry, integrated graphics on the motherboard?! You're living in the future there, that's years away!) had their own RAM to support those resolutions and colour depths.
Of course, this is all for PCs. By the mid-1990s you could get a SPARCstation 5 [2] with a 24" Sun-branded Sony Trinitron monitor that was rather more capable.
[0] Maxed out at 640 x 480 in 16-colour from an 18-bit colour gamut
[1] The "S" is for Super: 1280 x 1024 with 256 colours!
[2] https://en.wikipedia.org/wiki/SPARCstation_5
This was the main driver of video-card memory size for a time: if you spent money on a 2 MB card instead of a 1 MB one, you could have higher resolution or greater bit depth.
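A quick sketch of that tradeoff (it only counts the visible framebuffer; real cards also set VRAM aside for cursors, fonts, and off-screen surfaces, so take the exact cutoffs with a grain of salt):

```python
# Which resolution/depth combinations fit into a 1 MB vs. 2 MB video card?
# Counts only the visible framebuffer, ignoring any VRAM the card reserves.
MODES = [(640, 480), (800, 600), (1024, 768), (1280, 1024)]
DEPTHS = {1: "256 colours", 2: "high colour (16-bit)", 3: "true colour (24-bit)"}

for vram_mb in (1, 2):
    budget = vram_mb * 1024 * 1024
    print(f"--- {vram_mb} MB card ---")
    for width, height in MODES:
        fits = [name for bpp, name in DEPTHS.items() if width * height * bpp <= budget]
        print(f"{width}x{height}: {', '.join(fits) if fits else 'does not fit'}")
```

Which roughly matches how it played out: a 1 MB card topped out around 1024x768 in 256 colours, while 2 MB bought you 1024x768 in high colour or 1280x1024 in 256 colours.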
If you had a big enough framebuffer in your display adapter, though, X11 could display more than your main RAM could support: in the "classic" design, the X server drew directly into framebuffer memory (just like GDI did).
It is now, but back then it was 1 byte, with typical resolutions being 800x600. There were high-color modes but for a period it was rare to have good enough hardware for it.
I have run X11 in 16-color and 256-color mode, but it was not fun. The palette would get swapped when changing windows, which was quite disorienting. Hardware that could do 16-bit color was common by the late 90s.
Fun thing: SGI specifically used 256-color mode a lot to reduce memory usage, even on 24-bit outputs. As long as you used the defaults of their Motif fork, everything that didn't specifically request more colors would use 256-color visuals, which were then composited in hardware.
If someone wants really low RAM consumption for a desktop, they should try out Tiny Core Linux; I have run the whole system in under 20-25 MB of RAM using its most minimal option.
It's truly the most minimalist GUI option out there. It uses FLWM and, IIRC, their own very minimalist Xorg server, but most apps usually work.
The one issue I have is that I can't copy-paste text or do some simple things like selecting text with the mouse, but aside from that, Tiny Core Linux is pretty good.
Can your "one issue" be tweaked by adding more RAM and allocating it thusly?
I'm using Void with 24gb ddr5 and frequently get system freezes during high productivity. Browser tabs in the background are often contributors, but working with openshot or odb crashes often.
I have several old nuc's and I might try tinycore on one. What do you or most others use it for, primarily?
I am not sure how my one issue can be fixed. It seems to be fundamentally an issue with their minimalist Xorg server itself, but I am pretty sure there must be a way.
> I'm using Void with 24gb ddr5 and frequently get system freezes during high productivity. Browser tabs in the background are often contributors, but working with openshot or odb crashes often.
Kdenlive's pretty good, for what it's worth. I use Arch Linux/CachyOS on an 8 GB system, and browser tabs aren't often a problem here, at least.
> I have several old nuc's and I might try tinycore on one. What do you or most others use it for, primarily?
I used it to revive my 15-year-old laptop and even ran a complete modern Firefox on it (it's a simple mini laptop with 1 GB of RAM and a 32-bit CPU), along with WiFi and pomodorokitty, and I can sort of treat it as a second monitor.
Its battery is removable, so I am going to replace it. Currently the setup takes time to install, and I have to reinstall it every time it shuts down, which can happen quite a lot if I don't have it plugged in, so it has been shut down for over a month now. But I really liked the tinkering I did when I ran pomodorokitty on it.
It used to be like that: computers had limited resources and desktop environments were light. Then at some point RAM became less and less of an issue, and everything started to get bigger and less efficient.
Could anyone summarize why a desktop Windows/macOS now needs so much more RAM than in the past? Is it the UI animations, color themes, shades, etc., or is it the underlying operating system that has more and more features, services, and so on?
I believe it's the desktop environment that is greedy, because one can easily run a Linux server on a Raspberry Pi with very limited RAM, but is that really the case?
The web browser is the biggest RAM hog these days as far as low-end usage goes. The browser UI/chrome itself can take many hundreds of megabytes to render, and that's before even loading any website. It's becoming hard to browse even very "light" sites like Wikipedia on less than a 4 GB system at a bare minimum.
> Could anyone summarize why a desktop Windows/macOS now needs so much more RAM than in the past
Just a single retina screen buffer, assuming something like 2500 by 2500 pixels at 4 bytes per pixel, is already 25 MB for a single buffer. Then you want double buffering, but also a per-window buffer, since you don't want to force redraws 60 times per second and we want to drag windows around while showing their contents, not a wireframe. As you can see, just that adds up quickly. And that's only the draw buffers, not counting all the different fonts in simultaneous use, the images being shown, etc.
(Of course, screen buffers are typically stored in VRAM once drawn. But you need to draw them first, which happens at least in part on the CPU.)
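To make that concrete, here's a rough tally under the same assumptions (2500x2500 screen, 4 bytes per pixel) plus a made-up set of open windows; the window sizes are purely illustrative:

```python
# How screen and per-window buffers add up on a hi-DPI display.
# The 2500x2500 screen and 4 bytes/pixel follow the comment above;
# the window sizes below are made up for illustration.
BYTES_PER_PIXEL = 4
screen = 2500 * 2500 * BYTES_PER_PIXEL    # one full-screen buffer: 25 MB
double_buffered = 2 * screen              # front + back buffer: 50 MB

# Hypothetical open windows (width, height): browser, editor, two terminals.
windows = [(2400, 2300), (1800, 2000), (1200, 900), (1200, 900)]
window_buffers = sum(w * h * BYTES_PER_PIXEL for w, h in windows)

total = double_buffered + window_buffers
print(f"screen buffers:      {double_buffered / 1e6:.0f} MB")
print(f"per-window buffers:  {window_buffers / 1e6:.0f} MB")
print(f"total, buffers only: {total / 1e6:.0f} MB")  # before fonts, images, ...
```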
You don't need to do all of this, though. You could just do arbitrary rendering using GPU compute, and only store a highly-compressed representation on the CPU.
Yes, but then the GPU needs that amount of RAM, so it's fairer to look at the sum of RAM + VRAM requirements. With compressed representations you trade CPU cycles for RAM; saving laptop battery is better served by copious amounts of RAM (since it's cheap).
> is it the UI animations, color themes, shades, etc., or is it the underlying operating system that has more and more features, services, etc.?
...all of those and more? New software is only optimized until it is not outright annoying to use on current hardware. It's always been like that, and that's why there are old jokes like:
"What Andy giveth, Bill taketh away."
"Software is like a gas, it expands to consume all available hardware resources."
"Software gets slower faster than hardware gets faster"
...etc..etc... variations of those "laws" are as old as computing.
Sometimes there are short periods where the hardware pulls a little bit ahead for a few short years of bliss (for instance the ARM Macs), but the software quickly catches up and soon everything feels as slow as always (or worse).
That also means that the easiest way to a slick computing experience is to run old software on new hardware ;)
Indeed. Much of a modern Linux desktop runs inside one of multiple not-very-well-optimized JS engines: GNOME uses JS for various desktop interactions; all major desktops run a different JS engine as a different user to evaluate polkit authorizations (so exactly zero RAM can be shared between those engines, even if they were identical, which they aren't); and half your interactions with GUI tools happen inside browser engines, either directly in a browser or indirectly via Electron. (And typically, each Electron tool bundles its own slightly different version of Electron, so even if they all run under the same user, each is fully independent.)
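If you want to see that duplication on your own machine, here is a rough sketch (Linux-only; it assumes /proc/<pid>/smaps_rollup is available and that matching the command line on a substring like "electron" is good enough) that sums the proportional set size (PSS) of matching processes; shared pages would lower the number, while duplicated runtimes keep it high:

```python
#!/usr/bin/env python3
# Sum PSS (proportional set size) for processes whose command line contains a
# given substring, e.g. "electron". Linux-only; needs /proc/<pid>/smaps_rollup
# (kernel 4.14+) and sufficient privileges to read other processes' entries.
import sys
from pathlib import Path

def pss_kib(proc: Path) -> int:
    try:
        for line in (proc / "smaps_rollup").read_text().splitlines():
            if line.startswith("Pss:"):
                return int(line.split()[1])   # reported in KiB
    except OSError:
        pass                                  # process gone or not readable
    return 0

needle = (sys.argv[1] if len(sys.argv) > 1 else "electron").lower()
total_kib = 0
for proc in Path("/proc").glob("[0-9]*"):
    try:
        cmdline = (proc / "cmdline").read_bytes().replace(b"\0", b" ").decode(errors="replace")
    except OSError:
        continue
    if needle in cmdline.lower():
        total_kib += pss_kib(proc)

print(f"Total PSS for processes matching '{needle}': {total_kib / 1024:.0f} MiB")
```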
Or you can ignore all that nonsense and run openbox and native tools.
Which makes it baffling that they chose it. I remember there being memory leaks because GObject uses a reference-counting model: cycles that went from GObject into JS and back were impossible to collect.
They did hack around this with heuristics, but they never did solve the issue.
They should've stuck with a reference counted scripting language like Lua, which has strong support for embedding.
A month with Crunch Bang Plus Plus (which is a really nice distribution based on Openbox) and you'll appreciate how quick and well put together Openbox and text based config files are.
Exactly. The issue today is that even if you optimize your OS and DE to be very memory efficient, it matters very little as soon as you open a modern web browser. And without a modern web browser a big part of the online experience is broken.
Eh, kinda. Work forces me to have Jira, Confluence, Gitlab, Copilot, the other Copilot formerly known as Outlook, the other other Copilot formerly known as Teams, as well as Slack of course, and a dozen other webslop apps open… and it still all fits in <8GB RAM.
Which is a lot worse than the <1GB you'd get with well-optimized native tools, but try running Win11 with "only" 8GB RAM.
At the end of the post there is a comparison of RAM usage across different desktop environments, and the used RAM is reported differently by every tool. So what exactly is being measured here as "used" RAM?
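That's a fair question; tools mostly disagree about how page cache and reclaimable memory are counted. A small sketch (Linux-only, reading /proc/meminfo) showing two common definitions side by side; which one the blog's numbers correspond to is anyone's guess:

```python
# Two common definitions of "used RAM", computed from /proc/meminfo (Linux-only).
def meminfo_kib():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])   # values are reported in KiB
    return info

m = meminfo_kib()

# Naive "used": total minus free minus obvious buffers/page cache.
naive_used = m["MemTotal"] - m["MemFree"] - m["Buffers"] - m["Cached"]

# Alternative: count everything the kernel's MemAvailable estimate says could
# not be reclaimed without swapping; MemAvailable-based tools report this.
available_used = m["MemTotal"] - m["MemAvailable"]

print(f"naive used:              {naive_used / 1024:.0f} MiB")
print(f"MemTotal - MemAvailable: {available_used / 1024:.0f} MiB")
```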
Could say the same thing about why it's in the blog post.
You don't have to care at all. It's just an odd blog post that jumps from a technical intro to a rant about DEI and censorship and back to technical details. And joecool1029 just provides more context for what was said in the blog post.
About Nemo (Fran J. Ballesteros from Plan 9/9front): he has half an excuse, since he grew up (for sure) under the Francoist regime, probably on the wealthy side of society, and thus had to swallow tons of literal far-right ideology even at school (Franco's regime). But on the point of being a Covid conspiracy theorist... I would expect more sanity from a guy perfectly capable in algorithmics, math, and, by proxy, science. Echo chambers create these kinds of idiots even out of really smart people (the far right in Spain used cult-like mechanics too), and I'm sure Fran has changed a bit over time for the better.
On the Cosmopolitan/APE person: I remind you that if you want to go back to Renaissance times, I'm a Spaniard, and thus your whole ideology pales against the Iberian Humanism of the School of Salamanca, where at the time we were the enlightened ones and you were just a bunch of uneducated WASP hicks living in filthy villages in the middle of Europe.
Back to 9intro: even if you dislike ~nemo, it's still worth using to learn programming on 9front; it's a great book to share and learn from. It would be a waste to ditch it just because some old fart hasn't moved with the times.
EDIT: OK, now I see ~nemo is not that old, so plausible indoctrination under Francoism wouldn't apply there; but I'm pretty sure being a Covid conspiracy theorist doesn't look like normal socialization either.
Seriously, what's with people's love of this guy? Politics aside, I have not seen anything that suggests engineering prowess from him, only "Rust bad".
People like his technical opinion because they like his politics. That’s the whole grift-influencer economy. If someone is good at one thing (and validates some of my views), then obviously he’s right about everything.
When people feel underrepresented to the point of being bullied they turn to any voice which seems to reflect even a tiny fraction of their frustrations.
There's a real mean spirit in open source lately, and a lot of it seems to revolve around political views. There's now this idea that if you and I disagree on politics, then it would be impossible for us to write quality software together. It's damaged a lot of the goodwill and cohesion that used to exist within the open source software community.
This used to be about making free software for people so that they weren't abused by corporations. Now it's about pushing agendas and creating exclusion criteria. There's only one group in this scenario that benefits from that outcome.
If you don't like Lunduke, then you should recognize the factors that give rise to people like him. Unless your solution is to completely eliminate anyone who disagrees with you, your apparent mindset only furthers the problem.
I wish we could put all this aside and just enjoy open source again.
Don't present your hypothesis as a hard fact. I actually think it is completely false. Not only was I never interested in his political opinions; I followed him for his humorous takes like "Linux Sucks", not for anything about Rust or whatever. I actually never encountered a single video, before joining his "Lunduke Journal", where his right-wing views were visible.
He has made funny videos, and they were fun to watch. It's kinda hard to enjoy them now after learning he's dumb as a rock and justifies killings if you are of the wrong nationality.
Skilled enough, but the main use is as a news source, like this. The guy in the blog would not have found out about this unless Lunduke had posted about it.
The maker of the provocative "Linux sucks" series is a bit of a troll.
He's made videos on technical projects he doesn't understand (or care about) and just mocks them if they don't gel with him.
As far as I can tell he doesn't really care, or if he thinks he does, his actions aren't conveying it well.
How do I know? As a FOSS developer myself with a decade plus public history I also happen to know a few people running prominent FOSS projects.
He's burned bridges for no good reason. He doesn't care.
I have no idea who he is; I'd never heard of him. You shouldn't judge a book by its cover, but... he is making it hard. His video titles include:
* Devuan: The Non-Woke Debian Linux Fork (Without Systemd)
* NeoFetch But in Rust and More Gay
* Chimera Linux is "Here to Further Woke Agenda by Turning Free Software Gay"
* Are Jews the Cause of DEI in Big Tech?
Yeah... I did not watch a single video of his. But just from a few short seconds, it's not anything I want to invest time in to find out whether he has a point or not. Life is too short.
Whatever I might agree or disagree with, this is annoying to look at, but his stuff keeps coming up in my YouTube feed. Even if it looks slightly interesting, I know there will be some rant involved about something unrelated to technology, just a developer's personal opinions on non-tech ideas. I get it - people are horrible! Sheesh!
FWIW, probably not much, he said he had a Jewish background ... in, like, the one video I watched and eventually gave up on.
What's especially strange to me is that in the more distant past he was a pretty normal guy - at least as normal as any other Linux user. Heck, he had a super great podcast (Linux Action Show).
Something changed in the 2014-ish timeframe, when he got more and more politically extreme.
Xlibre has no CoC. Allowing such people to enter our hallowed halls would be a perverse degradation of these storied institutions. The CoC-less are quite obviously not ready to do serious work, or they would have a CoC.
That was probably around 2010 or 2015. Those images had to run on a thin client with 512 MB RAM. I think I chose XFCE as the DE. Then again, the X desktop was really minimal and I would use them mostly to code in C using a terminal.
With either 4 MB or only a 386 CPU, it was definitely crippled, making an upgrade not worthwhile.
https://youtu.be/Pw2610paPYM?t=72
But most 386s didn't have 8+ megabytes, and some 386s had a 286-like 16-bit data bus, making them even slower (the 386SX).
Like in Sun SPARCStation ELC. No confusing colors or shades.
Which proves time travel exists, all those "two bits" references in old Westerns.
Here is how I set up minimal Desktop, WATCH 4 VIDEOS ABOUT HOW DEI IS KILLING OPEN SOURCE PROJECTS, and here is my loader.conf ...
https://git.sr.ht/~rabbits/fashware
I kinda wanna try linux again...
sees lunduke
closes blog post
Do you understand? :)
Is he "bigoted"? :(
apt-install --fuck-yes gay-rust-neofetch
I’ll look to migrate to chimera shortly, but only if it includes gay neofetch.
...errrrr, plot twist: he is a Jew himself, or at least he claimed he is.
Annnd it's another XLibre shill proselytising.
Or, what's the popular line in this scenario, "if you don't like it go make your own?"
I'm sorry... I just find the reliable cult of software personality on HN to be a little frustrating.
It's like "your car is going to get dirty why even wash it?"