I wonder: If we required all programmers to use a slower device than their target for development would we see improved software? Limitations do breed creativity, after all.

Speaking of which, I’m making some progress getting my Pi environment set up to my liking. The hardest part so far is that my monitor is massive — I have a work-provided ultra wide display that the Pi has a bit of trouble driving. Trying to get the mouse to stop being so janky at the moment, but everything else is working reasonably well now.

@ndpi heavy Electron use and Docker setups have recreated this even on high-powered machines. If you’re running webpack, Docker, and multiple Electron apps at once, you feel the lag.

@peregrine absolutely. It’s pervasive in the web platform, but in general I think having super beefy dev machines allows people to ignore performance until deployment and then be caught by surprise by how slow things are. Worse yet, a lot of devs may not even notice the problem if they have the latest hardware, phones, etc., which many do.

@ndpi @peregrine this toot however does describe a common problem well - I'm not disagreeing with you there :)

@ndpi I think it's overkill to require that they use slow machines all the time ... they genuinely don't want to be made to wait for things to happen while they're actually developing.

But absolutely they should be made to use "typical" machines at least one day a week, and to "dog food" their product.

@ColinTheMathmo my point exactly! Using a more constrained environment requires you to put more effort into making efficient systems, both in your final product and your tools. It’s funny in a sad kind of way how bad our tools are, to the extent that to use them you need the latest and greatest hardware. Anything else just chugs.

@ndpi do the same thing to the bandwidth when developing applications that work over a network, and you might start seeing real thought put into how the bandwidth is utilised
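The kind of throttling being suggested here can be sketched with a simple token-bucket rate limiter. This is an illustrative sketch, not any particular tool's implementation; the rate and burst numbers are made up for the example:

```rust
use std::time::Instant;

/// Minimal token-bucket rate limiter: allows bursts up to `capacity`
/// bytes, refilled continuously at `rate` bytes per second.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    rate: f64,
    last: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, rate: f64) -> Self {
        TokenBucket { capacity, tokens: capacity, rate, last: Instant::now() }
    }

    /// Try to send `n` bytes; returns true if the budget allows it right now.
    fn try_consume(&mut self, n: f64) -> bool {
        let now = Instant::now();
        // Refill tokens for the time elapsed since the last call, capped at capacity.
        let elapsed = now.duration_since(self.last).as_secs_f64();
        self.tokens = (self.tokens + self.rate * elapsed).min(self.capacity);
        self.last = now;
        if self.tokens >= n {
            self.tokens -= n;
            true
        } else {
            false
        }
    }
}

fn main() {
    // Illustrative cap: ~5 MB/s sustained, 64 KiB burst.
    let mut bucket = TokenBucket::new(65_536.0, 5_000_000.0);
    assert!(bucket.try_consume(65_536.0)); // a full burst fits
    assert!(!bucket.try_consume(65_536.0)); // immediately after, the bucket is drained
}
```

In practice you'd put something like this (or an OS-level traffic shaper) between the dev environment and the network, so every request during development pays a realistic bandwidth cost.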

@laumann totally. I think games are a prime example of this wastefulness. I recently had to cancel a gaming session with a friend because my Xbox decided it needed 64GiB (!) of updates and I’m on a standard UK connection of ~40Mbps.

@ndpi oh wow, that's just... insane. I notice it mostly with webthings. Just preventing most sites from downloading ads from third party sites saves at least half of the bandwidth.

@ndpi that's still on my todo! I'm currently using an expanded hosts file + script blocker. Is it worth it?

@laumann I think it’s complementary. The Pi Hole is nice because it applies to the entire network.

@ndpi this sounds like an unworkable technical solution to an awareness/empathy problem?

(maybe workable if by "all developers" you mean a group of devs you can direct to use slower hardware, but still a fix at the wrong layer imo)

@ndpi i think what you're touching on is accessibility issues due to digital divide (possibly a better term needed to describe the digital gap between the high-spec and mid/low-end consumer experience)

Technical solutions like defining requirements for experience on lower end gear and testing as such would be more efficient imo than giving devs low end kit?

And matching social solution is ensuring devs are aware of & don't dismiss this experience consideration. Also a requirement.

@xurizaemon it’s more of a shower thought than an actual suggestion. I think you’re right that it is a matter of having an understanding of your users, but also of being anchored in the same world, so that by improving your own experience you elevate that of others.

@ndpi @neauoire I'm not sure there's a ton to be gained by doing this to developers. Product and project managers, though…

See, a lot of the time, developers want to ship efficient things. But management doesn't want to spend money/time on technical debt: they want features, features, features

Sure, some developers have internalized that pressure too. But helping more developers understand/value performance won't help if the people signing their checks will punish them for spending time that way

@calcifer I think you’re right about the system pressures for making bloated software, and addressing that is certainly necessary.

That said, as a developer myself I can say that a lot of devs don’t actually know how to or care about shipping efficient software; they’re more interested in features too. There’s plenty of open source stuff that is janky as all get out and relies on the user having access to a ton of computation, disk, or network I/O.

@ndpi that's what I meant by "some developers have internalized the pressure too"

And I'm not saying developers wouldn't benefit by using software under tighter constraints; I'm saying there's limited utility in that while the "customers" reward inefficiency. If PMs and other managers and leaders could be shown the importance, the developer care would follow, because leaders heavily influence culture.

As a developer, I care about efficiency and maintainability because as I was learning I was working in cultures that valued those things. So I learned by experience and association why it mattered. I didn't need to use slow machines myself, because my mentors and org leaders helped me see that respecting resources was a sort of "good citizenship".

It kills productivity though; that's why it is discouraged in business, where you have to deliver ASAP or lose nerves and money.

Yeah, stuff like the NES and other similar systems had state-of-the-art optimization.

@replikvlt indeed, and you can see the way the restrictions forced people to be very intentional about how they made their software because to do otherwise just wouldn’t work.

@ndpi @courtney why not? Sound engineers will play their mix on high-end equipment and on shitty car stereos to make sure the experience is as good as it can be for as many people as possible.

Useful? I don't know. Having a fast machine avoids day-to-day friction on the job. I like to have lots of tabs open, music playing, quick compiles, and an IDE that helps me get my brain into the computer smoothly.
On the other hand, we need to be clear what the target is, and actually have that target computer on hand.

In my case, I hassled my company to buy second-hand smartphones and laptops that we test on, and make sure the experience is acceptable.
One thing that this methodology has led me to in the past is to do video game style "quality settings", dynamically limiting animations, shadows and other "nice to haves" on lower end machines. Since the best experience I can offer you will change depending on how much I can ask your computer to do.
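The "quality settings" approach described above boils down to picking a tier from a measured budget. A minimal sketch, assuming the tier is chosen from an average frame time collected during a warm-up period (the thresholds and tier names are illustrative, not from any particular engine):

```rust
/// Coarse quality tiers, picked from a measured average frame time.
#[derive(Debug, PartialEq)]
enum Quality {
    Low,    // no shadows, minimal animation
    Medium, // cheap shadows, reduced particle counts
    High,   // everything on
}

/// Pick a tier from the average frame time in milliseconds.
/// Thresholds are illustrative: 60 fps is ~16.6 ms/frame, 30 fps is ~33.3 ms.
fn pick_quality(avg_frame_ms: f64) -> Quality {
    if avg_frame_ms <= 16.6 {
        Quality::High
    } else if avg_frame_ms <= 33.3 {
        Quality::Medium
    } else {
        Quality::Low
    }
}

fn main() {
    assert_eq!(pick_quality(12.0), Quality::High);  // fast machine: all effects
    assert_eq!(pick_quality(25.0), Quality::Medium);
    assert_eq!(pick_quality(50.0), Quality::Low);   // struggling machine: shed the nice-to-haves
}
```

The point is that the decision is driven by what the user's machine actually delivers, rather than assuming everyone has the developer's hardware.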

@gaeel it’s exactly that friction which this thought experiment proposes would encourage developers to make leaner, more focused software. Why is it the case that we need so much to do so little? Why does it take gigabytes of RAM to display documentation or to play music? Why do our compilers and other dev tools waste so many cycles? What justifies using hundreds of megabytes, or even gigabytes in network I/O for common tasks?

I think in general we have a waste awareness issue.

@gaeel all this coming from a Rust programmer who needs to use some massive Docker images for work stuff, so I’m including myself and my toolset here. It is rather sad that increased machine capacity usually (not always, if the incentives are right) winds up being wasted in the name of “developer productivity”.

I think there's a balance. Not needing to rely so much on optimised code means that we can advance on features quicker, and less technical people can learn to make programs.
There's a wasteful side to all this too, of course, and I don't think we can afford to not care about performance, but I love that we live in a world where it's possible to quickly make software that looks nice and is useful, without having to study the inner workings of a CPU

I think we're hitting peak "free performance" quickly though, we're capping out Moore's law, and while we're able to crank up core counts higher than ever before, most software isn't really able to take advantage of multiple threads. I didn't feel the same boost going from 4 to 8 cores as I did when I went from 2GHz to 4GHz.
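The core-count point is that extra cores only help if the work is explicitly partitioned across threads, which most software doesn't do. A small sketch of what that partitioning looks like, using Rust's scoped threads (the function and chunking scheme are made up for illustration):

```rust
use std::thread;

/// Sum a slice across `n_threads` worker threads. Extra cores only help
/// because the data is explicitly split into per-thread chunks first.
fn parallel_sum(data: &[u64], n_threads: usize) -> u64 {
    // Ceiling division so every element lands in some chunk.
    let chunk = ((data.len() + n_threads - 1) / n_threads).max(1);
    thread::scope(|s| {
        // Spawn one worker per chunk; scoped threads may borrow `data`.
        let handles: Vec<_> = data
            .chunks(chunk)
            .map(|part| s.spawn(move || part.iter().sum::<u64>()))
            .collect();
        // Combine the partial sums from each worker.
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1_000).collect();
    assert_eq!(parallel_sum(&data, 4), 500_500);
}
```

A single-threaded program gains nothing from this kind of hardware, which is exactly why the jump from 4 to 8 cores feels smaller than a clock-speed doubling did.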

@ndpi and force designers to use old & small CRT monitors in bad lighting, whilst also wearing vision-impairment-simulation glasses (e.g. )

@fluffy data rains from the cloud into the streaming, before falling off the waterfall

Most developers do use a slower computer (less RAM, less CPU) than the server that will execute their software. (Even dev servers are usually less capable than prod servers.)
But I guess you mean Android and iOS app developers.

@rimugu I was mostly thinking about developer tools and consumer software, where the developer might ignore the fact that their users have lower-end devices. Not much of a problem for server work, though that can be hugely wasteful as well. For mobile, most people use a test device anyway.

@ndpi That used to be a thing here - Australian web developers were really good because for years our Internet speeds were terrrrrrrible.

@ndpi Yes. So many websites I've been on fail if you have a connection blip, which was a MAJOR issue when I was living in a student house with 7 people using a cheap router the landlord bought, or when I was in a house up north with bad internet.

Loading even Blogger pages was a huge pain: they load so many extra scripts that just reading a blog post was a nightmare on a slow connection.


A couple of years ago I suggested all web developers be forced to spend a few months using mobile data exclusively on an old laptop as part of their education.

'dogfooding reality'


Merveilles is a community project aimed at the establishment of new ways of speaking, seeing and organizing information — A culture that seeks augmentation through the arts of engineering and design. A warm welcome to any like-minded people who feel these ideals resonate with them.