So I feel like everyone is reallll fed up with the status quo of digital electronics regarding efficiency, simplicity, learnability (the list probably goes on).

I have two questions on my mind:

1.) Does it just seem like that? Is it the programmers in my bubble becoming old and grumpy, or is there really something brewing? Do we have enough powder for the computing revolution of the 2020s?

2.) Who is already out there tackling this problem, and at which level? I fear we may have to go very deep and compromise intermittently to improve. What projects are out there, with vision and roadmaps?

@s_ol aral may have some suggestions of projects... @aral - and something to say on the subject.

@Truck @aral
oh nice, i hadn't found their Mastodon account yet!

@aral

yup, I remembered this, but couldn't find it. Thanks @Truck for pointing me back towards it:

The Small Technology Foundation
small-tech.org/

@s_ol
@aral @Truck the #smalltech vision and principles are very inspiring and can be readily adopted by anyone disillusioned by big tech.

In these times we come to see again that #smallishuge

@s_ol Chuck Moore has been building his own parallel universe of computing-without-the-suck for several decades

the problem is it just doesn't interoperate; like ... I don't want to switch to a platform that doesn't let me participate in stuff like the fediverse or make SSH connections to other machines.
@s_ol there's different levels of what a "paradigm reboot" could entail, right?

there are people making incremental improvements that are very practical, like the MNT Reform or the Novena from a few years back (heck, I'm building my own modest contributions putting a Pine64 in the Atreus Deck) that just strip away the worst bits of capitalism but leave the Unix intact, and there are people starting over from the silicon gates, and then a bunch of stuff in between.

I really want to throw away Unix and the shackles of decades of C-induced 1970s thinking to trade it for a system that invites end users to shape everything to their will, but I also want whatever I build to be a thing I can use every day at my job, and those seem mutually exclusive.

@technomancy
Yup, I think you nailed it here, especially with the last part. I think somehow whatever comes next *has* to break compatibility if it wants to be better, but maybe there needs to be a mid-term plan for coexisting. One of the biggest pain-points for sure is supporting the web; without a browser you can't convince a single end-user.

@technomancy
Since reimplementing an OS and a browser from scratch, and doing it better, is essentially impossible, I have only two ideas:

- Build a whole alternate device form factor that just doesn't web. Back to the personal-personal computer or something, a machine for hacking.

- Build something lightweight that has the ability to run bare-metal, but also virtualized, and have it integrate A+ with existing operating systems, so users can use both until the current web is obsolete.

@s_ol @technomancy the web should be abandoned and relinquished to the capitalists. fundamentally this technology is inadequate to achieve its goals and must also be re-architected. we could have such an amazing hypermedia web, but unfortunately Tim Berners-Lee's projects, the Semantic Web and Solid, won't succeed because they rely on backwards compatibility in a hostile and chaotic space.

@theruran
imho with #solid, TBL and Inrupt also got their focus wrong. Having a commercial approach and an overly technical mindset just doesn't cut it. Plus they seem inwards-focused (not all in the community of course, but most of the core team are).

@s_ol @technomancy

@s_ol Guix is one project that I feel strikes a good balance between doing something radically different (and imho better) while also staying reasonably compatible with the existing Linux ecosystem.
And they are seriously targeting the Hurd, and it could be a great foundation for other projects. The community is also pretty cool!
It also tackles bootstrapping.

@s_ol Also, I'm only in my twenties and didn't start programming until I was like, idk, 16? So I'm not just old and grumpy. But I did read a lot about old tech and I hang around some people who have seen and experienced the old ways as well as the new.
When I started out, I was all starry eyed about Emscripten. Now, I can't wait to delete all modern browsers from my machine.

@s_ol I am researching this problem and incubating a project to replace the whole computing stack. unfortunately, I don't have much to show at the moment.

yes to the computing revolution of the 2020s!

@theruran
I would be excited to hear more as soon as you have something!

Maybe you could share what level of the stack you are branching off at?

silicon / kernel / userspace / browser

@s_ol branching off at the silicon level to avoid proprietary and user-hostile technologies. but really branching off philosophically because the lambda calculus is a safer way to formalize computing abstractions.

it's a great question and I am glad you asked it. there is sincere sentiment especially in the fediverse that modern computing is fundamentally broken. people are attacking this problem from all different angles and it's amazing, fun and beautiful.

@s_ol There are things that were not explored thoroughly, like #p2p.

@zig @s_ol ad-hoc mesh networks could be huge right now, when people are looking for a stronger sense of connection and community.

@s_ol Myself, I'm tackling The Web's complexity! And arguing that because of it we've lost something beautiful, which is needed not only for accessibility but also for the diversity of new devices.

And @akkartik has done some interesting stuff regarding programming languages, very low level!

As for 1) the best I can answer is: I'm 25, and have been programming for about 15 years.

@alcinnz @akkartik
I have to take a better look at Mu, but it is awesome that you are working on these goals. I also think that going through SubX is the right (although hard) choice.

In the Mu paper, you seem to be building up to be able to host a C kernel. Is this a self-imposed scope limitation to focus on the foundation first? Do you have some plans, thoughts or dreams for how Mu might grow up to the user level in the future?

@s_ol Mu is actually hosted _by_ a C kernel right now. That's a compromise I hope to undo over time.

The dream for the future is to be able to start writing interesting programs, while preserving three properties:

a) Source present, rebuilt on demand.
b) Drill down into the trace for any command on the command line.
c) Turn any sequence of commands into a reproducible test.

Major open questions: graphics, forking hypothetical file systems.

@alcinnz thank you so much for putting us together!

@s_ol Since you asked about visions and roadmaps:

I want The Web to work great on any I/O medium! And I try to balance being compatible with existing websites against doing something new.

Currently I'm implementing an auditory web browser, "Rhapsode", with hopes/plans to tackle print and then smart TVs. Or at a finer-grained level, I'm currently tackling in-page navigation & SpeechDispatcher integration, and after that I'll tackle links, voice recognition, and then forms.

@s_ol Oh, and you'd be interested to know: the geospatial standards organization "The OGC" is working on new models for representing locations on Earth called "Discrete Global Grid Systems".

They're really trivial to implement & render! But there have been challenges in getting buy-in from the broader industry.

And yes I've been working on this as well.
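
To make the "trivial to implement" point a little more concrete, here is a minimal toy sketch of a hierarchical global grid in the quadtree/geohash spirit. It's an illustrative simplification only, not the OGC DGGS specification (which, as far as I know, typically calls for equal-area cells and other constraints), and the function name cell_id is just something made up for this sketch:

# A toy hierarchical global grid: each refinement level splits the
# (lat, lon) domain into quadrants, producing a short cell identifier.
# Illustrative only -- not the actual OGC DGGS standard.
def cell_id(lat: float, lon: float, levels: int = 8) -> str:
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    digits = []
    for _ in range(levels):
        lat_mid = (lat_lo + lat_hi) / 2
        lon_mid = (lon_lo + lon_hi) / 2
        quad = 0
        if lat >= lat_mid:
            quad |= 2
            lat_lo = lat_mid
        else:
            lat_hi = lat_mid
        if lon >= lon_mid:
            quad |= 1
            lon_lo = lon_mid
        else:
            lon_hi = lon_mid
        digits.append(str(quad))
    return "".join(digits)

# Nearby points share a long identifier prefix, so prefix length stands in
# for spatial proximity, and truncating an identifier gives a coarser cell.
print(cell_id(48.8566, 2.3522))  # Paris city centre
print(cell_id(48.8606, 2.3376))  # the Louvre -- shares a long prefix

The real OGC systems use differently shaped (often hexagonal, equal-area) cells, but the hierarchical multi-resolution identifier idea sketched here is the part that keeps indexing and rendering simple.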

@s_ol Speaking for myself, yeah, I totally turned grumpy in the last 6 months, I've had a total breakdown when it comes to technology. I might never recover. I have not yet found anyone working in the direction that you mean, I'm actively looking. If I find anything, I will report back.

@neauoire @s_ol I'm sure some of it is simply aging in my case, but I've been getting more and more pissed off at technology for the last two decades. I know that, objectively, my PalmPilot was kind of shitty compared to my Samsung Galaxy. But I actually owned (or wrote!) everything running on it, and it wasn't surveilling me for advertisers and other miscreants.

I've also been programming long enough to watch the wheel reinvented several times, and usually not in ways that make it roll any better. The tendency seems to be toward things that might feel a little better to work with, but require an enormous stonking ecosystem just to pipe out "hello, world."

Much of this might just be my age showing, but I'd like to think that lack of ownership and complexity for its own sake should be pressures that fuel some kind of revolution. At the very least, I'm keeping my C skills sharp in case I need to manufacture ammunition for my fellow revolutionaries.

@lonnon @s_ol @neauoire Revolution can come only if people are frustrated with tech and get critical about it, but most people are super happy and excited *users*.
(I am not)

@raphaelbastide @s_ol @neauoire My worry is that people just don't know any better, having grown up without the context of where the technology came from. This has a lot of my own bias behind it, though; I'm in my late 40s, and I watched it evolve from exclusive to ubiquitous.

@lonnon @raphaelbastide @neauoire
I'm in my early twenties and feel the same, so it's not *just* that. But I do know more than the average person about the history of computing, maybe more than the average front-end developer.

@s_ol @lonnon @neauoire I think the issues are more related to the way we consume tech (should we even do that?) than to the tech choices themselves. Technological objects became objects of power for industries and devs (I think we are part of the problem), and objects of desire/placebos for users and consumers. This is bad because it hijacks the perspective of design from function to seduction. I think the emergency of our time doesn't need seduction. @kensanata

@lonnon @s_ol @neauoire It strikes me that the economics of all of this are really important. Developer salaries create a *desirable* barrier to entry. The more wonky your stack gets, the more easily justified a salary seems to feel. With many of the world's highest earners coming from tech, it self-perpetuates myths of complexity, but also the desire to own "the next big thing", exacerbating the number of options, especially at higher levels of the stack with the steady increase in developers.

@lonnon @s_ol @neauoire palm pilot is objectively better. I have a list of points. only real problem with it is it needs to sync with a mothership and that mothership needs to be running an old operating system. i won't get into it here but i could talk for hours about why it is better. but if you want an idea of where i am coming from you can check my list of ui principles pinned to my tooterfall.

@lonnon @s_ol @neauoire as to the original poster's question: things are getting worse. in the 1970s and 1980s ui designers learned a lot of hard lessons at xerox and apple. they wrote these down. everyone who gave a shit about those lessons was ousted from apple after jobs died. facebook and microsoft never cared much in the first place. if you take the list of lessons you can use it to review UIs like a checklist. i have a version of the checklist. fewer and fewer points every year check out.

@lonnon @s_ol @neauoire google, they uh? sorta care? but in a data centred way. see: 87 shades of blue.

@lonnon @s_ol @neauoire more and more actual expertise is being ignored in favor of an analytics driven microoptimisation approach, the goal of which isn’t to help you get work done, like the bicycle for the mind. the goal is single minded improvement of “engagement”, often taking the form of emotional manipulation and distraction tactics, the users are the product yadda yadda, but this is crystal clear in the design of a palm pilot vs. an iphone or android.

@zensaiyuki @lonnon @s_ol @neauoire I still have a copy of Human Interface Guidelines (1987) kicking about in my library somewhere.

My sense is that this isn't just about labor markets or capitalism. By this time, Dijkstra's structured programming turn had been normative for an entire generation.

Per Mills (1986), programming had gone from "writing computer instructions" to a "mathematics based activity". These "fundamentally altered human expectations" informed the decades that followed.

@beadsland @lonnon @s_ol @neauoire i have just read the rest of the thread. i am, kind of, unsympathetic with wanting to rebuild the entire tech stack. i mean, if you're doing it as a hobby, fine, but it won't fix the UI problems. not that the underlying tech has nothing to do with UI, but you can't boil an ocean, and whatever nitpicky issues you might have with unix or javascript or c or whatever, fixing them doesn't fix the ui issues. it's a distraction.

@beadsland @lonnon @s_ol @neauoire there is nothing so fundamentally wrong with the current tech stacks that you can’t build a new ui on top of it, or even a new language or whatever. you don’t need to grow your own wheat and mill your own flour to bake bread.

@beadsland @lonnon @s_ol @neauoire you can go through prototypes until it's sticky, then sure, go ahead and optimise the engine til it can run on an ez80 on two AA's.

@beadsland @lonnon @s_ol @neauoire but that’s just cranky me. i am sick of shitty ui and i am sick of programmers more concerned about icky programming languages than making the actual things capable of being used when i am 80 and have dementia

@zensaiyuki @lonnon @s_ol @neauoire There's an anthropologist, Annemarie Mol, who last I touched base with was doing research into the episteme of eating, her site of study being how eating is articulated (whether as "nutrition" or "meal") in nursing homes. What she's finding is that the episteme that organizes (to relate to Foucault, that "orders") eating has profound (and counterintuitive) effects on outcomes for nursing home residents.

Shitty UI is a product of institutionalized thinking.

@zensaiyuki @beadsland @lonnon @neauoire

I see this, but it's talking about end-user UI. For users, yes, it's possible to pave over whatever is underneath, and it's absolutely possible to build *anything* on the current stack. There might be drawbacks and inefficiencies, but it will be faster than building from the ground up, for sure.

The problem, in my eyes, is that programmer UX matters. And we cannot pave over that with more layers, I think (that's what no-code startups are trying, though).

@zensaiyuki @beadsland @lonnon @neauoire
I would like to live in a world where end-user programming is the norm, and where people use tools they can trust and understand. I don't see a path to that on top of the amassed layers of cruft.

@s_ol @beadsland @lonnon @neauoire i empathise with that vision, but i also know the makeup of the "user" population today is very different from the days of hypercard and basic. except for the outliers like us, people use computers because they need to now, they're a requirement for participation in society. they don't care about writing their own scripts and you can't make them care. that doesn't mean they don't matter.

@s_ol @beadsland @lonnon @neauoire also, there is validity to having our tech understandable, right to repair, that’s important too. all i am saying is not to assume it’s for everyone.

@zensaiyuki @s_ol @lonnon @neauoire The line between "don't assume it's for everyone" and "this is who it's for"...

Perhaps when you're 80, whatever manifests for tech in that decade won't be for everyone, because it's only for those young enough to grok it.

@beadsland @s_ol @lonnon @neauoire that’s a problem that we have now- in which there are 80 year olds today who can’t access benefits because they need to log in through a broken website from a broken os. who’s working on that problem?

@zensaiyuki @s_ol @lonnon @neauoire No one.

Bracketing our societal dereliction of infrastructure, as such, I'd argue that institutional procurement systems that hew to _specifications_ for ordering the design and deployment of computing systems are a significant factor.

It happens that this framing—emergent of the structured programming turn—fits well the industrial bureaucratic mode, even if it poorly represents (as in, not at all) the fluid process of iterative development and user testing.

@zensaiyuki @s_ol @lonnon @neauoire Folk using hypercard didn't care about writing scripts, if I recollect that era at all, more often than not, they cared about communicating a story in new and experimental ways, and hypercard provided visual intuitions about how to do that.

The key is that scripting wasn't a "feature" of hypercard tucked away in a submenu. Organizing the presentment of ideas and the relating of thoughts was the purpose of hypercard. And so those who wanted to do that, did so.

@zensaiyuki @lonnon @s_ol @neauoire Yeah, I don't know that rebuilding the stack would help. We'd just end up with a new sedimentary history.

But the UI issues come of an epistemology. That those checklists that came out of Xerox PARC are largely forgotten is because they represent a different way of thinking about human-computer interaction. A way of thinking that is fundamentally incommensurate with the current episteme, because the current episteme doesn't articulate interaction, as such.

@lonnon @neauoire @s_ol I think Palm specifically had some advantages from the circumstances of their target market, too, that modern mobile devices don’t have.

A breakneck development pace means you can’t slow down and consider whether things are a good idea or optimize things, and cheap performance makes optimization even less important.

Palm had been one of the first companies in the PDA market (having done the handwriting recognition and PIM for the Zoomer PDAs), and with those devices having basically flopped, they had the advantage of having learned from their mistakes.

Additionally, when Palm was learning, and preparing Palm OS, the PDA market was already considered a failed market, which reduced competitive pressures, and gave them time to think things out more thoroughly. (Note that this is why the first Pilots were released as “connected organizers”, not PDAs - retailers didn’t want PDAs, whereas organizers were an established product category.) This actually became a disadvantage when competition did appear - Palm moved too slowly to adapt to smartphones becoming a thing, and was forced to bodge phone support onto their OS, which seriously negatively affected stability. (And there was never a usefully good browser.) But, Palm OS was excellent in its prime (up to about 4.1) just because it had a chance to be so focused.

Contrast with modern mobile devices, where robust competition means everyone has to release new features at a breakneck pace, not having a chance to stop and ask whether things are a good idea… or even knowing something’s a bad idea but having to ship it anyway. And, then, there’s the whole thing where maintaining the eleventy billion layers of abstractions that should have never been created, but can’t be collapsed, costs so much that even with absolutely absurd economies of scale, you either have to pay the Apple Tax (although iOS has been buggy for quite a while, and Android flagships have gotten hideously expensive too) or deal with surveillance capitalism.

And of course the browser is its own eleventy billion layers of abstractions, which is why we’re down to a surveillance capitalist and a non-profit that’s selling its users out to surveillance capitalists making modern browser engines. (My own thought for a gordian-knot-cutting mobile device boils down to “pull an Opera Mini and run the rendering engine on a Xeon/Epyc in a datacenter somewhere”, which at least gets it off of the mobile device and likely reduces bandwidth requirements as well, but that’s not the most satisfying answer.)
