You're absolutely right! I wasn't there either, and I wouldn't want a terminal-only machine.
My goal is to provide the means and knowledge to those who want it. The Ashet is meant more for technical schools, apprenticeships, and "the interested", and definitely requires an initial spark that makes the project interesting to you.
It's here to fill a (perceived) gap, and it sits more in the professional-education space than in "get the kids into computing".
I'd love to have had such a hands-on device in school instead of the bare theory of circuit operation and basics of programming.
That's gonna be a while coming - we're now entering a stage where we won't even understand the code that gets written.
Now, sure, some holdouts will understand the code, but that's not going to be the norm soon.
Coming from HomeKit, I've tried (_really_ tried) to move to Home Assistant. The gap in usability was too enormous for me to cross, and I'd brand myself a hacker.
I won't trust any kind of a "smart" device to operate the front door lock - ever - but smart lightbulbs are still stupid lightbulbs. I can just flick the switch.
Privacy concerns are valid - I can be profiled based on usage. But it's not like Apple doesn't know my precise location already.
With all that in mind, I'd say usability comes first.
If the lock freaks out - and yeah, I'm having problems with my HomeKit stuff every now and then - I need the physical key on me. That already defeats the purpose. Otherwise I need to grab the spare key from wherever I keep it. This is not a mere inconvenience, this is an emergency procedure.
But I do wonder if LLM-driven code analysis might actually increase code comprehension and agency for laymen?
I've been quite impressed with AI's ability to decipher and visualise code and system relationships in e.g. mermaid diagrams.
Perhaps the representation of code will become more elastic (i.e. you have literal code, or AI-produced translations you can directly manipulate)
Even on Amazon the ESP32 is less than $5 - which means something like $1 in Shanghai. Various sensors (even the ones with Bluetooth connectivity) are similarly dirt cheap. You can have a bin of such components the way you would have had a bin of bolts and nuts 30+ years ago. Basically, we live in a golden era of development (which could disappear in the US due to tariffs).
>If we think what can come beyond screens and imagine more ambient computing systems - maybe we’ll see new and interesting innovations
my bet is that it will be more robotics-related, with practically no humans involved. It's a bit paradoxical: if we develop existing robots far enough, we could, for example, have an AMZN warehouse run entirely without people, which in turn would let the robots there be much simpler in various respects, since the absence of humans relaxes a bunch of requirements.
Locally, $4 is probably "more money" to us than $5 is to you.
Don't get me wrong, it's still a marvel that we can have something so good so cheaply -- but correcting for cost of living, it feels less affordable for us here in Asia.
Anyway, not a criticism. Just sharing a slice of life from over here in case you were curious.
Also, besides some really huge companies, I would be nervous as a business to rely on a third party so much that I didn't have a workforce of my own.
Yes, it works for automotive (extremely consolidated sector with huge capital), or Amazon, or chipmakers... But they've already gone through that transition. Who else needs that?
The real bottleneck is getting custom PCBs made.
The best companies that do this are in China and will soon be tariffed.
I did consider Eurorack, 19" racks, MiniATX and other case standards. But with any of these options, I wouldn't be able to keep the price low. The case, including all parts and assembly cost, is roughly at 20€ per device. It uses off-the-shelf parts, laser-cut parts in two materials, and a single custom component for which the cheapest manufacturing option is yet to be determined.
I wanted to keep the price low, and any of the options above would increase the price by at least 40€, which is a substantial part of the component cost.
Heightwise, it fits: 100mm vs ER's ~128mm; depth is trickier - 180mm vs ~100mm for "deep" ER modules. One option would be a rack with no back panel. Power in ER is already distributed via a ribbon cable, so this shouldn't impact the ability to install standard modules. Electrically - 12V, a perfect match.
What is the width of each module? Assuming (250mm-case)/8 ~= 30-32mm?
It's definitely possible to re-layout the backplane into a different mechanical design and reduce the number of slots.
Final specs will be shared soon. I have to do another revision of the hardware design, as I figured out mechanical properties like that the "stickiness" of PCIe slots is good enough that you don't actually need screws to fasten the modules.
I also have ADHD, but you can get help with the focus problem ;)
It's incredible to work for many years on the same thing and it's so pleasing to have steady progress.
So: go, start building a vision for your cyberdeck! Having a clear goal helps you work towards building it, and you can embrace the ADHD by jumping around between the many aspects of such a project:
sw design, hw design, visual design, mechanical design, handcrafting, soldering, programming, making websites, asking people, ...
The most interesting part to me right now is running Zig on a microcontroller, since that's something I really wanted to do this year (like a new year's resolution thing). I've been interested in Zig for a while, but I've never done any serious learning/programming with it, so I might look at your OS repo to at least gain some inspiration on how deploying to a microcontroller works.
It also seems like it would be a really good way to learn embedded electronics in general, since if I wanted to learn something new, e.g. lower-level network programming, I could make and program a network card, or USB hosting, or anything like that.
I'll definitely try to do it one day, once I've finished all my current projects haha
The best thing that helped me: Force myself to one single project only. And if I really don't want to do the current one, do a ___really___ small one, that's pure fun. I have to remind myself of that again and again, but it helps
It has now taken me 3 iterations and 3 years to get a somewhat working RISC-V CPU written in VHDL that I am mostly satisfied with. It's a crazy hard struggle (finally pipelined though!). Everything breaks, I constantly get headaches and almost want to quit, but somehow I can keep on pushing this project. It helps me a lot.
Almost nothing works, things break all the time, it doesn't even look cool. But somehow it's satisfying that I can type words with a PS/2 keyboard on a VGA screen getting processed by my own RISC-V CPU (running Rust btw lol. I ditched that by now though, the struggle was too big). See here: https://i.imgur.com/PtKeAYt.mp4
I have no idea how far I can make it, maybe not fully self-hosted, but a small OS should be doable. I have interrupts and timers working, so I should have everything I need.
The downside is, I am way worse than OP and have no documentation at all. I should do that, but that's where the motivation is missing.
Edit: I type so slowly in the video because I poll the scancodes (too slow) and had to think about which characters not to press, because I don't accept all of them yet and was scared it would break.
Wanna hop onto my crazy train and make a second main board?
I wanna make the Ashet Home Computer an open and free platform, so if you implement the backend control interface, you can use all of the environment yourself.
> I am way worse than OP and have no documentation at all.
Guilty as charged at the current point. I do have a large phase of development planned where I will create entry-level docs for everything, as these are an essential part of the project.
The video is incredibly cool! Ashet started as a SoC in an ECP5, but the RP2350 basically hit all my spec requirements better than I could've done it myself, which made it a no-brainer to ditch the FPGA.
There are many tricks to stay focused, particularly when there are no stakeholders but you. Try a body double, someone to keep you in check - you can also return the favor, like working from the same room, or just coming to their home to watch them clean while you do your stuff.
When you do get hyperfixated on something - that can be a very productive period, until you crash. Again, bring in a friend. Not to get back to it - just to make sure you're still eating. Don't push yourself to either side, find your own balance around it - I've picked up many hobbies where I've slowed down, but maintained interest over the years, decades.
Look for buddies in your local neurodivergent group, there's usually one somewhere near you. (Even here on HN, it turns out.)
https://github.com/Ashet-Technologies/Ashet-OS
Thought it might be of interest to people learning Zig. I bet there are some interesting examples in there. However, I will question some design choices!
- 32-bit only: the writing is on the wall, many vendors (HW&SW) are slowly moving to kill it off. I guess it's fine, but IMHO each ISA (regardless of pointer width) should just be considered a different ISA. Portability is good, it ensures your software doesn't hardcode too many assumptions about the platform - and weeds out bugs. Just treat an int like an int, and a pointer like a pointer. Distinct types.
- Pure co-op multitasking: it will be completely fine, until you try serious in-system development. Hard reboot any time I make a mistake in a for-loop? Mercy, please :,) Implementing a simple watchdog via a timer interrupt will still keep all of the scheduling logic simple and stupid: it's just another case for a yield call, except now involuntary. The runaway process will simply never see completion, but seeing that it would never yield anyway, I don't see a problem just killing it. And most importantly, the sanity shall be preserved.
Yes, the OS is highly opinionated, as just making another Linux would be incredibly boring for me!
The 32-bit only constraint is mainly due to my focus on smaller architectures, especially microcontrollers.
x86_64 and aarch64 both have much more complex initialisation schemes, and also use much more complex page table setups.
Thus, I wanted to keep that out of the system, considering I'm targeting systems with a tiny fraction of the 4 GiB memory limit.
The co-op multitasking is part of the OS, and the OS doesn't give you preemptive multitasking. I never said I won't implement sanity safeguards! Just killing off a hanging process that doesn't yield for more than 1-5 seconds is totally in scope and increases user friendliness. But considering the system reboots in 1.2 s on target hardware right now, the user may just have hit reset by then :D
Riddle me this, Batman.
What's the scope of "fully understandable?" How much of this home PC could be reasonably audited by individuals or small teams?
I've got no exceptional opsec needs as an individual, but I spend some time wondering the minimum required resources to audit a PC. Looking through the docs I see cases where there are multiple suppliers for a recommended part -- that's very cool!
As a "fake programmer" and web jockey, this looks like the right balance of complexity to learn with.
I just don't think modern CPUs really quite fit the claim of "fully understandable by a single person". I mean maybe technically but that is misleading in an educational context where there are much simpler computers that are definitely fully understandable.
Maybe all of the stuff he wraps around the main CPU is understandable though. And the expansion cards are cool.
Are there any other projects or resources in this space that you'd recommend?
A friend and I cut our teeth on those AlphaSmart word processors that ran BASIC. I might be able to wrap my head around that.
https://youtube.com/playlist?list=PLowKtXNTBypFbtuVMUVXNR0z1...
Some day, whenever I have the money to skunkworks this properly, I've wanted to create something like a modern spiritual successor to the Atari ST with enhanced creature comforts.
Something with a CPU based on POWER architecture (like microwatt) with a simplified multicore design (no hyperthreading or weird BIG+little core design - just straightforward homogeneous cores), a simple expansion interface of some kind, and an OS baked into ROM. Then I'd consider it to be built around a long term support model, with one design that can last decades, complete with schematics, chip design reference guide, and an open specification so it can be easily cloned as desired.
Especially now that Moore's Law and Dennard Scaling has slowed down considerably, it could be a fun platform to target for education or the demoscene, instead of spec chasing.
The OS made me wonder how far someone could get trying to create a GUI for the 6502. I suppose the Apple II (GS?) headed there before the Mac fully took the reins and the Apple II was left out to pasture.
https://youtu.be/_4nthOx8sA4?si=AiK9bRxRQwV3MB0f
There's also this Atari homebrew
https://youtu.be/T14dL9MeMHE?si=cGtsZGWILYi4jcql
And yes the IIGS had one
I realize that 8MB of RAM seems absurdly small to modern audiences, but I can assure you that I ran early versions of Turbo Pascal and compiled fine with 64K.
I know, I know! This is what makes me a bit sad. I don't know of any modern compiler I can use on the platform, as most hobby compilers target aarch64 or 32/64-bit x86.
What I need, though, is a compiler that targets Arm/Thumb-2.
My research tells me that this confines me to:
- An incomplete patch series for TCC (maybe)
- LLVM
- GCC
As two of them obviously won't run on 8 MB, my options are stripped down to:
- Evaluate CBE, write my own backend, and add 32-bit support
- Write my own compiler + backend
Both options don't sound viable before the release as they would increase the scope greatly.
Considering this uses a RP2350, I am pretty sure that no single person on earth has a full understanding of this Computer.
So I'm pretty sure there are plenty of folks who understand what's going on, especially if they approach their study of the Ashet from the perspective of the RP2350.
Basically, it's one hell of a Swiss Army knife for building computer systems.
I don't think things are as difficult to understand as you do - but then again, I grew up with 8-bit computers where it really was competitively important to understand how they worked - and I don't think the cyclomatic complexity of the Ashet is much greater than anything from that era.
A particular sweet spot is emulating 8 and 16 bit systems, as latency can be just as good as an FPGA setup. The infoNES emulator has been running on RP2040 for a while, and I see projects for Sega Master System, Genesis, Apple II, and Mac in the works. But you can also write much more powerful software natively.
Likely it will be possible to adapt software between these various RP2350 systems.
[1]: https://github.com/DusterTheFirst/pico-dvi-rs/wiki/RP2350-DV...
cool!
> Dual Core CPU
hm that will make for some interesting first steps in learning
The Apple II had a really cool disk drive because of how it did what it did with so little hardware. By relying on the single CPU for everything it was elegant, advanced, interesting... but perhaps not so easy to program.
https://www.bigmessowires.com/2021/11/12/the-amazing-disk-ii...
It can be done - if you take a holistic approach to hardware + runtime + development environment.
The Propeller probably failed because of the custom language, the custom assembly syntax, the custom ISA, the custom IDE font (!) etc. It was a very neat system though.
It was just too unusual in too many ways.
In one way it’s a bit like the Amiga vs the 8088/8086 PC.
What makes microcontrollers commercially successful is... commercial use. Hobbyist applications are fun, but they don't pay the bills.
*: Microchip hadn't bought them yet
Great point. I highly recommend Crowd Supply for this type of project (extremely technical target customers), especially if this is the first campaign you run, as their team helps much more with the nuances of running a successful campaign.
(I know this is not the place for ads, and I’m not affiliated though I run crowdfunding campaigns on all the platforms mentioned.)
That sounds exactly like what I had in mind, and I really wanna do the same when my boy is old enough for computers.
It's a teaching tool and a fun toy to tinker with
I'm somehow very confident in this while also being sure that people probably thought very similar things about home radios destroying the youth in the 1920s :D
[0]: https://512pixels.net/2024/03/apple-jonathan-modular-concept/
>Raspberry Pi RP2350 Main SoC
Yeah right.
Why a hardware project at that point and not a virtual machine like pico-8?
I'm just saying, it's kinda the opposite approach a hardware person would take.
Please take a look at the gallery, where there are photos of the actual electronics setups!
Also, don't the mechanical mockups count as hardware? A pile of jumper wires, breadboards and devices doesn't make a good hero image, but physical hardware mockups do.
Also the electronics design in its current form is actually iteration 5 of the system, while the OS development started with iteration 2.
The OS does boot on the electrical prototype.
I strongly disagree! Hardware people love seeing that sort of thing - the more guts you show, the better. It means you've gotten something to work and probably know what you're talking about. Take pride in what you have accomplished so far! Ideas and concepts are a dime a dozen; working hardware is a worthy milestone.
I'll add a new "cleaned up" photo that isn't entangled with kids' stuff and other desk content :D
Sadly, it really looks atrocious and it's currently a 3D build which is hard to photograph.
You can subscribe to the E-Mail newsletter linked on the front page (or on Community)
https://ashet.computer/gallery/img/ashet-hc-devsetup-02.jpg looks like hardware!