> Today, the Servo team has released new versions of the servoshell binaries for all our supported platforms, tagged v0.0.1. These binaries are essentially the same nightly builds that were already available from the download page with additional manual testing, now tagging them explicitly as releases for future reference.
> We plan to publish such a tagged release every month. For now, we are adopting a simple release process where we will use a recent nightly build and perform additional manual testing to identify issues and regressions before tagging and publishing the binaries.
> There are currently no plans to publish these releases on crates.io or platform-specific app stores. The goal is just to publish tagged releases on GitHub.
- Blog: https://servo.org/blog/
- Most recent "This Month in Servo" (TMIS) post: https://servo.org/blog/2025/09/25/this-month-in-servo/
Check them out if you're interested in what's going on with Servo.
That said, I'm recently back on RSS and this is another good feed:
(Oh! I wonder if Servo will bring about a new, JS enabled, TUI browser?)
It suggests a couple of things...
All in all, an impressive release.
I think the parent is imagining a desktop with Servo available as a standard library, in which case you're left with the same complaint as Tauri (rather than Electron): that the system version of Servo might be out of date.
Though I’d also be interested to see how slim it could be with static linking.
Presumably a lot of code could be compiled out with dead code analysis? Or compile flags could remove old compatibility cruft and unneeded features?
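For a concrete flavor of the compile-flags idea, here is a minimal Rust sketch of feature-gating; the `quirks_mode` Cargo feature is hypothetical, not an actual Servo flag:

```rust
// Hypothetical feature gate: when built without `--features quirks_mode`,
// this function and everything only it calls is never compiled in, so
// dead-code elimination shrinks the binary.
#[cfg(feature = "quirks_mode")]
fn apply_legacy_quirks(html: &mut String) {
    // Pre-standards compatibility shims would live here.
    html.push_str("<!-- quirks applied -->");
}

fn parse(html: &mut String) {
    #[cfg(feature = "quirks_mode")]
    apply_legacy_quirks(html);
    let _ = html; // placeholder for the real parsing work
}

fn main() {
    let mut page = String::from("<html></html>");
    parse(&mut page);
}
```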
Personally I'm more optimistic about Servo - because it originated at Mozilla, I imagine more web browser experience and expertise went into its architecture, and also because Rust.
Andreas Kling who created Ladybird had prior experience working on KHTML/WebKit so there is expertise there too.
Per: https://www.igalia.com/2025/10/09/Igalia,-Servo,-and-the-Sov...
I was curious how you arrived at that figure so I checked the dates. Servo began in 2012 as a Mozilla skunkworks project, died off in 2020, and was revived in late 2023. If you simply subtract the "dead" period, sure, it doesn't look like it was going anywhere fast, but that's ignoring the multiple major changes in direction and the 5+ years during which Servo development was fully subordinate to Firefox development. It only became a fully independent browser development effort after the project was revived by Igalia.
If you're worrying about that size then Mac OS is not the platform for you.
And it's not about absolute size, but compared to Chrome/Electron you'd expect a fresh modern codebase to be somewhat slimmer and faster.
How big is Ladybird?
[1] I believe you can make Electron smaller by cutting parts of Chromium out, but the default is around 100 MB
The other obvious target is the JS engine. IIRC V8 is 90 MB just by itself. I don't think SpiderMonkey is quite so large, but it's definitely still in the tens of megabytes. A slower, simpler JS engine (QuickJS, Hermes, etc.) could be quite a bit smaller.
Binary size, however, is less of an issue for most users.
https://codeberg.org/bptato/chawan/src/commit/3f2fffd882ff47...
It just spins up a background process when a canvas context is created and sends drawing commands through IPC. As a result, you can rm the 970k canvas binary (most of it is just Unifont) and with some luck you will only break canvas rendering.
Of course this only works for things that are relatively self-contained. So you can add/remove image decoders or protocol handlers without recompiling (the main binary has zero networking code), but the JS API is still baked in.
(I imagine you could also implement some less performance-sensitive APIs in JS and load the bytecode on demand, but I haven't tried.)
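For illustration, here is a rough Rust sketch of that out-of-process canvas pattern (Chawan itself is written in Nim, and the `canvas-renderer` binary name below is made up): the main process spawns a renderer and streams drawing commands over a pipe, so deleting the renderer binary only breaks canvas.

```rust
use std::io::Write;
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Spawn the (hypothetical) renderer process; if its binary was rm'd,
    // this fails and only canvas rendering is lost.
    let mut renderer = Command::new("canvas-renderer")
        .stdin(Stdio::piped())
        .spawn()?;
    let pipe = renderer.stdin.as_mut().expect("piped stdin");
    // One textual command per line; a real protocol would use binary framing.
    writeln!(pipe, "fillRect 10 10 100 50")?;
    writeln!(pipe, "strokeText 20 40 hello")?;
    Ok(())
}
```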
Ladybird uses the BSD-2-Clause license, which is OSI-approved. I mean, it's not FSF/copyleft but permissive, which should sometimes be better for things like embedding, etc., no?
It looks like Servo uses the Mozilla Public License 2.0. Can you please explain the difference to me and why you think one is a pushover and the other is not?
As far as I can see, for an author of a derivative work, permissive licences are only really preferable when you either can't or don't want to grant or preserve the freedoms which a copyleft licence would require you to grant and preserve. (Which, to be fair, may often be the case.) From a different point of view, copyleft can be seen as better for embedding, since it means that the Free Software in question will only be used to make more Free Software.
The MPL is a copyleft licence, but it’s known as a ‘weak copyleft’ licence. That means it preserves only the freedom of the program it initially covers; any changes made directly to that program can only be distributed as Free Software, but the program itself may be used and distributed as part of a larger work, which as a whole does not have to be Free. (This is in contrast to ‘strong copyleft’ licences like the GPL, which require the entire larger work to be Free.)
Weak copyleft is a kind of compromise which lets you e.g. embed a piece of software without having to grant all your users freedom to use, share and modify your entire work, but you're still required to grant them those freedoms in regard to the piece you’re embedding.
Servo is no longer a Mozilla project, and hasn't been since 2020. It's now developed by Igalia, Huawei, and a collection of volunteers.
Ladybird is C++ and that still has the same issues as every other engine.
I suspect Ladybird will leapfrog (or already has leapfrogged) Servo in performance and usage due to the Ladybird team and its momentum. Mozilla isn't doing anything with Servo anymore.
But I also don't really see a compelling reason for Ladybird's existence - we already have Chromium, Blink, Gecko, etc. It's hard for me to imagine a world where Ladybird is a healthy contender for marketshare.
The only real novel thing to do in this space is "rewrite it in Rust".
I don't see that changing any time soon. If Apple truly wanted Swift adoption to be cross platform, they have the resources to do it, but they didn't do it.
That's a key feature of what Apple want Swift for (to gradually replace their C++ projects with Swift) but it's still pretty new. It'll take a while to mature.
Swift is horrible to develop in cross-platform. The language ergonomics are great, but the support just isn't there.
Also - Swift is great for lots of applications, but a browser? Why use an automatically memory-managed language for something that needs to be smooth? Swift uses ARC rather than a tracing GC, but reference-counting overhead and large deallocation cascades still produce jitter you don't want when smooth scrolling and rendering. Granted, JavaScript already negates that experience a little bit, but why introduce even more unpredictability?
I get the feeling the leadership loves Mac/Apple, which makes sense in light of their recent iOS announcement. Maybe they're prioritizing that world.
Ironically Chromium is now starting to include quite a bit of Rust. And of course Firefox has for some time.
This is "In Review"[1], whatever that means.
[1] https://connect.mozilla.org/t5/ideas/ability-to-embed-gecko-...
I'd be glad to be proven wrong.
Servo has some areas where it performs very well, though support for things like CSS Grid is still experimental and off by default (but working pretty well), and it's still missing a few important APIs. (I haven't checked in a while, but last I looked it was missing AbortController. That's pretty unfortunate and probably breaks a lot of random stuff across the web.)
I think Ladybird in Nixpkgs still doesn't have fully working hardware acceleration, so I'm probably not seeing the full smoothness Ladybird can offer in an optimal setup. Still! It runs pretty well, and it supports the vast majority of the web platform. Right now, for my own development, the only two major omissions I've noticed are Origin Private File System, which is pretty new, and OffscreenCanvas, which actually is "implemented" but only as a stub. Throwing Web Workers and heavy WebAssembly workloads at Ladybird seems like no problem at all, and it renders a very reasonable subset of modern CSS almost perfectly.
(Tangent: Ladybird Web Workers are actually separate processes, which is kind of funny. If one runs out of control, you can literally SIGKILL it. The browser copes with this seemingly fine!)
Unfortunately, a lot of the web uses libraries and frameworks that are happy to eagerly adopt web standards and technologies that are not supported by browsers from a year or two ago. While I realize a lot of these standards were created specifically to solve problems that real developers have, I've been unimpressed watching this play out. Often, new technologies increase complexity to solve relatively minor, already-manageable issues with the web platform, and I feel like it is counterproductive to fatten up the web platform just for that. I mean, really. There eventually has to be a point where we all agree that there is simply too much CSS technology and there is little justification to add more, yet it just continues to grow uncontrollably. (I'm not saying it's all bad, though. Personally, I think the text-wrap additions were actually pretty nice things that really do need to be part of the layout engine.)
Servo is very welcome; a third leg to the stool makes real diversity possible again.
What's interesting is seeing a few non-Apple WebKit browsers pop up, like Orion (Kagi) and Epiphany.
Call me cynical, but I don't see Ladybird or Servo do much beyond making a splash. Browser engines take an incredible amount of dev hours to maintain. Ladybird is hot now, but what about in a decade? Hype doesn't last that long and at that point the money and a chunk of the dev interest will have dried up.
Blink and WebKit both have massive corporations championing them, so those engines do not run that risk.
There's always risk. IE/Edge also had a massive corporation championing it, until it didn't. The US DOJ also appears to be considering actively preventing Google from backing Chrome, which could also do for Firefox, given that its revenue comes from the same source.
No doubt that wouldn't completely kill those engines given our reliance on them, but in those kinds of circumstances we might welcome the existence of some simpler engines that are cheaper and easier to maintain.
Time will tell if that will be a big problem or if more mainstream ways of doing things are better for a project intended to run everywhere!
That is not their goal at all; I don't know where you heard that. Swift is currently stalled due to some blockers listed on their issue tracker, but any usage of it will be in safety-critical areas first, not a complete rewrite of existing code.
While the C++ interop in Swift seems sane with Clang being embedded I wonder how much time/energy they will have to actually move significant parts if it's so large already.
https://ladybird.org/#:~:text=The%20choice%20of%20language%2...
That doesn't even touch some of the more salient political movements or failure after failure to spin the brands off into something more/different for profit motives.
Mozilla needs to restructure as an engineering-focused organization where business operations, marketing, and brand management are not steering the ship.
They had $91 million in 2009, $105 million in 2010, $193 million by the end of 2011, and $372 million by the end of 2015. I don't have every number for every year, but it all seems to indicate a steady upward trend.
I'm not sure how to look at those data and interpret them as squandering of cash and those are pretty specific claims that I would hope could be articulated in a clear cause and effect way if they were true.
In the UK, spending on furthering their charitable purpose is expected to roughly match income over the medium term. There are carve-outs for specific types of "permanent endowment" (and even there, spending is meant to match the investment income) but it wouldn't cover anything like Mozilla's commercial agreement with Google.
https://assets.mozilla.net/annualreport/2024/mozilla-fdn-202...
If armageddon came and they no longer had their search revenue, they could cover 2 years of their operational costs. Many organizations have endowments that cover them for anywhere from 5 to 20 years. What I understand off the top of my head is that their major spending categories are software development, "operations" which is largely infrastructure to support that development, legal, and marketing.
I could see the case for not spending so much on marketing, but it would be organizational suicide to deficit-spend away their endowment (their one firewall against existential threats) on "engineering" without a credible road map to long-term sustainable income that's better than what they're already doing. In fact, if you catch the HN comment section on the right day, such behavior would probably be pointed to as yet another example of wasting money on unfocused side bets, because at the end of the day the mob truly can't decide what it wants.
And who knows, maybe this "spend it all down on engineering + ??? + profit" plan could work, but it would be extremely risky and would hinge on the details of the plan. But I don't feel like I'm hearing a plan so much as vibes. I would actually turn the tables on this whole interpretation and say that, given what they spend relative to their market share, they're actually punching above their weight compared to Google, and that this criticism of "hoarding" is not grounded in financial literacy.
They claim to be putting $220 M/year into software development, but can't sponsor Servo even at $1 M/year? I call bullshit.
I actually think you're right that they should have kept Servo, but that doesn't sustain the charge that it is smart not to have an endowment, or to spend down their endowment for no reason. Most of your questions are financial-literacy issues in response to standard non-profit disclosures rather than a legitimate critique of strategy.
Now, how come they burn $220+ M/year into software development and $0 on Servo?
Rust was the backbone of Quantum, their monumental overhaul and modernization of the Gecko engine. Mozilla is way ahead of competitors in shipping production Rust code. Google's engine is C++, and it is taking baby steps toward Rust in bits and pieces, but with lots of security sandboxing and careful interfacing with C++.
Firefox doesn't contain that much Rust, and Servo would have been the road there.
Mozilla is specifically not developing Servo, and thus mostly not getting the benefits you state. (The major exception being Stylo.)
https://www.mozilla.org/en-US/firefox/browsers/quantum/
>Firefox Quantum was a revolution in Firefox development. In 2017, we created a new, lightning fast browser that constantly improves. Firefox Quantum is the Firefox Browser.
Quantum took modernizations that were started as part of Servo but integrated them into Gecko, where they continue to live today. They include Stylo as you noted, but also WebRender, Quantum DOM, Quantum Compositor, Quantum Flow, as well as a host of rust libraries and tooling.
Firefox’s responsiveness, GPU acceleration, and memory-safety improvements are directly attributable to Servo's research being folded in.
So it is in fact getting the benefits I stated. In fact it demonstrates that Servo succeeded after all: the fruits of it were harvested, brought into Firefox and are a key demonstration of Mozilla punching above its weight with smart investments allowing them to be more efficient than Google.
Edit: The reply below is largely unresponsive, not acknowledging factual inaccuracies, and repeating points that have already been addressed. As I already explained, Firefox benefits from Servo to this day. The end of its support is regrettable but not a scandal nor even a demonstration of inefficient use of resources. Its benefits were rolled into the core browser which is maintained at a fraction of the cost of its competition.
And now Mozilla is sitting on that cool $1.3 B, burning $220-500 M/year on "software development" without telling anyone what it actually is doing with that money, and putting $0 into Servo.
It's not clear to me why that requires a sizeable team of developers - surely they'd be better off working for MoCo (the commercial subsidiary who make the browser and who provide a large portion of the MoFo's income)?
MoFo's activities are centred on philanthropy and advocacy. You'd expect most of their staff to be experts in things like community engagement, policy research and development, grant-making, campaign strategy, volunteer welfare, reporting & transparency, and management of investments.
Sure, there'll be some engineering needed to support that, but it shouldn't be their core focus.
[1] https://freebsdfoundation.org/about-us/about-the-foundation/
And that's the stated purpose. The observed current purpose of the system is to make a small handful of people more rich.
If you went back to the pre-2005 situation, in which MoFo was all there was, it would have at most low single-digit millions in the bank rather than a billion. The AOL dowry was only intended to last a couple of years, and there's simply no way it could have sustained development of the browser beyond that. The Phoenix would have been consumed by the flames, and we'd be left with a stagnant IE/Chrome duopoly.
1. most of the money comes from Google Search placement in the browser
2. most of the money is NOT used on the browser
Large nonprofits publish consolidated, high-level statements that group expenses by broad function, not by department or line item, because that's the correct level of financial reporting for external audiences.
If they misrepresented their spending, that would be flagged by the independent auditor. It's deeply irresponsible to accuse them of hiding something when you have no baseline concept of standard disclosures.
In theory, it feels like that ought not to change anything regarding the legal situation, but I bet it does.
Some are fair! But many aren't. But it may help temper MDS to put it all in one place.
Also, what's your issue with Firefox?
You should probably join https://servo.zulipchat.com and ask questions there to figure out where to start.
I've not been following the space, is this a different project with the same name?
If someone wants to put a marketing veneer on top of a new project that uses Servo, great! But Servo is Servo: a rendering engine.
Servo's CSS engine Stylo is also modular, and is shared with Firefox, which is part of how Servo managed not to fall completely behind in web standards support despite being all but abandoned for several years.
I'm building another browser engine, Blitz [0], which also uses Stylo, and we're building our layout/text engine in such a way that it can be reused, so future browser engines (at least ones written in Rust) shouldn't need to build either style or layout from scratch if they don't want to.
A few more infrastructure pieces like this and browser engine development starts to look more approachable.
Edit: see sister comment by the actual Dioxus guy, which is more accurate than mine!
For context, MMM was a browser that supported both browser addons and sandboxed applets, around 1995.
It pulls in Servo/Firefox's CSS engine Stylo (and Servo's HTML parser html5ever) and pairs it with our own layout engine (which we are implementing mostly as libraries: Taffy [0] for box-level layout and Parley [1] for text/inline layout) and DOM implementation. Rendering and networking are abstracted behind traits (with default implementations available) and you can drive it using your own event loop.
Minimal binary sizes are around 5 MB (although a more typical build would be more like 10-15 MB).
[0]: https://github.com/DioxusLabs/taffy [1]: https://github.com/linebender/parley
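To give a flavor of the "layout as a reusable library" idea, here is a small example written against Taffy's documented high-level API (a sketch; exact helper names vary between Taffy versions):

```rust
use taffy::prelude::*;

fn main() -> Result<(), taffy::TaffyError> {
    // A 400px-wide flex row containing one 100x100 child.
    let mut tree: TaffyTree<()> = TaffyTree::new();
    let child = tree.new_leaf(Style {
        size: Size { width: length(100.0), height: length(100.0) },
        ..Default::default()
    })?;
    let root = tree.new_with_children(
        Style {
            display: Display::Flex,
            size: Size { width: length(400.0), height: auto() },
            ..Default::default()
        },
        &[child],
    )?;
    // Solve the layout, then read back the child's computed box.
    tree.compute_layout(root, Size::MAX_CONTENT)?;
    println!("child box: {:?}", tree.layout(child)?);
    Ok(())
}
```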
We do have a couple of PoC examples of integrating with the Bevy game engine. Both of these use Dioxus Native, which wraps Blitz with Dioxus (which is a React-like framework but in Rust rather than JavaScript - https://github.com/DioxusLabs/dioxus), but you could do DOM tree creation and event handling manually if you wanted to.
- This first one includes Bevy inside a window set up by Dioxus Native (using a `<canvas>` element, similar to how you might on the web). Here the event loop is controlled by Dioxus Native, and the Bevy game is rendered to a texture which is then included in Blitz's scene. https://github.com/DioxusLabs/dioxus/tree/main/examples/10-i...
- This second one does it the other way around and embeds a Dioxus Native document inside a window set up by Bevy. Here the event loop is controlled by Bevy, and the Blitz document is rendered to a texture with which Bevy can then do whatever it likes (generally you might just render it on top of the game, but someone tried mapping it into 3D space: https://github.com/rectalogic/bevy_blitz) https://github.com/DioxusLabs/dioxus/tree/main/examples/10-i...
The latter is probably what I would recommend for game UI.
Both approaches probably need more work (and Blitz could do with more complete event handling support) before I would consider them "production ready".
> Embedding Servo into applications requires a stable and complete WebView API. While early work exists, it’s not yet ready for general use.
(While announcing that they got funded to fix that.)
https://www.igalia.com/2025/10/09/Igalia,-Servo,-and-the-Sov...
I am currently working on getting https://azul.rs/reftest ready, which uses some of the same underlying technologies as Servo (taffy-layout, WebRender) but uses no JavaScript and also has a C / Python API. Azul is basically that, except it's not usable yet.
Also, we're not using it in Blitz (although it could be added as a backend), but note that WebRender is maintained. See Servo's most recent 0.68 branch (https://github.com/servo/webrender/tree/0.68) and the ongoing upstream development in the Firefox repository: https://github.com/mozilla-firefox/firefox/tree/main/gfx/wr
Blitz:
- Custom renderer (Skia?) vs Azul's WebRender fork (to get rid of any C dependencies)
- Stylo (CSS parser) vs azul-css (to support compilation of CSS to const items)
- HarfRust (font shaping) vs allsorts (I used allsorts in printpdf too, so it fits)
- Skrifa (font parsing) vs allsorts again (simplifies things)
- Fontique (font selection) vs rust-fontconfig (a custom pure-Rust rewrite of fontconfig)
- Parley (line breaking) vs Azul's text3 engine
- All as separate projects vs Azul's monorepo style
Dioxus:
- RSX macros, data + functions coupled together vs Azul's "C function callbacks + HTML dataset" model
- Binary hot-patching vs Azul's dynamic-linking model
- Macros vs Azul's HTML/CSS-to-Rust/C compiler build tool (no macros)
- Funded by YC (not sure about the upsell?) vs funded by donations (once it's stable enough) and my Maps4Print cartography startup (dogfooding)
These things matter, even for small decisions. For example, Azul uses a custom CSS parser because its CSSProperty is a C-compatible enum, so that later on you can compile your entire CSS to a const fn and use CSS strings without even doing any allocations. So even at that level, there's a technological and architectural difference between Azul and Stylo.
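A toy illustration of that design (types invented here, far simpler than Azul's real ones): a #[repr(C)] property enum can be constructed in const context, so a stylesheet becomes compile-time data with no parser and no heap allocation at runtime.

```rust
// C-compatible property enum: usable from a C API and in `const` items.
#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub enum Display { Block, Flex }

#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub enum CssProperty {
    Display(Display),
    WidthPx(f32),
    HeightPx(f32),
}

// "Parsed" at compile time: just static data in the binary, no allocation.
pub const BUTTON_STYLE: &[CssProperty] = &[
    CssProperty::Display(Display::Flex),
    CssProperty::WidthPx(100.0),
];

fn main() {
    for prop in BUTTON_STYLE {
        println!("{prop:?}");
    }
}
```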
But the core point is more architectural: Azul's architecture is built to decouple the user data from the function callbacks, because I see this as the Achilles' heel that all GUI systems so far have failed at:
https://github.com/fschutt/azul/blob/master/doc/guide/02_App...
Dioxus, however, repeats this exact same pattern, and even the Elm architecture doesn't really fix it. I didn't finish the document, but basically there is (1) a "hierarchy of DOM elements" and (2) a "graph of UI data", and those two are not always the same - they can overlap, but the core assumption of many GUI toolkits is that (2) is a tree (it's a graph, really) and that (2) always follows the same hierarchy as (1), which is why GUI programming is a pain no matter the language or framework. Electron just makes the visual part easier; you still need React to deal with the pain of data-model/view sync.
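A tiny, purely illustrative Rust sketch of that mismatch: one piece of application state referenced from two positions in the widget tree, so the data forms a graph even though the view hierarchy is a tree.

```rust
use std::{cell::RefCell, rc::Rc};

struct Counter { value: RefCell<i32> }

// (1) The view hierarchy is a tree...
enum Node {
    Label(Rc<Counter>),   // displays the count
    Button(Rc<Counter>),  // increments the same count
    Container(Vec<Node>),
}

fn main() {
    // (2) ...but the UI data is a graph: one Counter, two referents.
    let shared = Rc::new(Counter { value: RefCell::new(0) });
    let dom = Node::Container(vec![
        Node::Label(shared.clone()),
        Node::Button(shared.clone()),
    ]);
    if let Node::Container(children) = &dom {
        if let Node::Button(c) = &children[1] {
            *c.value.borrow_mut() += 1; // a "click"
        }
        if let Node::Label(c) = &children[0] {
            println!("label shows {}", c.value.borrow()); // prints 1
        }
    }
}
```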
I can collaborate on the flex / grid solver ofc, but it's very hard to collaborate on anything else because the technologies used, the goals, the architecture, etc. are very different between Dioxus and Azul. Azul is more of a "monorepo NIH integrated solution" (because I often got bug reports in 2019 that I couldn't fix because I didn't own the underlying crate, so I had to wait for the maintainers to do another release, etc. - I learned from that).
As a note, the layout engine is also now heavily vibe-coded (sorry not sorry), so I don't take credit - but feel free to take inspiration or copy code. Gemini says the solver3 code is a "textbook implementation"; take that as you will. My idea was to build an "AI feedback loop" to semi-automatically put the HTML input, the debug messages (to see which code paths are hit), the source code, and the final display list into a loop to let the AI auto-debug the layout engine. So that part of writing the HTML engine isn't really hard, assuming the plan works out. The hardest parts are caching, scrolling, performance debugging, interactions between different systems, and especially supporting the C API. Layout is comparatively simple.
- You don't have to use Dioxus to use Blitz: you can do your own DOM construction and event handling with imperative Rust APIs.
- You don't have to use any of the provided renderers with blitz-dom (although our default renderer is Vello, which is also pure Rust), and it would be possible to hook it up to WebRender.
- We have a lot of the tricky incremental layout and caching logic implemented (although there are still bugs).
- Blitz has grant funding through NLnet as well as funding from DioxusLabs, and is fully open source under permissive licenses (MIT/Apache 2.0) that don't really allow for "rug pulling".
---
That being said, the designs around CSS do sound quite different: we have chosen to take on a relatively heavy dependency in Stylo; we don't support non-tree-like structures; and in general, if you wish to do your own thing, then that is what you ought to do!
Not sure that I agree that layout is simple (I have spent many long hours debugging the nuances of CSS layout over the past months), and I'm a little skeptical that an AI-based approach will work out. But I wish you luck!
Then I read this on their repo:
>Servo aims to empower developers with a lightweight, high-performance alternative for embedding web technologies in applications.
Um... what? Are they just saying it's a browser in a verbose way, or what? It just seems like you could replace literally all those words with "browser" and the clarity would skyrocket. Although perhaps it's not actually just a browser and I don't understand.
Servo is currently more of the latter than the former, as its UI is a pretty minimal one that is mostly useful for testing and doesn't have many of the niceties that users expect of a modern browser (bookmarks, history, password manager, etc.).
I do agree that it's confusing for most people though.
"The Missing Protocol: Let Me Know" https://news.ycombinator.com/item?id=44881287
Such a thing could be implemented with RSS on a longer timescale or ntfy.sh on a shorter one, but AFAIK most projects don't.
Only more recently has the plan emerged to release a full browser based on Servo.
I shipped my first commercial website in 2001. I have PTSD from times when you had to basically do at least twice the work to ensure that your page worked - not just looked, but worked - on multiple browsers over multiple systems.
I recall a postmortem from a project for a major telecom website which took pride in being accessible "everywhere". They had a matrix of systems and browsers, something like 10x10, and they described how painful it was to check all the boxes.
I remember a tiny JS library I wrote at Wikia in 2006 for deep linking. It started as ~10 lines implementing the documented behavior, but by the time I had all major browsers covered it had grown to 500 lines.
I also have very fond memories of Flash. Resolution aside, when I had my project working on my machine I had 100% certainty that it would work exactly the same on every client, including mobile, kiosks, and desktop deployments.
Times were different, BigCos lived in siloes, ES4 went to shit and we needed another 15 years to reinvent the wheel with TS, but today everyone seems to be on the same team and browsers are shockingly compatible.
We don't. But we could use a memory-safe one, so that a random website can't steal your credit card info. Unfortunately, Servo is the only one that's actively developed and written in a memory-safe language; the rest are all C++.
It was anything but. For a decade, Macromedia had a monopoly on web standards with their player, and it was marvelous.
Can you imagine a simple straight implementation and knowing it will work across all of your clients today?