I keep trying Xilem and then egui or Iced. Xilem needs more widgets out of the box to be easy to build with. Slint is another option. I wonder what cross-platform GUI framework (from any language) will finally become as common as Electron-based apps or the vast number of native OS apps on Windows, macOS, or Linux.
I keep going back to Tauri, which is practical for building desktop apps quickly but still uses HTML, CSS, and JS to build the UI. You can use Rust web UI tools, but then it is still (system) browser-based.
I gave it a decent shot, and I wanted to like Slint, but I don't.
It's not a Rust UI system; it's a declarative UI language that happens to have a Rust binding via macros, so you can write the custom DSL.
It also has bindings for other languages.
It feels like a bunch of QtQuick people got together and implemented a new runtime for QtQuick. That might be the direction Qt has gone, at the expense of their C++ QtWidgets system, but it just feels… "Electron app" to me.
If I wanted an electron app, I would just use electron.
If I wanted a non-native ui look and feel, I would use flutter.
Slint has nothing to do with Electron.
There is no browser under the hood so it is much more lightweight.
Slint has native looking widgets.
The Slint DSL gets compiled to native code: no need for a runtime interpreter like in QtQuick.
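For readers who haven't seen it, a Slint UI is written in its own declarative markup (a .slint file, or embedded inline in Rust via the slint! macro) and compiled ahead of time. A minimal sketch of what that markup looks like:

```slint
// A minimal .slint component; the Slint compiler turns this into
// native code at build time rather than interpreting it at runtime.
export component Hello inherits Window {
    Text {
        text: "Hello from Slint";
    }
}
```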
Something like GPUI, probably. I would be quite happy with it if it weren't so tied to and restricted by the Zed team (they reject PRs that aren't strictly related to Zed); there's even a mobile fork.
Dioxus native would be second, but it's far far far away from being ready.
Not possible to do it via a component. I've seen various rejected requests, like custom user shaders, or some silly basics like text justify (which is implemented in the cosmic-text crate used by GPUI). That's why it was forked.
The only things I don't like about this are a) your binaries will become huge and very slow to compile with optimizations enabled, and b) its dependence on GPUI and GPUI's unsure future direction (I'd like to see GPUI thrive outside of Zed, but the maintainers have clarified that that's not the goal for now).
This is so cool. I have been in software for about 18 years, but in the last few years I grew tired of city life. My health was already affected by a sedentary lifestyle: high blood glucose for many years.
I have been living in villages for about 5 years. I started a pig farm a month back. I have 16 piglets now. I still write software on a daily basis, a mix of client projects and own products. The pig farm needs about 2 hours of cleaning each day. I take care of cleaning. My business partner takes care of feeding.
I plan to grow the pig farm to a capacity of 100 pigs. It is a profitable business, with roughly a 30% return every 6-7 months. We give the pigs a lot more space and care than I have ever seen in any of those factory-style livestock business videos. With 100 pigs, I will perhaps spend 5 hours a day on cleaning work, with more tools and employing a couple of local folks.
Feel free to check out (links in my bio) or reach out if anyone wants to come and try this out in our little village in north eastern India. The village has large farms, growing all sorts of things.
Thanks! How do you earn or keep yourself afloat? I really like what you guys are doing, and similar orgs. I am personally doing the same, full-time, but I am worried about when I will run out of personal savings.
I've been wondering this since they started it, mostly out of concern that they stay afloat. Since Daniel does the work of ten, their value:cost ratio seems world-class at the very least.
With the studio release, it seems like they could be on the path to just bootstrapping a unicorn, or a 10x corn or whatever that's called, which is super interesting. Anyway, his refusal to go into details reassures me; it sounds like things are fine, and they're shipping. Go with God.
Daniel is a very impressive guy. Well within the realm of “fund the people not the idea” that YC seems to do. Got a few bucks from them and probably earning from collaborations etc. Odds of them not figuring out a business model seem slim.
Companies have no idea what they are doing: they know they need it, they know they want it, engineers want it, and they don't have it in their ecosystem, so this is a perfect opportunity to come in with a professional-services play. We've got you on inference training/running, your models, all that; just focus on your business. Pair that with Hugging Face's storage and it's a win/win.
I am interested in MetalRT. I am an indie builder, focused mostly on building products with LLM assistance that run locally. Like: https://github.com/brainless/dwata
I would be interested if MetalRT can be used by other products. Do you have plans for open-source releases?
Yes, that's the plan. MetalRT will ship as part of the RunAnywhere SDK so other developers can integrate it into their own apps. We're working on making that available. If you want to be in the early access group, drop me a line at founder@runanywhere.ai or open an issue on the RCLI repo. Happy to look at your project.
Hey Reiss, I just checked Synthetic. So nice to see indie providers for smaller LLMs. I am personally building products to run only with small (actually < 20B) models. My aim is laptop usage. Would love to know what plans you have for models smaller than the ones you currently offer. Industrial use is all about smaller models, IMHO.
Local models, particularly the new ones, would be really useful in many situations. They are not for general chat, but if tools use them in specific agents, the results are awesome.
I built https://github.com/brainless/dwata to submit to the Google Gemini Hackathon, and focused on an agent that would run regexes over email content to extract financial data. I used Gemini 3 Flash.
After submitting to the contest, I kept working on branch: reverse-template-based-financial-data-extraction to use Ministral 3:3b. I moved away from regex detection to a reverse template generation. Like Jinja2 syntax but in reverse, from the source email.
Financial data extraction now works OK-ish, and I am constantly improving it, aiming for a launch soon. I will try Qwen 3.5 Small, maybe the 4B model. Both Ministral 3:3b and Qwen 3.5 Small:4b will fit on the smallest Mac Mini M4 or an RTX 3060 6GB (I have these devices). dwata should be able to process all sorts of financial data, transactions and metadata (vendor, reference #), at a pretty nice speed. Keep it running a couple of hours and you can go through 20K or 30K emails. All local!
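The reverse-template idea above can be sketched in a few lines of std-only Rust: a Jinja2-style template of the known email layout is matched against the source text, and the placeholder values are captured. The function name, placeholder syntax, and example strings here are illustrative assumptions, not dwata's actual implementation.

```rust
/// Match `template` (literal text with `{{ name }}` placeholders) against
/// `input`, returning the captured (name, value) pairs in order.
fn extract(template: &str, input: &str) -> Option<Vec<(String, String)>> {
    let mut fields = Vec::new();
    let mut tpl = template;
    let mut rest = input;
    loop {
        let Some(open) = tpl.find("{{") else {
            // No placeholders left: the input must continue with the
            // trailing literal for the match to succeed.
            return rest.starts_with(tpl).then_some(fields);
        };
        // The literal chunk before the placeholder anchors our position.
        let literal = &tpl[..open];
        if !rest.starts_with(literal) {
            return None;
        }
        rest = &rest[literal.len()..];
        let close = tpl[open..].find("}}")? + open;
        let name = tpl[open + 2..close].trim().to_string();
        tpl = &tpl[close + 2..];
        // The captured value runs until the next literal chunk reappears.
        let next_literal = &tpl[..tpl.find("{{").unwrap_or(tpl.len())];
        let value_end = if next_literal.is_empty() {
            rest.len()
        } else {
            rest.find(next_literal)?
        };
        fields.push((name, rest[..value_end].to_string()));
        rest = &rest[value_end..];
    }
}

fn main() {
    let template = "Paid {{ amount }} to {{ vendor }}, ref {{ reference }}.";
    let email = "Paid $42.50 to ACME Corp, ref INV-0042.";
    for (name, value) in extract(template, email).unwrap() {
        println!("{name} = {value}");
    }
}
```

The appeal over hand-written regexes is that the small model only has to produce a template that mirrors the email it just saw, and a mismatch simply returns `None` instead of silently capturing the wrong span.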
New GPUs come out all the time. New phones come out (if you count all the manufacturers) all the time. We do not need to always buy the new one.
Current open weight models < 20B are already capable of being useful. With even 1K tokens/second, they would change what it means to interact with them or for models to interact with the computer.
hm yeah I guess if they stick to shitty models it works out. I was talking about the models people use to actually do things, instead of shitposting from openclaw and getting reminders about their next dentist appointment.
Considering that enamel regrowth is still experimental (only curodont exists as a commercial product), those dentist appointments are probably the most important routine healthcare appointments in your life. Pick something that is actually useless.
The trick with small models is what you ask them to do. I am working on a data extraction app (from emails and files) that works entirely locally. I applied for the Taalas API because it would be an awesome fit.
dwata: Entirely Local Financial Data Extraction from Emails Using Ministral 3 3B with Ollama: https://youtu.be/LVT-jYlvM18
If we can print ASICs at low cost, this will change how we work with models.
Models would be available as USB plug-in devices. A dense < 20B model may be the best assistant we need for personal use. It is like graphics cards again.
I hope lots of vendors will take note. Open weight models are abundant now. Even at a few thousand tokens/second, low buying cost and low operating cost, this is massive.