It’s much more powerful though. Based on Org-Mode.
Seems pretty cool! I have to try it out. Thanks for sharing.
Mastodon is simply a different thing than X/Bluesky. It’s more like RSS/Blog/IRC. It will never go mainstream unless they add (opt-out) algorithms and better search functionality. But maybe that’s just not worth it. Mastodon has already lost to Bluesky when it comes to being an open mainstream Twitter replacement.
I’m curious whether it’s even technically possible to build something federated that feels like a Twitter replacement, using the ActivityPub protocol.
Not much. Except for less competent leadership short term. And probably more forks long term.
My guess is that Threads’ implementation and Meta’s capital are necessary if ActivityPub and the fediverse are ever to go mainstream. Mastodon just sucks too much, and Threads might introduce some healthy competition that will help Mastodon survive long term.
I have no ideological problem with this. I believe that for-profit and non-profit complement each other. I’m a Social Democrat after all.
And I’m really annoyed by Mastodon instances that block Threads content. That’s just extremely stupid from a fediverse-strategy and content-quality perspective.
Yeah, it pretty much sucks for mainstream microblogging. Good as an RSS replacement though.
People here need to realize that 90% of microbloggers don’t give a fuck about decentralization or FOSS. They want something that works and doesn’t force them into a ketamine-fueled nazi oligarchy delirium. Mastodon doesn’t work for normal people. It kind of works if you’re a FOSS nerd or some kind of fediverse idealist. (It works for me, because it doesn’t drag me into endless flame wars and I’m almost only following FOSS accounts.)
My experience with Lemmy is that it is much more functional as a “Reddit replacement”. There are of course very few users, but it feels active and engaging (for better or worse). So in theory, maybe it could be a replacement.
But Mastodon has never been a “Twitter replacement”. It feels more like a fancy RSS client. Search, feeds and interactions just don’t work very well.
Because Mastodon basically sucks unless you know what you’re doing?
Howdy! I’m not a total noob when it comes to general compute and AI. I’ve been using online models for some time, but I’ve never tried to run one locally.
I’m thinking about buying a new computer for gaming and for running/testing/developing LLMs (not training, only inference and in-context learning). My understanding is that ROCm is becoming decent (and I also hate Nvidia), so I’m thinking that a Radeon RX 7900 XTX might be a good start. If I buy the right motherboard, I should be able to put another XTX in there as well later, if I use watercooling.
So first, what do you think about this? Are the 24 gigs of VRAM worth the extra bucks? Or should I just go for a mid-range GPU?
I’m also curious about experimenting with a no-GPU setup, i.e. CPU + lots of RAM. What kind of models do you think I’ll be able to run, with decent performance, if I have something like a Ryzen 7 9800X3D and 128/256 GB of DDR5? How does it compare to the Radeon RX 7900 XTX? Is it possible to utilize both CPU and GPU when running inference with a single model, or is it either/or?
Also… Wouldn’t it be better if noobs posted questions in the main thread, instead of this one? Then the questions would probably reach more people. It’s not like there is that much activity here…