Have you tried Penny’s Big Breakaway yet? Been eyeing it and based on your list here, seems right up your alley.
If you don’t need to host it but can run locally, GPT4All is nice: it has several models to download and plug and play with, each described with its intended purpose, and it doesn’t require a GPU.
For Ableton, you can run it in Wine and it can work well enough to get things done. It’s an OK experience at best and flat out doesn’t work at worst. Kiss your VST plugins goodbye with that though; you’re stuck with the built-in devices, which do all work when the app itself is working.
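If you want to try it, the rough steps are just a dedicated Wine prefix plus the installer; a minimal sketch, assuming you’ve downloaded the installer (the prefix path and installer filename here are placeholders):

# Create a dedicated prefix so Ableton doesn't pollute your default one
WINEPREFIX=~/.wine-ableton winecfg

# Run the installer inside that prefix (filename is whatever you downloaded)
WINEPREFIX=~/.wine-ableton wine ~/Downloads/ableton_live_installer.exe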
Otherwise, check out Bitwig Studio, made by ex-Ableton devs, which runs natively on Linux. Still gonna be hit or miss on third-party plugins, but the app is on par with Ableton as an experience. The price is in the same range too. Best short explainer: Ableton meets Logic in terms of usability.
While generally true, there’s a lot of weird custom wireless communication out there. Plenty of mice and keyboards refuse to speak the standard HID protocol, which leads many to not work with enterprise-type devices and appliances. Anything with a HID/console port for management (like some KVMs) will just not respond properly to key presses, even if a downstream USB host can detect the presses fine. This is extremely nuanced and not at all the same problem as Logitech G HUB being Windows-only, which merely makes customizing the buttons/RGB on a mouse or keyboard a questionable adventure for normal users.
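If you’re curious whether a specific keyboard or mouse actually exposes a standard HID boot interface on Linux, lsusb will tell you; a quick sketch (the vendor:product ID here is just an example, swap in your device’s from plain lsusb output):

# Class 3 = HID; SubClass 1 + Protocol 1/2 = standard boot keyboard/mouse
lsusb -v -d 046d:c52b 2>/dev/null | grep -iE 'bInterface(Class|SubClass|Protocol)'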
It’s the Linux version of Steam taking advantage of idle time to process shaders. It’s a critical part of making all those Proton-launched games work right. I wish it had better controls for when to run, but it is what it is.
I disagree; Sony’s feature is fundamentally different. It allows developers to create quick-load shortcuts for different activities in a given game, like having multiple open-world objectives/missions where you can choose which one to continue from the system menu. Microsoft’s feature is all about resuming directly where you were, as if you’d put the system to sleep with that game running.
Yeah, the box shows up as a monitor in the system display settings; you can even enable it and use it like a normal display. The headset will do the spatial tracking, and you can recenter with the headset button. It’s just small and low resolution, so you can’t even use it for productivity. And until the app works, no games at all.
Index works mostly fine. Sometimes it drops out but my Bluetooth stack hasn’t been the most stable on this install. Arch btw.
I did grab the PSVR2 PC adapter box, and it does work to get a display showing in the headset as another monitor, which is pretty sweet. But the PSVR2 app on Steam just straight up doesn’t work in any compatibility mode I’ve been able to try, so it’s no dice there.
That’s a different feature. Resume Activity is just a developer-dependent shortcut that’s integrated into the system menu. Quick Resume is saving a snapshot of RAM to disk and loading it up as needed per application. Different goals entirely.
My biggest concern with the Steam Deck was that it would become a 1-2 year upgrade-cycle device. I don’t expect the hardware to last 7+ years like normal console life cycles, but I’m very glad to hear they’re being patient and aggressively supporting the software side.
Borg Backup is the gold standard, with Vorta as a very nice GUI on machines that need one. Otherwise, all my other Linux machines run on Proxmox hypervisors and get regular container/snapshot/VM backups through Proxmox Backup Server to another machine. All the backup data is then replicated regularly to a remote site via TrueNAS SCALE replication tasks.
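If you haven’t tried Borg, the day-to-day is only a couple of commands; a minimal sketch, assuming an SSH-reachable NAS (the host, user, and repo path are placeholders):

# One time: create an encrypted, deduplicating repo on the backup target
borg init --encryption=repokey ssh://backup@nas/tank/borg/desktop

# Each run: archive named after host and timestamp; dedup keeps these cheap
borg create --stats ssh://backup@nas/tank/borg/desktop::'{hostname}-{now}' ~/Documents ~/Projects

# Thin out old archives on a schedule
borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=6 ssh://backup@nas/tank/borg/desktop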
Can you show where they’ve gone further than Apple’s Game Porting Toolkit or game translation layers? Genuinely curious, because I haven’t seen any comparison, but I do know several high-profile games have come to Apple Silicon recently.
Isn’t UTC meant to be… you know, universal?
… nobody can convince me otherwise
Ignorance isn’t something to be proud of.
So far it’s fine. Not much of a difference on the surface, except Floatplane videos in Firefox have distorted audio now after the update. Might be unrelated, but it started directly after updating. Oh, and my Application Menu crosses into the monitor to the left of my primary screen, which is a bit annoying. Nothing showstopping here.
For clarity, the recommendation is specifically 3 copies of your data, not 3 backups.
3-2-1 backup: 3 copies of the data, 2 types of storage devices, 1 off-site storage location.
So in a typical homelab case, your primary hot data lives on the device actually used to create and manage it: your desktop. You’d regularly back that data up to warm storage, such as a NAS with redundancy (RAID-Z1, Z2, etc.). That’s followed by backups on a regular but slower cadence to a remote location, such as a duplicate NAS over a secure tunnel, or even external drive(s) sitting at a friend or family member’s house, a bank vault, wherever. That would be considered cold storage (and it should be automated as such if it’s constantly powered).
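As a rough sketch of those tiers (the hostnames, dataset names, and schedules are all made-up examples, assuming ZFS on the NAS):

# Hot -> warm: nightly push from the desktop to the NAS
rsync -a --delete ~/data/ backup@nas:/tank/desktop/

# Warm -> cold: snapshot the NAS dataset and replicate it to the off-site box
zfs snapshot tank/desktop@$(date +%F)
zfs send tank/desktop@$(date +%F) | ssh backup@offsite zfs recv -F tank/desktop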
My own addition to this is that at least one of the hot/warm devices should be on battery backup in case of power events. I’ll always advocate for that being the primary machine, but in a homelab the server would be more important, and the NAS would be part of that stack.
Cloud is not considered a backup unless the data owner is also the storage owner, for general reliability reasons related to control over the system and storage. Cloud is, however, a reasonable temporary storage for moves and transfers.
Nice! I’ve been loving Focus modes on iOS; they’re a real game changer when coupled with geofencing, calendar events, and other automation. Glad more people will be able to utilize this!
I self-host services as much as possible for multiple reasons: learning, staying current with so many technologies through hands-on experience, and security / peace of mind. Knowing my 3-2-1 backup solution covers my entire infrastructure makes me feel far less pressured to hand my data to unknown entities, no matter how trustworthy, and gives me the peace of mind of controlling every step of the process and knowing how to troubleshoot and fix problems. I’m not an expert and rely heavily on online resources to get me to a comfortable spot, but I also don’t feel helpless when something breaks.
If the choice is between trusting an encrypted backup of all my sensitive passwords, passkeys, and recovery information to someone else’s server, or having to restore a machine, container, VM, etc. from a backup after a critical failure, I’ll choose the second, because no matter how well encrypted something is, someone somewhere will be able to break it with enough time. I don’t care if cracking it, even with accelerated or quantum hardware, would take millennia. Not having that payload out in the wild at all is the only way to prevent it from being cracked.
What features do you actually need from your host? If it’s just remote syncing, why not just stand up a small Debian system and install git on it? You can manage security on the box itself. Do you need the overhead of GitLab at all?
I say this because I did try out hosting my own GitLab, Gitea, Gogs, etc., and I found I never needed any of the features. The whole point was to have a single remote that can be backed up and redeployed easily in disaster situations; otherwise, all my local work just needed simple tracking. I wrote a couple of scripts so my local machine can create new repos remotely, and I also set up an SSH key on the remote machine.
I don’t have a complicated setup; maybe you do, not sure. But I didn’t need the integrated features and overhead for solo self-hosting.
For example, one of my local machine scripts just executes a couple of commands on the remote to create a new folder, cd into it, and then run
git init --bare
then I can just clone the new project folder on the local machine and get started.
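The whole thing is roughly this; a sketch, assuming my setup (the “gitbox” host, “git” user, and ~/repos path are placeholders):

#!/bin/sh
# new-repo.sh: create a bare repo on the remote, then clone it locally
# Usage: ./new-repo.sh myproject
set -eu
REPO="$1"

# Make the folder on the remote and initialize a bare repo inside it
ssh git@gitbox "mkdir -p ~/repos/$REPO.git && git -C ~/repos/$REPO.git init --bare"

# Clone the new project locally and get started
git clone "git@gitbox:repos/$REPO.git"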