

Thanks for the extra info. I actually hadn’t considered the Wayland stuff.
I will never understand why it’s better to emulate Windows to play a game on Linux instead of just using the Linux version.
If the Linux version is 1.01 and the Windows one is 2.5, I get it, but normally that’s not the case with GOG.
I really wish MS had put these people in charge of shit. - https://www.youtube.com/watch?v=pFQWc79TYcU
Absolutely true. Unfortunately for me, I usually purchase games later in their lifespan, after most bugs are worked out. It’s not like it goes F2P a year after I purchase it, but I guess I am just complaining.
If more people get to play it and the game is given new life, it’s a good thing.
I get free-to-play; what frustrates me is when I purchase a game and then it goes free-to-play.
Similar to the others, although I have messed with Ubuntu, CentOS, Fedora, and even a few others for a day or two each.
At the moment I am using Fedora. My drives are RAIDed, and my main storage holds all the data and the Docker config directories.
I am using Docker for everything, Watchtower for updates, and Portainer to manage the containers with a GUI. All the containers point at /mnt/drive/allMyData. In there are my data folders: shows, movies, Plex configs for recording over the air, ebooks, documents, etc.
Mainly I set it up this way so I can easily change distros if I wanted to and have all my services back up in an hour or so.
I started a text file that contains the command lines I have used to start all of my Docker containers. This way, if I need to, I reference it and rerun the exact same commands, with volumes mapped to the same folders (see the example below). Now I am back up and running in a few clicks. There is no need to back up the container itself if all of its data lives in folders under my main data directory.
However, I am running a separate hardware RAID set up before the OS. This way all my data stays safe as a separate volume.
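To give a rough idea, here is the kind of entry that file holds. This is only an illustrative sketch using the linuxserver Plex image; the container name, port, and folder names are placeholders for whatever your own setup uses:

    # example entry from the text file (image and paths are illustrative)
    docker run -d \
      --name plex \
      --restart unless-stopped \
      -p 32400:32400 \
      -v /mnt/drive/allMyData/plexConfig:/config \
      -v /mnt/drive/allMyData/shows:/tv \
      -v /mnt/drive/allMyData/movies:/movies \
      lscr.io/linuxserver/plex

Because every -v flag points into /mnt/drive/allMyData, rerunning the same command on a fresh distro brings the container back with all of its data intact.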
Ahh, well I have found the Jellyfin clients to be less than perfect on Apple TV. You may want to try another client if you don’t want to try different server-side software.
Since you are against using Plex, even only locally, I would suggest you look into the Jellyfin clients’ support. Specifically, look at each client device’s documentation and see which audio and video codecs they natively support. If there is one they both support, then I suggest you do some testing and stream a movie or show that is encoded in that specific format. This will tell you whether it works across both clients, and whether any problem is a streaming issue, a hardware issue, or a Wi-Fi issue.
Playing a natively supported format will prevent transcoding, which can cause audio sync issues.
Edit - if that works, then I suggest you convert your library to that format.
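If you want to check what a given file is actually encoded with before testing, something like this works (assuming you have ffprobe from FFmpeg installed; the filename is just an example):

    # list the video/audio codecs used by each stream in a file
    ffprobe -v error -show_entries stream=codec_type,codec_name \
      -of default=noprint_wrappers=1 Some.Show.S01E01.mkv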
You totally get it: syncing and sharing without paying. I encourage you to take a look at it. It is super easy to set up with Docker and a front-end proxy.
I specifically use Vaultwarden. It is great for syncing and sharing across the family.
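For reference, a minimal sketch of how it can be started (the host folder and port are placeholders, and you would still want a reverse proxy with HTTPS in front of it):

    # minimal Vaultwarden container; proxy/TLS not shown
    docker run -d \
      --name vaultwarden \
      --restart unless-stopped \
      -v /mnt/drive/allMyData/vaultwarden:/data \
      -p 8081:80 \
      vaultwarden/server:latest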
Kodi has the IR stuff built in, from what I remember. At least on Windows it did years ago. You just need a “media remote”, at one point called a Windows Media Center remote. It’s a regular IR remote with a USB receiver.
After a quick Google, here is a list I found.
There may be a much better way to do this, but I use folder binding instead of volumes. What I usually do is map an extra folder structure into both Sonarr and whatever DL client I am using.
So, for example, I use something like /mnt/docker/download. This is mapped in the containers at the same path as on my system.
I add this extra line to all containers that need access to the downloaded files. Then in my download client I change the default download directory from /data to /mnt/docker/download, and in Sonarr/Radarr I set that as the download directory. It becomes a directory they all have access to and can use without errors or extra complex options in Docker (see the sketch below).
It is less secure as a production practice, but this is essentially a temp folder that will only ever have 1-3 files in it before they are processed.
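Roughly, the idea looks like this (the images are just examples; ports and config volumes are omitted to keep the sketch short):

    # the same host folder, mounted at the same path in both containers
    docker run -d --name qbittorrent \
      -v /mnt/docker/download:/mnt/docker/download \
      lscr.io/linuxserver/qbittorrent

    docker run -d --name sonarr \
      -v /mnt/docker/download:/mnt/docker/download \
      lscr.io/linuxserver/sonarr

Because the path is identical inside and outside the containers, the path Sonarr sees is the same one the download client reports, which avoids the usual remote-path-mapping headaches.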
Good point. Sorry I wasn’t clear. I only meant that they couldn’t branch out and develop other games like they wanted to. Essentially I was making the same point that you are in terms of what they HAD to develop. For MS it was Halo; for Sony it’s Destiny. In either place they did not have the option to make new IP. At least maybe not until now, but that looks more like forced development.
I agree with your sentiment, though, that Sony is pushing them more in house.
I would generally agree with this statement, but I think in this particular case it may be a bad thing. From what I understood of the article, they are taking a portion of the Bungie dev team and spinning it off to be part of the Sony game devs. I have a feeling the management team being taken out had been a big source of pushback against that.
Bungie wanted to leave Microsoft because they wanted to do new things, not just Halo. Unfortunately it turned out Activision screwed them in terms of their development and made them cut 70% of the D1 story with less than a year to release.
After all that, they were eventually bought by Sony, which is now doing almost the same thing MS was doing to them.
Don’t get me wrong, I am glad it’s more managers and corporate office than devs, but I have a bad feeling that this is just the start of a bad turn of events for them.
It also broke Windows the other day…
😂
phpMyAdmin.
You can also deploy it easily with Docker to get it running quickly.
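Something like this is a reasonable starting point (the network and database container name are placeholders; it assumes you already have a MySQL/MariaDB container it can reach):

    # minimal phpMyAdmin container pointed at a database container named "db"
    docker run -d \
      --name phpmyadmin \
      --network mynet \
      -e PMA_HOST=db \
      -p 8080:80 \
      phpmyadmin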
I am sitting here going, “WTF is this Doc talking about? It was a Bluesky post.”
Then I saw the one and only comment regarding servers.
YEP