I think you’d have to modify the edid, since you’re setting a custom refresh rate, not a hidden one.
I’ve used wxEDID to force enable VRR before.
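For reference, a quick way to dump the current EDID so it can be opened in wxEDID (the card/connector name here is just an example, check /sys/class/drm/ for yours):

```
# Dump the raw EDID from the connector's sysfs node
cat /sys/class/drm/card0-DP-1/edid > edid.bin
# Sanity check it before and after editing (edid-decode is usually its own package)
edid-decode edid.bin
```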
Well, aren’t you glad they’re removing go-git then!
I swear Lemmy Comments for YouTube had a feature that let you open it for any page, but it seems the GitHub and Firefox pages have been deleted.
Edit: Looks like I’ve still got a fork: https://github.com/Steve-Tech/Reddit-Comments-for-YouTube (it says Reddit, but works for Lemmy too)
Does it also restore the content of unsaved files of the application?
That’s up to the application.
If not, I’d prefer `systemctl hibernate`. I wonder what this new feature is for.
I believe this is for storing the position of specific windows, for multi-window applications (e.g. GIMP’s multi-window mode). So hibernation is very unrelated.
I’ve had the same experience, you’re much better off RDPing into the VM. But I’d like to know if anyone has a better solution that doesn’t require an extra GPU.
On Asus motherboards you can enable ‘Memory Context Restore’, and it’ll remember the training. Unfortunately it seems rapid changes in the weather make my system unstable with it on.
can’t move services as every other service sucks
What are your requirements?
I use Tidal and I know High/Max quality works in the web UI; it just needs Widevine support.
If they use AMD, that’s better on Linux; they don’t need to know what a GPU driver is.
Same goes for Intel, unless they need to use oneAPI.
That looks to be Volcanic Islands, which has good support with `amdgpu` and no support by `radeon`, according to Wikipedia.
I’m not sure what you meant by “set up radron kernel driver”, but you could maybe try blacklisting it.
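If you do go the blacklisting route, something like this should do it (the file name is arbitrary, and the initramfs rebuild is the Debian/Ubuntu/Mint command):

```
# Stop the radeon module from loading, so amdgpu can bind to the card instead
echo "blacklist radeon" | sudo tee /etc/modprobe.d/blacklist-radeon.conf
# Rebuild the initramfs so the blacklist also applies at early boot
sudo update-initramfs -u
```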
I believe if your swap partition is on an encrypted LVM, you can still hibernate with kernel lockdown enabled.
Maybe, but also I think I was looking at the raw ‘data bits’, not ‘binary’ data. It’s actually almost exactly 4 GiB, and that’s after dropping down to minimum error correction (1.7 GiB otherwise).
(1454942 × 2953) ÷ 1024 ÷ 1024 ÷ 1024 ≈ 4.00
Edit: So if alphanumeric mode could store lowercase letters, base64 would’ve stored more.
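If anyone wants to check the maths, here it is as a bc one-liner (1,454,942 codes × 2,953 bytes, the version 40-L byte mode capacity):

```
echo "scale=2; (1454942 * 2953) / 1024^3" | bc
# prints 4.00 (GiB)
```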
For those wondering, when using the biggest QR code with the maximum error correction (10,208 bytes), 1,454,942 QR codes is slightly less than 14 GiB, which should be more than enough for a Windows ISO.
My math: (1454942 × 10208) ÷ 1024 ÷ 1024 ÷ 1024 ≈ 13.83
Edit: Damn another guy beat me to it, now I wonder how I’m so far off.
Satellite imagery seems cheaper than you might think though. I’ve had SkyFi in my favourites for a while after they sponsored a YouTube video, and they seem to start at $8 per km² for a new photo or $2.50 for a previously taken one.
To their partners*. Which I believe are companies that help out with support or something.
Along with VRR over HDMI not being well supported, sometimes the monitor’s own EDID is a little buggy and Linux can’t guarantee VRR will work properly.
I wrote a blog post a while ago on fixing EDIDs, but it was pretty much a guessing game on what to change: https://stevetech.me/posts/force-enable-vrr-edid
I’ve had to do that with both Samsung and MSI monitors so far. If you’d like to post your EDID, I could check it myself with what I know.
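For anyone who ends up with a fixed EDID binary, one way to apply it is the kernel’s EDID firmware override (the paths and connector name below are just examples, and you may need to regenerate your initramfs so the file is available at boot):

```
# Put the fixed EDID where the kernel firmware loader can find it
sudo mkdir -p /usr/lib/firmware/edid
sudo cp fixed_edid.bin /usr/lib/firmware/edid/
# Then add a kernel parameter matching your connector, e.g.
#   drm.edid_firmware=HDMI-A-1:edid/fixed_edid.bin
```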
Epic!
I’ve never seen that on modern AMD stuff that uses radv, but I’m sure it’s probably fine.
Oh whoops yeah there is, run `sudo update-grub`.
But otherwise that config looks correct.
Cool, you’re going to have to enable Sea Islands (CIK) support for amdgpu. You should just have to add `radeon.cik_support=0 amdgpu.cik_support=1` to your kernel parameters. You’re probably using GRUB, so to do that you’ll need to run `sudo nano /etc/default/grub` to edit its config file, then add the above to the end of `GRUB_CMDLINE_LINUX_DEFAULT` (keep it inside the quotes, but space separated from the previous parameter). Then reboot and hopefully Vulkan works!
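For example, the edited line might end up looking like this (‘quiet splash’ is just a common default, keep whatever is already there):

```
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.cik_support=0 amdgpu.cik_support=1"
```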
Alternatively, there’s a section on the Arch Wiki for this, it should work fine for Mint too: https://wiki.archlinux.org/title/AMDGPU
Not exactly the same, but an electron beam puts a lot of noise in the image: https://youtu.be/Uf4Ux4SlyT4
Also I’ve heard the International Space Station gets a lot of dead pixels on its cameras from cosmic radiation.