Probably because there’s also permission to use the X11 socket.
I think you’d have to modify the EDID, since you’re setting a custom refresh rate, not a hidden one.
I’ve used wxEDID to force enable VRR before.
Well, aren’t you glad they’re removing go-git, then!
Does it also restore the content of unsaved files of the application?
That’s up to the application.
If not, I’d prefer systemctl hibernate. I wonder what this new feature is for.
I believe this is for storing the position of specific windows, for multi-window applications (e.g. GIMP’s multi-window mode), so hibernation is unrelated.
I’ve had the same experience, you’re much better off RDPing into the VM. But I’d like to know if anyone has a better solution that doesn’t require an extra GPU.
On Asus motherboards you can enable ‘Memory Context Restore’, and it’ll remember the training. Unfortunately it seems rapid changes in the weather make my system unstable with it on.
can’t move services as every other service sucks
What are your requirements?
I use Tidal and I know High/Max quality works in the web UI; it just needs Widevine support.
If they use AMD, that’s better on Linux; they don’t need to know what a GPU driver is.
Same goes for Intel, unless they need to use oneAPI.
I’ve seen some that activate an insane number of breakpoints, so that the page freezes when the dev tools open. Although Firefox lets you disable breaking on breakpoints altogether, so it only really stops those who don’t know what they’re doing.
That looks to be Volcanic Islands, which has good support with amdgpu and no support by radeon, according to Wikipedia.
I’m not sure what you meant by “set up radron kernel driver”, but you could maybe try blacklisting it.
I believe if your swap partition is on an encrypted LVM, you can still hibernate with kernel lockdown enabled.
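For reference, hibernation generally also needs a resume= kernel parameter pointing at the swap device; a minimal sketch (the mapper path here is hypothetical, it depends on your LVM naming):

```
resume=/dev/mapper/vg0-swap
```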
Along with VRR over HDMI not being well supported, sometimes the monitor’s own EDID is a little buggy and Linux can’t guarantee VRR will work properly.
I wrote a blog post a while ago on fixing EDIDs, but it was pretty much a guessing game on what to change: https://stevetech.me/posts/force-enable-vrr-edid
I’ve had to do that with both Samsung and MSI monitors so far. If you’d like to post your EDID, I could check it myself with what I know.
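For what it’s worth, once you’ve patched an EDID, one way to get Linux to actually use it is the kernel’s EDID override, roughly like this (the connector and file names are just examples):

```
# Put the patched EDID where the kernel firmware loader can find it
sudo cp fixed-edid.bin /usr/lib/firmware/edid/fixed-edid.bin

# Then add a kernel parameter along the lines of:
#   drm.edid_firmware=HDMI-A-1:edid/fixed-edid.bin
```

Depending on the distro you may also need the file baked into the initramfs so it’s available when the driver loads.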
Epic!
I’ve never seen that on modern AMD stuff that uses radv, but I’m sure it’s probably fine.
Oh whoops, yeah there is, run sudo update-grub. But otherwise that config looks correct.
Cool, you’re going to have to enable Sea Islands (CIK) support for amdgpu. You should just have to add radeon.cik_support=0 amdgpu.cik_support=1 to your kernel parameters. You’re probably using GRUB, so to do that you’ll need to run sudo nano /etc/default/grub to edit its config file, then add the above to the end of GRUB_CMDLINE_LINUX_DEFAULT (keep it inside the quotes, but space-separated from the previous parameter). Then reboot and hopefully Vulkan works!
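For example, the edited line might end up looking something like this (assuming quiet splash were your existing parameters; keep whatever is already there):

```
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.cik_support=0 amdgpu.cik_support=1"
```

Then run sudo update-grub so the change is actually applied before you reboot.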
Alternatively, there’s a section on the Arch Wiki for this; it should work fine for Mint too: https://wiki.archlinux.org/title/AMDGPU
Could you post the output of vulkaninfo, including any errors it might also print? If it’s not shown, what GPU do you have? Also run lspci -k; is your GPU using amdgpu or the old radeon driver?
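If the full lspci output is too noisy, something like this should narrow it down (the grep pattern is just matching common GPU class names):

```
lspci -k | grep -EA3 'VGA|3D|Display'
```

The “Kernel driver in use:” line is the part that’ll say amdgpu or radeon.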
I think some people also use power_save=0, which would, but my understanding is 11n_disable=8 enables aggregating transmit packets together, which impacts latency but improves upload speed.
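Assuming those are the Intel iwlwifi module options (that’s what they look like), a sketch of setting them persistently would be a modprobe.d file:

```
# /etc/modprobe.d/iwlwifi.conf
options iwlwifi 11n_disable=8 power_save=0
```

Then reload the module or reboot for it to take effect.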
A reverse proxy by itself doesn’t do much security-wise. You could possibly set up some sort of authentication, attempt blocking, and rate limiting (in the reverse proxy; don’t trust the DVR), but it’ll probably also break the DVR even more.
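To illustrate, a rough sketch of that in nginx might look like the below (the upstream address and credentials file are hypothetical, and as said, this may well break the DVR further):

```
# Allow at most 5 requests/second per client IP (goes in the http context)
limit_req_zone $binary_remote_addr zone=dvr:10m rate=5r/s;

server {
    listen 443 ssl;  # ssl_certificate/ssl_certificate_key omitted for brevity
    server_name dvr.example.com;

    location / {
        auth_basic           "DVR";
        auth_basic_user_file /etc/nginx/dvr.htpasswd;  # created with htpasswd
        limit_req            zone=dvr burst=10;
        proxy_pass           http://192.168.1.50:80;   # hypothetical DVR address
    }
}
```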
There are bots that port scan and specifically target all sorts of stuff, and DVRs are a very common target. With a VPN in the way, there’s no way of knowing what’s there. A VPN also shouldn’t break the web UI.
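If you go the VPN route, a bare-bones WireGuard config is roughly this (keys and addresses are placeholders; generate real ones with wg genkey):

```
# /etc/wireguard/wg0.conf on the server at home
[Interface]
Address    = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# your phone/laptop
PublicKey  = <client-public-key>
AllowedIPs = 10.8.0.2/32
```

WireGuard won’t even reply to packets without a valid key, so port scanners see nothing.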
Not to defend them, but he did follow up with this:
https://xcancel.com/chrispavlovski/status/1856090182275215803
Although quality != latency, so idk.