this post was submitted on 06 May 2024
64 points (97.1% liked)

Linux Gaming — Discussions and news about gaming on the GNU/Linux family of operating systems (including the Steam Deck).
submitted 6 months ago* (last edited 6 months ago) by [email protected] to c/[email protected]
 

I'm using EndeavourOS with an ext4 file system for daily use, plus a dual-boot Windows install for gaming. What I want now is to get rid of Windows completely.

When I tried this before, I had to test multiple tweaks per game to find which one worked on Linux. So I want to take a BTRFS snapshot, try a tweak, and roll back until I find the right configuration.

While I have quite a bit of experience with Linux, I've never used BTRFS. Do you think it's worth it?

I thought about keeping the games on the ext4 system, but I'd rather not split the disk. I'm leaning towards keeping the games in a non-snapshotted subvolume.

UPDATE: I just re-installed EndeavourOS with BTRFS + snapper + BTRFS Assistant :)

[–] [email protected] 29 points 6 months ago (3 children)

Btrfs is amazing for a Steam library. The single best feature is the compression. Games tend to have a lot of unoptimized assets which compress really well. Because decompression is typically faster than your disk, it can potentially make games load faster too.

I put a second dedicated nvme drive in my PC just for steam. It's only 512GB but it holds a surprisingly large library.

[–] [email protected] 15 points 6 months ago

and ~~my axe~~ deduplication: all those DLLs, and the Wine prefixes that contain them, occupy space only once.

[–] [email protected] 9 points 6 months ago (2 children)

I actually found the opposite with my steam library; on ZFS with ZSTD I only saw a ratio of 1.1 for steamapps, not that there's really any meaningful performance penalty for compressing it.

[–] [email protected] 8 points 6 months ago

It depends on what sort of games you play. Some games / genres / publishers are much worse about this than others.

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago) (1 children)

OK, I just measured mine. I have 459 GiB of games on the drive, consuming 368 GiB of space. That's roughly a 20% reduction (about a 1.25× compression ratio). I'm using compress=zstd:9.
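As a quick sanity check on figures like these (459 GiB of game data occupying 368 GiB on disk — per-file numbers would come from a tool like compsize):

```shell
# compression ratio and space savings from the numbers above
awk 'BEGIN {
  apparent = 459; on_disk = 368          # GiB
  printf "ratio:   %.2fx\n", apparent / on_disk
  printf "savings: %.1f%%\n", 100 * (1 - on_disk / apparent)
}'
# ratio:   1.25x
# savings: 19.8%
```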

I should try deduplication. I have 4 steam users and I've created an ACL hell to prevent the same game being downloaded and installed twice.

[–] [email protected] 0 points 5 months ago (1 children)

If you're messing with ACLs, I'm not sure deduplication will help you much; I believe (not much experience with reflinks) the dedup checksum includes the metadata, so changing ACLs might ruin any benefit. Even if you don't change the ACLs, as soon as somebody updates a game, its checksum will change and the copies won't converge again when everyone else updates.

Even hardlinks preserve the ACL... Maybe symlinks to the folder containing the game's data, then the symlinks could have different ACLs?

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago)

I wrote a blog post about it last year describing my deduplication method. I really need to update that part, because Steam keeps writing files that don't honour the group permissions, so other users get permission errors that an admin has to fix. Steam also failed to determine free space on a drive when symlinks were involved.

I even found recently that Steam writes files in /tmp/ as one user, then fails when you log in as another user and it tries to write the same files. Multi-user breaks even without any messing around.

My current solution doesn't use symlinks. I just add two libraries for each user. One in their respective home directory, and another shared in /mnt/steam. It means that any user can update a game in /mnt/steam, and it cleanly updates for all users at once.
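A rough sketch of that shared-library layout (the group name, path, and environment variable here are assumptions, not from the post — the key idea is a group-writable directory with the setgid bit so new files inherit the group):

```shell
# one shared library directory that any user in a common group can update
LIB="${STEAM_SHARED_LIB:-$HOME/steam-shared}"   # e.g. /mnt/steam in practice
mkdir -p "$LIB"
# sudo chgrp steamusers "$LIB"   # real setup: a group shared by all players
chmod 2775 "$LIB"                # setgid bit: new files inherit the group
stat -c '%a' "$LIB"              # prints 2775
```

Each user then adds both their per-user library and this shared one in Steam's storage settings.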

[–] [email protected] 5 points 6 months ago (1 children)

Is the compression opt-in or is it enabled by default?

[–] [email protected] 6 points 6 months ago (1 children)

You have to enable compression in fstab.

[–] [email protected] 1 points 6 months ago (1 children)

Ah okay, cool. It's that easy? Does it compress all existing data after that or is it only for new data?

What would I have to do to compress existing data?

[–] [email protected] 5 points 6 months ago (2 children)

It is only for new data.

To compress existing data, you have to defragment the filesystem, e.g. with `btrfs filesystem defragment -r -v -czstd /`, where `zstd` is the compression algorithm and `/` the path to recompress. With this command the default compression level is used, which is level 3.

Be careful: defragmenting a btrfs filesystem can duplicate reflinked or snapshotted data, increasing space usage.

As for the mount point: if you decide to use the zstd algorithm with level 1 compression, just add `compress=zstd:1` or `compress-force=zstd:1` to the mount options (in fstab, or when mounting manually).
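For example, a hypothetical fstab entry for a dedicated Steam drive (the UUID and mount point are placeholders):

```
# /etc/fstab
UUID=0000-0000-0000-0000  /mnt/steam  btrfs  compress=zstd:1,noatime  0 0
```

Only data written after the option takes effect gets compressed; `mount -o remount,compress=zstd:1 /mnt/steam` applies it without a reboot.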

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago) (1 children)

Reading the manpage (btrfs-filesystem), duplication can only happen on some older kernel versions, so no danger.

Edit: that was my interpretation of the note about breaking up reflinks of CoW data, anyway. Seems there's more to it.

[–] [email protected] 1 points 6 months ago (1 children)

If I understand correctly, defrag will always duplicate reflinked files.

https://btrfs.readthedocs.io/en/latest/Defragmentation.html

> Defragmentation does not preserve extent sharing, e.g. files created by cp --reflink or existing on multiple snapshots. Due to that the data space consumption may increase.

[–] [email protected] 1 points 6 months ago

Well, compression doubled my available space. ;-)

[–] [email protected] 1 points 6 months ago (1 children)

So I set up my system with btrfs over the last few days, converted two external (mainly game) drives from ext4, and ran defrag and balance, because a guide said that's how to compress the existing files. Was that a bad idea? I didn't read anything about duplicates.

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

It is fine. You can use the duperemove tool (or bees) to find and remove duplicates.

https://btrfs.readthedocs.io/en/latest/Deduplication.html

So it is out-of-band deduplication and has to be done manually.

Also, by default cp and most file managers make a reflink copy (data blocks are duplicated only when modified).
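To illustrate: a reflink copy shares the original file's data blocks until one side is modified, and `--reflink=auto` falls back to a normal copy on filesystems without reflink support. The duperemove invocation is a sketch with an illustrative path, to be run on a real btrfs mount:

```shell
# reflink copy: the new file shares the original's blocks (copy-on-write)
printf 'game asset data' > asset.bin
cp --reflink=auto asset.bin asset-copy.bin
cmp -s asset.bin asset-copy.bin && echo "identical"

# out-of-band dedup sketch (path and hashfile location are illustrative):
# sudo duperemove -dr --hashfile=/var/tmp/steam.hash /mnt/steam
```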