harsh3466

joined 1 year ago
[–] [email protected] 1 points 7 months ago

It’s definitely a YMMV situation. I’ve heard from lots of people that it runs solid as a rock in Docker, and from others like you and me where it’s flaky af.

[–] [email protected] 2 points 7 months ago (1 children)

I’m glad yours is stable! I don’t know why, but with mine, if you’d cut a loud fart near the server, Nextcloud would just shit the bed on me. God forbid I try to update Nextcloud.

Like you I had most plugins disabled, and I was the only user. I first ran Nextcloud using NextcloudPi on an rpi4, and that ran solid for like four years. However, when I repurposed that pi and moved Nextcloud to my server in Docker, it just would not reliably run for me no matter what I did. At that point I also wasn’t really using Nextcloud anymore so I just abandoned it as not worth the effort.

[–] [email protected] 8 points 7 months ago

That’s what I was going to say!

[–] [email protected] 4 points 7 months ago (5 children)

I abandoned Nextcloud entirely a couple of years ago. It was just too damn flaky (self-hosted via Docker).

[–] [email protected] 6 points 7 months ago

Chiming in for Radicale. Been running it for a couple of years. It’s great.

[–] [email protected] 5 points 8 months ago

I re-read Preacher (Garth Ennis and Steve Dillon, R.I.P. Steve) about every five years or so. Love that story.

[–] [email protected] 2 points 8 months ago

That’s just how retail works.

[–] [email protected] 4 points 8 months ago

That’s because Apple, as they love to do, decided to make their own special version of 2FA inside their little walled garden.

Y’know, instead of going with the standard TOTP method.

[–] [email protected] 2 points 8 months ago* (last edited 8 months ago)

I still have all my cards and still play with friends, but the Pinkerton move was the last straw for me. I haven’t given WotC money since, and I never will again.

[–] [email protected] 22 points 8 months ago (4 children)

I used to love playing MtG. I don’t really anymore, and I won’t give another dime to that shit company. Fuck WotC and fuck Hasbro.

[–] [email protected] 2 points 8 months ago* (last edited 8 months ago)

Came here looking for Little Bobby Tables with the link loaded up in my clipboard!

 

Another fun week of tinkering! Here’s what I learned:

How to implement a for loop in bash scripts using seq.

I’ve been working on a script to create folders for my TV show library that play nice with my Jellyfin server. What I wanted was for the script to:

  • prompt me for the show’s name
  • query The Movie Database (TMDB) TV shows API for the show
  • present me a numbered list of the show results formatted as index showname year tmdb-id
  • prompt me to choose the correct result from the list
  • create a directory formatted as Show.Name.(YYYY).[tmdbid-xxxxx]

Since the number of results varies from query to query, I couldn’t use a preset range like {0..5} for my for loop. I tried to have the loop iterate directly over the JSON response, but I couldn’t figure out how to do that.

So, while it’s likely inelegant, what I did was (rough sketch below the list):

  • take the JSON response and pipe it to jq to get the number of results
  • since jq indexes start at 0, subtract 1 from the number of results and set the result of that calculation as my $count variable
  • loop through the JSON using for i in $(seq 0 $count) ; do to create the indexed list of results to choose from
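Here’s a rough sketch of what that whole flow can look like end to end. It’s a hypothetical reconstruction rather than my actual script: it assumes the TMDB v3 search/tv endpoint, an API key exported as TMDB_API_KEY, and that curl and jq are installed.

#!/usr/bin/env bash
# Hypothetical sketch of the folder-creation flow described above.
# Assumes a TMDB v3 API key in $TMDB_API_KEY, plus curl and jq on the PATH.

read -rp "Show name: " show

# URL-encode the query with jq, then hit the TMDB TV search endpoint
query=$(jq -rn --arg q "$show" '$q|@uri')
json=$(curl -s "https://api.themoviedb.org/3/search/tv?api_key=${TMDB_API_KEY}&query=${query}")

# jq indexes start at 0, so the last index is (number of results - 1)
count=$(( $(echo "$json" | jq '.results | length') - 1 ))

# present a numbered list: index  showname  year  tmdb-id
for i in $(seq 0 "$count"); do
    name=$(echo "$json" | jq -r ".results[$i].name")
    year=$(echo "$json" | jq -r ".results[$i].first_air_date" | cut -d- -f1)
    id=$(echo "$json" | jq -r ".results[$i].id")
    echo "$i  $name  $year  $id"
done

# pick a result and build the Show.Name.(YYYY).[tmdbid-xxxxx] directory
read -rp "Pick a result: " pick
name=$(echo "$json" | jq -r ".results[$pick].name" | tr ' ' '.')
year=$(echo "$json" | jq -r ".results[$pick].first_air_date" | cut -d- -f1)
id=$(echo "$json" | jq -r ".results[$pick].id")
mkdir -p "${name}.(${year}).[tmdbid-${id}]"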

How to use jq to work with and extract data from JSON objects

I’m just scratching the surface of jq, but I’m finding it very useful! I’ve worked with JSON before making automations on iOS with the Shortcuts app, so getting up and running with jq was pretty easy once I understood the syntax.
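As an aside on the “iterate through the JSON” problem above: jq can walk the results array itself, which would replace the seq loop for the display step. A minimal, hypothetical example against a TMDB-style response (the // "" just guards against a missing first_air_date):

# $json holds the TMDB-style search response; print each result as "Name (Year) [tmdb-id]"
echo "$json" | jq -r '.results[] | "\(.name) (\((.first_air_date // "")[:4])) [\(.id)]"'

You’d still want an index to pick from, which jq’s to_entries can provide, but the seq approach works fine too.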

Note: I know tools like Filebot exist to do the kind of thing I’m doing with this script. I’m writing my own scripts from scratch in order to learn.

Git and GitHub are different things

On my post last week, a number of people suggested using Git. I was already aware of GitHub, and because I didn’t know what I didn’t know, I thought Git and GitHub were parts of a whole. I also generally knew that Git/GitHub are used for version control, but that was the extent of my knowledge. I still know very little, but I do now understand that Git and GitHub are independent things that can work together.

I also went ahead and set up a Gitea instance on my server for when I’m ready to create repositories for my scripts and dotfiles (a rough first-push sketch is below).
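For when the Gitea instance comes into play, the first push is only a handful of commands. This is a generic sketch with a made-up remote URL, not my actual setup:

# turn an existing scripts/dotfiles folder into a repo and push it to Gitea
cd ~/scripts
git init
git add .
git commit -m "initial commit"
# hypothetical remote URL; use whatever your Gitea instance shows for the new repo
git remote add origin https://gitea.example.com/me/scripts.git
git push -u origin main   # or master, depending on your default branch name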

 

I didn’t get to spend as much time tinkering and learning this week, but I still learned some new things!

  1. WireGuard is great! I had been using OpenVPN because, when I initially set up my machine, my VPN provider had a bug with WireGuard. I was setting up a Raspberry Pi today for some more tinkering, and I decided to try WireGuard to see if the bug was fixed. Not only is it fixed, but WireGuard is much easier to work with. Not hating on OpenVPN, but I’ll definitely prefer WireGuard going forward.
  2. Proper use of find, particularly with regex. This is ongoing. I’ve been using find for a while, but without fully understanding its options and syntax. I’m starting to get a better understanding of how to use it to find and manipulate the files I’m looking for. The biggest thing tripping me up with find and regex is designating the path (see the sketch below this list).
  3. How to set up a new user. I already knew the basics, useradd -m username and sudo passwd username, but what I didn’t know anything about was --skel for copying over the skeleton shell config files from /etc/skel. I didn’t even know the skeleton config files existed.
  4. The shell prompt can be customized. This was interesting. I was setting up a non-root user on a VPS that I have, and after creating the user, all I had was the $ prompt. No user@host, and no working directory. After some reading, I found that adding PS1='$(whoami)@$(hostname):$(pwd)$ ' to ~/.profile will show a more traditional user@host:working/directory$ prompt. I’m sure this is not the only way to do this, and it may not be the best way (a more conventional variant is sketched below the list), but based on my limited knowledge, it’s the way I’m currently doing it on my VPS.
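Two quick follow-ups on items 2 and 4 above. With GNU find, -regex has to match the whole path find would print, not just the file name, which is most of what makes “designating the path” confusing; and bash has built-in prompt escapes that avoid running whoami/hostname/pwd on every prompt. Hypothetical examples of both (the ~/movies path is made up):

# find: the regex must match the ENTIRE path (e.g. /home/user/movies/foo.mkv),
# so anchor it with .* up front; note that -regextype has to come before -regex
find ~/movies -type f -regextype posix-extended -regex '.*\.m(kv|p4)$'

# prompt: \u, \h, and \w are bash escapes for user, host, and working directory
PS1='\u@\h:\w\$ '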
56
submitted 9 months ago* (last edited 9 months ago) by [email protected] to c/[email protected]
 

I’ve been homelabbing for a few years now, and recently I’ve really been focusing on learning how to use GNU/Linux. I thought it might be fun to periodically share the things I’ve been learning. The standouts for me this past week were:

  1. Use the full path when referencing files and directories in bash scripts (Edit: when it makes sense, which is something I’m also still learning. This makes sense when the files will always be located in the same place.)
  2. In a bash script, ${file##*/} will get you the name of the file your script is handling (example: when looping over files in a directory). It turns out that’s shell parameter expansion rather than a standard variable, and I still need to learn exactly how it works (see the sketch after this list).
  3. Ubuntu gets a ton of justifiable criticism, but I find Canonical’s Multipass to be a great tool for spinning up Linux virtual machines. Especially on Apple silicon macs.
  4. Piping the output of ls to grep inside a command substitution in a path is a great way to change to a directory you know exists but can’t remember the exact name of. (Example: cd ~/movies/"$(ls ~/movies | grep movie-name)")
  5. The reason macOS CLI utilities have syntax variations compared to the standard GNU/Linux utilities is that macOS and its CLI utilities are BSD-based. This was information I knew at a high level, but had never really understood the implications of until this week.
  6. Related to point 5, if you’re on macOS trying to learn and you’re annoyed by the syntax differences between the BSD and GNU utilities, you can run this script from darksonic37 on GitHub to remove the BSD utilities from macOS and replace them with their GNU counterparts. (I have not run or reviewed the script. I found Multipass first, and so far I’m happy using the Ubuntu virtual machine.)
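On point 2, ${file##*/} turned out to be parameter expansion rather than a variable: the ## operator strips the longest match of the pattern */ from the front of $file. A small illustration with a made-up path:

file="/home/user/movies/The.Hunger.Games.(2012).mkv"   # hypothetical path
echo "${file##*/}"   # strip the longest */ prefix  -> The.Hunger.Games.(2012).mkv
echo "${file%/*}"    # strip the shortest /* suffix -> /home/user/movies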
39
I'm an idiot (arm) (lemmy.world)
submitted 9 months ago* (last edited 9 months ago) by [email protected] to c/[email protected]
 

EDIT: Putting this at the top because not everyone is seeing what I actually need. I can unpack the rar archive just fine. What I can’t do (on arm) is add to/update the files in the rar archive. I have unrar already installed. What I can’t install is the rar package to create/update rar archives.

So I’ve been banging my head against the wall for about half an hour trying to install the rar package from the multiverse repository on an Ubuntu 23.10 VM I have running on my M1 Mac mini. I finally ended up on https://pkgs.org and searched for rar to see if I could download it directly instead of using apt.

And it was there I realized there’s no arm version of rar.

Side note, any recommendations for an arm utility that handles rar files? I already have unrar-free installed, but what I need is something to update/add files to existing rar files.

Worst-case scenario, I unrar them and then repackage them with tar or zip, but if I can just work with the rar archives, I’d prefer that (rough fallback sketch below).
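If I do end up repackaging, the fallback is only a few commands. A rough sketch (archive.rar is a placeholder name, and unrar-free’s flags differ slightly from the non-free unrar):

# unpack the rar into a scratch directory, then repack it as a gzipped tarball
mkdir extracted && cd extracted
unrar x ../archive.rar        # or: unrar-free -x ../archive.rar
cd ..
tar -czf archive.tar.gz -C extracted .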

Edit: I got excited for a second, remembering that I’ve got rar installed via Homebrew on that same M1 Mac, but when I tried to install Homebrew in the VM, I learned that Homebrew doesn’t officially support ARM on Linux.

204
submitted 10 months ago* (last edited 10 months ago) by [email protected] to c/[email protected]
 

I've been reading Mastering Regular Expressions by Jeffrey E.F. Friedl, and since nobody in my life (aside from my wife) cares, I thought I'd share something I'm pretty proud of: my first set of regular expressions, which I wrote myself to manipulate the text I'm working with.

What I’m so happy about is that I wrote these expressions. I understand exactly what they do and the purpose of each character in each expression.

I've used regex in the past, stuff cobbled together from Stack Overflow, but I never really understood how it worked or what the expressions meant, just that they did what I needed them to do at the time.

I'm only about 10% of the way through the book, but already I understand so much more than I ever did about regex (I also recognize I have a lot to learn).

I wrote the expressions to be used with egrep and sed to generate and clean up a list of filenames pulled out of tarballs (movies I've ripped from my DVD collection and tarballed to archive them).

The first expression I wrote was this one used with tar and egrep to list the files in the tarball and get just the name of the video file:

tar -tzvf file.tar.gz | egrep -o '\/[^/]*\.m(kv|p4)' > movielist

Which gives me a list of movies; here’s an example entry:

/The.Hunger.Games.(2012).[tmdbid-70160].mp4

Then I used sed with a group of expressions to remove:

  • the leading forward slash
  • everything from .[ to the end
  • all of the periods in between words

And the last expression checks for one or more spaces and replaces them with a single space.

This is the full sed command:

sed -E -i 's/^\///; s/\.\[[a-z]+-[0-9]+\]\.m(p4|kv)//; s/[^a-zA-Z0-9()&-]/ /g; s/ +/ /g' movielist

Which leaves me with a pretty list of movies that looks like this:

The Hunger Games (2012)

I'm sure this could be done more elegantly, and I'm happy for any feedback on how to do that! For now, I'm just excited that I'm beginning to understand regex and how to use it!

Edit: fixed title so it didn’t say “regex expressions”

 

I just started listening to Lessons in Chemistry, and so far it’s excellent. It’s also depressing. I (a mostly cishet male) know cognitively that women have faced horrific treatment and discrimination for thousands of years. So far, the writing in Lessons in Chemistry is very good at making something I know as a fact feel very visceral.

That’s a good thing, and I hope many men read this book and experience it in a similar way to how I am, though I fear the sort of men who need to experience this are not the sort of men who are likely to read it.

 

After a couple of years of not shooting at all (depression is great eh?), I got back into the studio with my friend/model @formaldehyde_[email protected]. We wanted to do something fun and relatively simple to sort of get my creative engine going again. We had a blast shooting, chatting, jamming to some tunes, and got some great photos too. It felt really good to be creative again, and we’re going to do a pink/yellow alien shoot later this month.

Technical details (for those of you interested):

Equipment used:

  • Olympus OM-D EM-5 Mk II with the M.Zuiko 12-40mm f2.8 lens at 14mm
  • 3 x Godox AD 200 lights with the bulb heads
  • Two umbrellas
  • One gridded 24” softbox
  • White vinyl backdrop

Exposure settings: ISO 200, f7.1, 1/250

Lights

Two lights with umbrellas in a cross pattern to light the backdrop for the seamless white look. Set to 1/2 power.

One light in the gridded softbox above and at camera right at 1/4 power.

 

Hello c/Photography! I'm a reddit refugee like many others, and, after a couple of years of not shooting at all (depression is great eh?), I got back into the studio with my friend/model @formaldehyde_[email protected] (pleroma instance). We wanted to do something fun and relatively simple to sort of get my creative engine going again. We had a blast shooting, chatting, jamming to some tunes, and got some great photos too.

For those of you interested in the technical details:

Equipment used:

Olympus OM-D EM-5 Mk II with the M.Zuiko 12-40mm f2.8 lens at 14mm. 3 x Godox AD 200 lights with the bulb heads. Two umbrellas, one gridded 24" softbox, white vinyl backdrop.

Exposure settings: ISO 200, f7.1, 1/250

Lights

Two lights with umbrellas in a cross pattern to light the backdrop for the seamless white look, set to 1/2 power. One light in the gridded softbox above and at camera right at 1/4 power.
