[-] garrett@lemm.ee 6 points 3 months ago

Riker catches an alien "virus" (from a plant) and lies down naked under a shiny blanket for the rest of the episode. Pulaski forces Riker to dream of the worst and most boring segments from seasons 1 and 2.

Most shows have flashback episodes that feature highlights. TNG had a clip show that showcased the worst segments. It was the most lackluster season finale of any Star Trek series. And this was even well after Riker "grew the beard".

[-] garrett@lemm.ee 7 points 5 months ago

Penpot works perfectly on Linux, and you can even host it yourself on your own computer if you want. It's web-based and works in both Firefox and Chromium browsers. (I think WebKit ones too, but it's been a little while since I've tried it with Epiphany.)

I use Penpot myself all the time on Linux, but I'm usually using the hosted version so I can collaborate with others without having to maintain a server. I have also run it locally in a container using Podman, even with Podman's rootless support.

But to start using it, all anyone needs to do is point their browser of choice to https://design.penpot.app/ and sign in. There is no setup process or installation needed; self-hosting is completely optional.
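If you do want to go the self-hosting route, a minimal sketch with Podman might look like the following. (This is an assumption-laden sketch, not official instructions: it assumes podman-compose is installed, and the compose file location, port, and options come from Penpot's self-hosting docs and may change between releases.)

```shell
# Sketch: fetch Penpot's published compose file and bring the stack up
# with podman-compose. (Assumes podman-compose is installed; the file
# URL and default port may differ between Penpot releases.)
curl -LO https://raw.githubusercontent.com/penpot/penpot/main/docker/images/docker-compose.yaml
podman-compose -p penpot -f docker-compose.yaml up -d
# Then point a browser at http://localhost:9001
```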

[-] garrett@lemm.ee 5 points 5 months ago

Just pointing this out, as there are non-free services that the apps use:

Frog is awesome, but note that while its OCR works offline, its TTS (text to speech) uses an online service. As long as you avoid having it read to you, it's all done locally.

And Dialect always uses an online service. Some of the servers are FOSS, but some aren't. But everything you type or paste into it is sent somewhere else. (This is the case with using translation websites too, of course.) I'm not saying you shouldn't use it; I'm just saying that you should be aware.

Hopefully Dialect will add Bergamot (what both Firefox by default & the "translate locally" extension use for translation) at some point. Dialect has a longstanding issue about it, but no forward motion yet. https://github.com/dialect-app/dialect/issues/183

For something open source that runs completely on your computer for translations, you'd want Speech Note. https://flathub.org/apps/net.mkiol.SpeechNote It's Qt based, but works well. In addition to translation, it can do text to speech and speech to text too. You do have to download models first (easily available as a click in the app), but everything, including the text you're working with, is all done locally.

I use both Frog and Speech Note all the time on my computer (GNOME on Fedora Linux). They're excellent.

[-] garrett@lemm.ee 5 points 5 months ago

It does work in Proton, but without audio.

There's a bug open @ https://github.com/ValveSoftware/Proton/issues/7612

ProtonDB also lists the lack of audio, without workarounds (so far): https://www.protondb.com/app/2512840?device=any

Hopefully there will be a fix and/or workaround very soon; the game looks fun.

[-] garrett@lemm.ee 8 points 5 months ago

Agreed.

Additionally, the graphic oversimplifies things. The resulting genetically modified crop is often not all that close to the non-GMO original, as shown by studies such as this one:

https://enveurope.springeropen.com/articles/10.1186/s12302-023-00715-6

Basically, GMO soybeans contain proteins that differ and also include additional proteins. This can cause allergic reactions to modified soy where non-modified soy might not cause an issue.

Monsanto supposedly even knew about these proteins and the higher risk of allergic reaction and chose not to disclose it. (I saw some research that mentioned this years ago... It'd be hard to find the exact source I read back then.) This specific paper, which discusses the additional proteins and side effects introduced by the transgenic splicing, also explicitly states that Monsanto ran studies itself and failed to report relevant findings:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5236067/

Obviously, other breeding methods can change proteins too, but these papers show it isn't as clear-cut as the graphic in the original post claims.

Along these lines, here's a study that finds differences not just between soybeans grown organically and ones treated with glyphosate (Monsanto's Roundup pesticide), but also between GMO and non-GMO crops that were both treated with the pesticide.

https://www.sciencedirect.com/science/article/pii/S0308814613019201

But, yeah, this is just a long way of agreeing with the parent post: the end goal is to make the plants resistant to poison, not to make them better for humans, all to make more money. (In this case, Monsanto is even double-dipping by selling both the pesticide and the crops tailor-made for it.)

Other GMO crops might be closer to the original crop and might also actually be beneficial for humans without drawbacks. However, Monsanto's soybeans are problematic, and other crops might be as well, especially if they're made by companies who have money as their primary goal.

[-] garrett@lemm.ee 4 points 7 months ago

Yeah, some of the smaller models are even reasonable on my old laptop in CPU mode.

General rule of thumb: the larger the model, the better it is. But not always. 😉 I've found zephyr and mistral both offer a good tradeoff and work on CPU. Of the ones that really need more RAM and/or a GPU with a lot of vRAM, mixtral seems like the best.

Additional fun is to use a Modelfile (which is like a Containerfile, but is a recipe for models instead of containers) to customize a local model on top of one of the existing ones.

For a simple one to demonstrate, I have a system instruction to output everything in the form of the poem "This Is Just To Say", but customized per topic.

It really works best with mixtral (I've tried other ones, especially smaller ones):

FROM mixtral
PARAMETER temperature 1
SYSTEM """
You will respond to everything in a modified poem in the form of "This Is Just To Say" by William Carlos Williams, except change all the specifics to be what the subject is. Do not say any other text. Try to make the syllables the same as the original and use the same formatting.

You can expand in length in responses when there is too much to talk about, but keep the format and style of the poem.

Do not respond in any other way.

For reference, the full poem is:

I have eaten
the plums
that were in
the icebox

and which
you were probably
saving
for breakfast

Forgive me
they were delicious
so sweet
and so cold
"""

Yes, you just instruct the system with natural text like that and it (usually) abides. I tried it without the poem being referenced inline, and it mostly worked fine... but it works even better being mentioned in the file.

I have that saved in ~/Projects/ollama/ as Modelfile.fun-plums

I run the server almost as above, but now also pass in my ollama project directory as a mounted volume with :z (for SELinux labeling). Don't forget to run sudo setsebool container_use_devices=true first, or it won't work:

podman run --detach --replace --device /dev/kfd --device /dev/dri --group-add video -v ollama:/root/.ollama -p 11434:11434 -v ~/Projects/ollama:/models:z --name ollama ollama/ollama:0.1.24-rocm

(You can run this command even if you already have the server running; it will replace it with the new one. This is for AMD. You'd want the NVIDIA or CPU container if you don't have an AMD card. The CPU container is the fastest to download. The version here is newer than the AMD one I listed above, so it might be a multi-gigabyte download if you don't have this new one yet. The important new part is ~/Projects/ollama:/models:z)

Then, create the model. This will be almost instant if you already have the base model downloaded (in this case, mixtral), otherwise it will auto-download the base model:

podman exec -it ollama ollama create fun-plums -f /models/Modelfile.fun-plums

(The path to the model in this command is the internal path from the point of view within the container.)

Then, you run it like any other model.

Here's me running it, and bringing up the topic of leftover pizza.

$ podman exec -it ollama ollama run fun-plums
>>> pizza
 I have consumed
the pizza
that was on
the counter

and which
you were likely
saving
for lunch

Forgive me
it was satisfying
so tasty
and so warm

You can also paste the text from the reader mode of an article and it'll summarize it with a poem based on that one. 🤣

For example, copying and pasting the text from https://www.theverge.com/2024/2/10/24068931/star-wars-phantom-menace-theater-showings-25th-anniversary-may resulted in:

 I have watched
the Phantom Menace
that was on
the silver screen

and which
you may have
missed or
disliked once

Forgive me
it has charm
a new sheen
and Darth Maul
[-] garrett@lemm.ee 4 points 9 months ago

I'm just halfway through the new series. You definitely would want to read the comics and/or watch the movie first.

It's excellent so far. It's great to see it in the style of the comic, with the actors from the movie providing the voices, and the musicians (Anamanaguchi) that made the tunes for the videogame.

I can't say much about the show due to spoilers, but can already recommend it if you've enjoyed any other Scott Pilgrim media.

[-] garrett@lemm.ee 4 points 10 months ago

I basically gave up on podcasts on the desktop and only use AntennaPod on my phone. When I'm at my desktop, I have my phone paired with my computer via Bluetooth and play that way. I can pause it on my computer via KDE Connect (GSConnect on GNOME).

Bluetooth audio from phone to desktop works quite well on Fedora Linux. It probably works on other Linux distros too. I'm guessing it might also work on other OSes like Windows and macOS.

KDE Connect is available on Android, iOS, KDE (and can run on other desktops too), GNOME (via the GSConnect extension), Windows, and macOS.

This solves the syncing problem by sidestepping the need for it. My podcast state is always correct and I always have my podcasts with me, even when out and about.

[-] garrett@lemm.ee 6 points 10 months ago* (last edited 10 months ago)

Docker on Windows and Mac also runs containers through a VM, though. (It's more obvious on Windows, where you need WSL (powered by a VM) and Hyper-V (a way to run VMs on Windows). On a Mac, Docker likewise runs the containers inside a Linux VM.)

Podman Desktop helps to abstract VMs away on Windows and macOS: https://podman-desktop.io/

For the command line, there's "podman machine" to abstract away the VM. https://podman.io/docs/installation (installing on macOS is mentioned on that page and Windows has a link to more docs which also uses the podman machine command.)
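On macOS or Windows, the VM lifecycle boils down to a couple of commands. (A sketch based on the podman machine subcommands; exact flags may vary by Podman version.)

```shell
# One-time setup: create the Linux VM that will run the containers
podman machine init
# Boot the VM (stop it later with `podman machine stop`)
podman machine start
# From here, regular podman commands are transparently routed into the VM
podman run --rm quay.io/podman/hello
```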

As for Docker Compose, you can use it directly with Podman too: https://www.redhat.com/sysadmin/podman-docker-compose (there's also podman-compose). The only thing Docker Compose doesn't support with Podman is swarm functionality.

Docker Compose can even work with rootless Podman containers on a user account; it just requires an environment variable. https://major.io/p/rootless-container-management-with-docker-compose-and-podman/ (it's basically enabling Podman's user socket and using the environment variable to point at it)
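The rootless setup from that article boils down to enabling the user-level Podman socket and pointing Docker Compose at it. (A sketch; the socket path assumes a systemd-based distro with a user session.)

```shell
# Enable the Podman API socket for the current (non-root) user
systemctl --user enable --now podman.socket
# Point Docker Compose (and the docker CLI) at Podman's user socket
export DOCKER_HOST="unix://$XDG_RUNTIME_DIR/podman/podman.sock"
# Docker Compose now talks to rootless Podman
docker compose up -d
```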

[-] garrett@lemm.ee 9 points 10 months ago

This actually is an option!

I've used it to play games from the Deck at native 1080p on my TV.

I'm not at my Steam Deck right now, but I remember it's in the settings. I think it's per game: go to the game's settings and look for something like a "native" display option. You have to do this for each game you want running at a larger resolution on an external monitor in game mode.

I don't remember if it needs to first be enabled on the system settings in the display area. (I think it does the right thing for system settings by default in most cases.)

IIRC, desktop mode also automatically supports the native resolution, but game mode is nice and console-like. Desktop mode might be a bit clunkier than what you'd want for couch gaming. Setting the option in game mode for the game is likely your best option.

[-] garrett@lemm.ee 4 points 11 months ago

You can do this with the Flatpak version of Steam, but you have to give it access to the disks.

Flatseal is the easiest way to do this.

  1. Open Flatseal
  2. select Steam
  3. scroll down to the "Filesystem" section
  4. click on the + icon on the "Other files" area
  5. either put in the full path, or use something like "/run/media" to give it access to all user-mounted storage devices (this value may vary depending on how the disk is mounted)

Restart Steam (if it was running), and you should be able to access the additional devices.
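If you prefer the terminal, the same permission can be granted with flatpak override, which is equivalent to the Flatseal steps above. (Adjust the path to wherever your disks actually mount.)

```shell
# Give the Flatpak Steam access to all user-mounted drives
flatpak override --user --filesystem=/run/media com.valvesoftware.Steam
# Verify the override took effect
flatpak override --user --show com.valvesoftware.Steam
```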

[-] garrett@lemm.ee 4 points 11 months ago

Some would probably consider it sacrilegious, but you can actually embed Neovim inside VS Code (or VSCodium, the FOSS soft fork, similar to what Chromium is to Chrome).

https://github.com/vscode-neovim/vscode-neovim

I've been using this for a couple of years (after using vim for a few decades). You get the best of both worlds: you can use VS Code extensions as well as Neovim/Vim plugins, whatever you prefer.

I still use Neovim on the command line for quick edits, but I'm happy with VS Codium + Neovim for long IDE coding sessions.


garrett

joined 1 year ago