this post was submitted on 02 Sep 2024
53 points (93.4% liked)

Besides the DE and terminal commands, is there anything else I should try in a Linux distro?

[–] chottomatte 2 points 3 months ago (1 children)

You mean AI? I rarely use it.

[–] possiblylinux127@lemmy.zip 2 points 3 months ago (1 children)

AI is a broad term that is mostly marketing fluff. I am talking about a language model you can run locally.

You can install Alpaca, which is now an official GNOME app, and then download the Mistral model. Once it downloads, you can ask it things about what to do with the OS. It all runs locally, so you need enough RAM and storage: 8 GB of RAM and a few gigabytes of storage.
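For reference, a rough sketch of getting Alpaca from Flathub (assuming Flatpak and the Flathub remote are already set up; the app ID below is the one currently listed on Flathub and may change):

```shell
# Install Alpaca from Flathub
flatpak install flathub com.jeffser.Alpaca

# Launch it; models such as Mistral can be downloaded from the app's model manager
flatpak run com.jeffser.Alpaca
```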

[–] chottomatte 1 points 3 months ago (1 children)

I'm indeed open to the idea if it's locally hosted, but Ollama isn't available in my country... I'll look for an LLM runner that isn't an Ollama fork.

[–] possiblylinux127@lemmy.zip 1 points 3 months ago (1 children)

Ollama runs locally. It can be available in any country.

[–] chottomatte 1 points 3 months ago (1 children)

I know, but won't I need to download the models in the app in order to run it locally?

[–] possiblylinux127@lemmy.zip 1 points 3 months ago

Yes, but that's pretty minor. You can just run `ollama pull`.
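For example, assuming Ollama is installed, pulling and then chatting with the Mistral model looks like this (the model download is a few gigabytes):

```shell
# Download the Mistral model weights to the local Ollama store
ollama pull mistral

# Start an interactive chat session with the local model
ollama run mistral
```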