submitted 1 year ago* (last edited 1 year ago) by Mio@feddit.nu to c/selfhosted@lemmy.world

I just installed apt-cacher-ng for caching my apt upgrade packages and saw a huge time improvement, even though I have a good internet connection. It acts as a proxy and caches the response packages.
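
For anyone wanting to try it, the client side is just an APT proxy setting; a minimal sketch, assuming the cache runs on 192.168.1.150 (apt-cacher-ng listens on port 3142 by default; the config file name is an example):

    # Point APT at the apt-cacher-ng proxy, then refresh the package lists through it
    echo 'Acquire::http::Proxy "http://192.168.1.150:3142";' | sudo tee /etc/apt/apt.conf.d/00aptproxy
    sudo apt update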

Do you run something similar? Or maybe even a local repo mirror? Warning: full mirrors are really big, so I don't think they're recommended unless you really need almost everything.

[-] kill_dash_nine@lemm.ee 4 points 1 year ago* (last edited 1 year ago)

I use apt-cacher-ng. Most of my use, though, is caching packages for Docker image builds, as I build up to 200+ images daily. In reality, I have aggressive image caching, so I don't actually build anywhere close to that many each day, but the stats are impressive: the recent-history stats page shows 8.1 GB of data fetched from the internet versus 108 GB served from the acng instance.
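
For the image builds, one way to route apt through the cache is Docker's predefined proxy build argument; a sketch, assuming a reachable hostname (acng.internal is a placeholder; apt honors the http_proxy environment variable):

    # Route apt traffic during the build through the apt-cacher-ng instance
    docker build --build-arg http_proxy=http://acng.internal:3142 -t myimage .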

[-] vegetaaaaaaa@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

I want to look into apt-cacher-ng for learning purposes, to stop the tens of VMs in my homelab from adding load to the official Debian repos, and also to check whether there is a way to have it only mirror a list of "approved" packages.

> saw a huge time improvement even though I have a good internet connection

Note that for best performance you should use https://deb.debian.org/
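
For example, a sources.list entry using it (the suite name depends on your release):

    deb https://deb.debian.org/debian bookworm main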

Semi-related: I have set up a personal APT repository on GitLab Pages: https://nodiscc.gitlab.io/toolbox/ (I think Ubuntu users would call that a "PPA"). It uses aptly and a homegrown Makefile/GitLab CI-based build system (sources/build tools are linked from the page). I wouldn't recommend this exact setup for critical production needs, but it works.
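
The aptly side boils down to something like this sketch (repo, component and distribution names are placeholders; a signed setup would also involve a GPG key):

    # Create a local repo, add .deb files, publish it, and serve it over HTTP
    aptly repo create -distribution=bookworm -component=main myrepo
    aptly repo add myrepo ./*.deb
    aptly publish repo -skip-signing myrepo
    aptly serve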

[-] TCB13@lemmy.world 4 points 1 year ago* (last edited 1 year ago)

And... you can also convert the ISO files into a hosted repository on your network using Apache:

apt install apache2 build-essential
mkdir /var/www/html/packages

Now, create additional directories under /var/www/html/packages/ to hold packages for your system's architecture. For example, create a directory "amd64". You can keep multiple directories and serve packages to systems of different architectures at the same time.

mkdir /var/www/html/packages/amd64

Copying all DEB files from Debian installation media

Mount the first CD/DVD and copy all .deb packages from it to the /var/www/html/packages/amd64/ directory.

mount /dev/cdrom /media/cdrom
find /media/cdrom/pool/ -name "*.deb" -exec cp {} /var/www/html/packages/amd64 \;

After copying all deb files, unmount the first DVD using the following command.

umount /media/cdrom

Mount the remaining CDs/DVDs one by one and copy their .deb files the same way; a loop like the one below can automate this.
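
A rough sketch of that loop (the disc count and device are assumptions):

    # For each remaining disc: wait for a swap, copy its packages, then eject it
    for n in 2 3; do
        read -p "Insert DVD $n and press Enter " _
        mount /dev/cdrom /media/cdrom
        find /media/cdrom/pool/ -name "*.deb" -exec cp {} /var/www/html/packages/amd64 \;
        umount /media/cdrom
    done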

To verify the files, navigate to http://192.168.1.150/packages/amd64/ in your browser. You will see all the packages from your Debian DVDs. Here 192.168.1.150 is my Debian server's IP address.
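
The same check from a terminal, if you prefer:

    curl -s http://192.168.1.150/packages/amd64/ | head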

Create Catalog file

Switch to your repository directory, i.e. /var/www/html/packages/amd64/:

cd /var/www/html/packages/amd64/

and enter the following command to create a catalog file for APT. This step is required so that Synaptic Manager or APT can fetch packages from our local repository; otherwise the packages in it will not show up in Synaptic or APT.

dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

This command will scan all .deb files and create the local repository index on your Debian server. This may take a while depending on the number of packages in your local repository folder. Be patient or grab a cup of coffee.

Sample output:

dpkg-scanpackages: warning: Packages in archive but missing from override file:
dpkg-scanpackages: warning: accountsservice acl acpi acpi-support-base acpid adduser adwaita-icon-theme apache2-bin apg apt apt-listchanges apt-offline apt-utils aptitude aptitude-common aptitude-doc-en aspell aspell-en at at-spi2-core avahi-daemon

[...]

xserver-xorg-video-neomagic xserver-xorg-video-nouveau xserver-xorg-video-openchrome xserver-xorg-video-r128 xserver-xorg-video-radeon xserver-xorg-video-savage xserver-xorg-video-siliconmotion xserver-xorg-video-sisusb xserver-xorg-video-tdfx xserver-xorg-video-trident xserver-xorg-video-vesa xserver-xorg-video-vmware xterm xwayland xz-utils yelp yelp-xsl zenity zenity-common zlib1g

dpkg-scanpackages: info: Wrote 1151 entries to output Packages file.

Please note that whenever you add a new .deb file to this repository, you should re-run the above command to regenerate the catalog file.

Done! We created the catalog file.

Configure Server sources list

After creating the catalog file, go to your server (the local system) and open the /etc/apt/sources.list file.

nano /etc/apt/sources.list

Comment out all lines and add your APT repository location as shown below.

deb file:/var/www/html/packages/amd64/ /

Configure Clients

After creating the catalog file, go to your client systems and open the /etc/apt/sources.list file.

vim /etc/apt/sources.list

Comment out all other source lines and add the server's repository location as shown below.

deb http://192.168.1.150/packages/amd64/ /

Note: there must be a space between deb, http://192.168.1.150/packages/amd64/, and the trailing /.

[-] Mio@feddit.nu 1 points 1 year ago

The DVDs are fine for offline use, but I don't know how to keep them updated. They would probably take loads of space, as I guess they are equivalent to a repo mirror.

[-] TCB13@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

So what are you using for a local repository mirror, apt-mirror or ftpsync? I usually keep ISOs for the architectures that interest me using jigdo, as it can update them later on.

ISOs are harder to maintain for sure but they're more standalone and might survive adversities better.
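
For reference, refreshing a DVD image with jigdo is a single command; the URL below only illustrates the pattern (pick the current .jigdo file from a Debian mirror):

    jigdo-lite https://cdimage.debian.org/debian-cd/current/amd64/jigdo-dvd/debian-12.0.0-amd64-DVD-1.jigdo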

[-] Mio@feddit.nu 2 points 1 year ago* (last edited 1 year ago)

I use it with Kubuntu. Running apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the problem was that there are so many small files. Running nala to fetch the files in parallel helps, of course, but with apt-cacher-ng I don't need nala at all. The low latency and gigabit connection to my server make for fast access. I just need to find a good way to fill it with new updates (one idea is sketched after the timings below).
A second problem is figuring out whether anything can be done to speed up apt upgrade itself, which I guess is not possible. A workaround with snapshots and sending diffs does not sound efficient either, especially on older hardware.

apt update: 4 seconds vs 16 seconds.

apt upgrade --download-only: 10 seconds vs 84 seconds.
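
On that note, a cache pre-warming sketch (the schedule, flags and file name are assumptions, not a tested setup): let one machine download all upgrades nightly through the proxy, so every other client hits a warm cache.

    # /etc/cron.d/apt-prewarm (hypothetical): fetch upgrades through the cache at 03:00
    0 3 * * * root apt-get update -qq && apt-get -y -qq upgrade --download-only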

[-] TCB13@lemmy.world 2 points 1 year ago

Do you know you can use the ISO files as repositories? It's easier in some situations.

  1. Create the folders (mountpoints) to mount the ISO files:
     sudo mkdir -p /media/repo_1
     sudo mkdir -p /media/repo_2
     sudo mkdir -p /media/repo_3
  2. Mount the ISO files:
     sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-1.iso /media/repo_1/
     sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-2.iso /media/repo_2/
     sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-3.iso /media/repo_3/
  3. Edit the /etc/apt/sources.list file to add the repositories:
     vim /etc/apt/sources.list

     deb file:///media/repo_1/  jessie main contrib
     deb file:///media/repo_2/  jessie main contrib
     deb file:///media/repo_3/  jessie main contrib
  4. Run sudo apt-get update
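
To make those ISO mounts survive a reboot, /etc/fstab entries along these lines should work (paths are examples):

    # Loop-mount the Debian DVD images at boot, read-only
    /home/user/Downloads/debian-8.0.0-amd64-DVD-1.iso  /media/repo_1  iso9660  loop,ro  0  0
    /home/user/Downloads/debian-8.0.0-amd64-DVD-2.iso  /media/repo_2  iso9660  loop,ro  0  0
    /home/user/Downloads/debian-8.0.0-amd64-DVD-3.iso  /media/repo_3  iso9660  loop,ro  0  0
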
[-] owatnext@lemmy.world 2 points 1 year ago

At the beginning of the COVID pandemic, I was attempting to mirror all of my favorite distros' repos, just in case of societal collapse.
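
For anyone wanting to do the same, here is a mirror.list sketch for apt-mirror (mentioned above); the base path and suites are examples:

    # /etc/apt/mirror.list: what to mirror and where to store it; then run apt-mirror
    set base_path /var/spool/apt-mirror
    deb http://deb.debian.org/debian bookworm main contrib
    clean http://deb.debian.org/debian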

[-] TCB13@lemmy.world 1 points 1 year ago

jigdo the entire repositories for your favorite Debian architectures, for the win.

[-] mrwiggles@prime8s.xyz 1 points 1 year ago

This is what I use Foreman and Katello for: a package mirror with x versions, synced automatically, with all my machines subscribed. Or it would be, if I ever got around to actually setting the damn thing up. I have a Debian package repo and a few things subscribed, but I'd like to add more.
