[-] ofcourse@kbin.social 2 points 1 year ago

Sweden right now -

[-] ofcourse@kbin.social 15 points 1 year ago* (last edited 1 year ago)

You can absolutely self-host LLMs. The HELM team has done an excellent job benchmarking the efficiency of different models on specific tasks, so that would be a good place to start. You can balance model performance on your specific task against the model's efficiency - in most situations, larger models perform better but use more GPUs or are only available via APIs.

There are currently three different approaches to using AI for a custom task or application -

  1. Train a base LLM from scratch - this is like creating your own GPT-style model. It gives you the maximum level of control; however, the amount of compute, time, and data required for training does not make this an ideal approach for the end user. There are many open-source base LLMs already published on HuggingFace that can be used instead.

  2. Fine-tune a base LLM - starting with a base LLM, it can be fine-tuned for a certain set of tasks. For example, you can fine-tune a model to follow instructions or to use as a chatbot. InstructGPT and GPT-3.5+ are examples of fine-tuned models. This approach lets you create a model that understands a specific domain or a set of instructions particularly well compared to the base LLM. However, any approach that requires training a large model will be expensive. If you are starting out, I'd suggest exploring this as a v2 step for improving your model.

  3. Prompt engineering or indexing using an existing LLM - starting with an existing model, create prompts to achieve your objective. This approach gives you the least control over the model itself, but it is the most efficient, and I would suggest it as the first approach to try. Langchain is the most widely used tool for prompt engineering and supports self-hosted base or instruct LLMs. If your task is search and retrieval, an embeddings model is used instead. In this scenario, you generate embeddings for all your content and store them as vectors; for a user query, you then convert the query to an embedding using the same model and retrieve the most similar content based on vector similarity (see the sketch right after this list). Langchain provides this capability, but IMO sentence-transformers may be a better starting point for a self-hosted retrieval application. Without any intention to hijack this post, you can check out my project - synology-photos-nlp-search - as an example of a self-hosted retrieval application.
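
To make point 3 concrete, here is a minimal sketch of that embed-and-retrieve flow using sentence-transformers. The model name, documents, and query are placeholders for illustration, not part of any specific project.

```python
from sentence_transformers import SentenceTransformer, util

# A small, self-hostable embeddings model; swap in whatever fits your task.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Index step: embed all of your content once and store the vectors.
documents = [
    "How to configure the reverse proxy",
    "Backup strategy for the photo library",
    "Notes on fine-tuning an instruct model",
]
doc_embeddings = model.encode(documents, convert_to_tensor=True)

# Query step: embed the user query with the same model, then rank by cosine similarity.
query_embedding = model.encode("how do I restore photos from backup", convert_to_tensor=True)
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best = int(scores.argmax())
print(documents[best], float(scores[best]))
```

In a real application the vectors would typically live in a vector store rather than in memory, but the embed-query-rank loop is the same.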

To learn more, I have found the recent deeplearning.ai short courses to be quite good - they are short, comprehensive, and free.

[-] ofcourse@kbin.social 11 points 1 year ago* (last edited 1 year ago)

I agree with OP that instances shutting down at any time is an issue that needs to be resolved fairly soon. In my opinion, a solution would be the option to transfer user accounts across instances. This would soften the impact of an instance closing and eventually make the fediverse more stable.

A new user currently has a choice of many instances to join, but no assurance that any of them will continue to exist. Along with that, AFAIK there is no way to transfer user accounts and data across instances. If users could transfer their accounts and data, there would be less hesitancy to join a new instance, and user accounts and data could be distributed across more instances. This could also work in such a way that if a subset of user data does not meet another instance's criteria, that subset is not migrated (most likely a community-based data filter).

Another issue is the presence of the same community/magazine on multiple instances (say, tech@lemmy.this and tech@kbin.that), which is frustrating for users since they need to track multiple communities for similar content and the same content gets copied to multiple communities. Account migration should also help resolve this. We are already seeing that communities on certain instances are becoming the prevalent ones, which creates an incentive for the admins of those instances not to shut down. And if they did decide to shut down the instance, users could simply migrate elsewhere, and the prevalent community would keep all its data, just on the new instance.

[-] ofcourse@kbin.social 1 points 1 year ago

I agree with OP that instances shutting down at any time is an issue that needs to be resolved fairly soon.

A new user currently has a choice of many instances to join, but no assurance that any of them will continue to exist. Along with that, there is no way to transfer user accounts and data across instances, AFAIK. Having the option to transfer accounts would soften the impact of an instance closing and eventually make the fediverse more stable.

If users could transfer their accounts and data, there would be less hesitancy to join a new instance, and user accounts and data could be distributed across more instances. This could also work in such a way that if a subset of user data does not meet another instance's criteria, that subset is not migrated.

Another issue is the presence of the same community/magazine on multiple instances (say, tech@lemmy.this and tech@kbin.that), which is frustrating for users since they need to track multiple communities for similar content and the same content gets copied to multiple communities. Account migration should also help resolve this. We are already seeing that communities on certain instances are becoming the prevalent ones, which creates an incentive for the admins of those instances not to shut down. And if they did decide to shut down the instance, users could simply migrate elsewhere, and the prevalent community would keep all its data, just on the new instance.

1

I save and back up all my photos on a Synology NAS instead of using one of the online providers. However, Synology Photos doesn't have good search capabilities, so I built a project to search through the images using natural-language captions, and found that it works really well.

I have published the project publicly with a GPL-3.0 license - synology-photos-nlp-search. Anyone is welcome to use and contribute to the project.

It was really cool to see that I could try two search terms like "food" and "eating", and the embeddings model would understand the difference and return relevant images for each.

The project runs the model and stores all model-related files locally, so besides downloading the model and the necessary Python packages, no API calls are made to any outside services. I have containerized the application to make it easier to deploy and use. That said, some programming experience may be needed, as it's not an open-and-use application.
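
For anyone curious how this kind of natural-language photo search works in general, here is a minimal sketch using a locally downloaded CLIP model through sentence-transformers. It illustrates the general approach only; the model name and folder path are assumptions, not the project's actual code.

```python
from pathlib import Path
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# CLIP checkpoint that can embed both images and text; downloaded once, then runs locally.
model = SentenceTransformer("clip-ViT-B-32")

# Hypothetical photo folder -- point this at your own library export.
image_paths = sorted(Path("photos").glob("*.jpg"))
image_embeddings = model.encode([Image.open(p) for p in image_paths], convert_to_tensor=True)

# Embed the natural-language query with the same model and rank photos by cosine similarity.
query_embedding = model.encode("people eating at a table", convert_to_tensor=True)
scores = util.cos_sim(query_embedding, image_embeddings)[0]
for idx in scores.argsort(descending=True)[:5]:
    print(image_paths[int(idx)], float(scores[idx]))
```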

This is my first major project that I am publishing, and would welcome any feedback for improvements from the community.

3
submitted 1 year ago* (last edited 1 year ago) by ofcourse@kbin.social to c/apple@lemmy.ml

I wanted to share with the community an iOS shortcut I created to remove EXIF metadata from images - Remove Image Metadata.

To use the shortcut, select the photos, tap the Share button, and then select the shortcut name. The shortcut removes the EXIF data from each selected image and saves it as a new image with a new name.

I use this shortcut before uploading images to apps and websites so the images do not include metadata identifiers for my device. The screenshot with the post was generated using the shortcut.
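
For anyone not on iOS, roughly the same result can be scripted with Pillow; this is just an illustrative sketch, and the file names are placeholders.

```python
from PIL import Image

# Placeholder paths for the original photo and the metadata-free copy.
src, dst = "IMG_0001.jpg", "IMG_0001_clean.jpg"

# Copy only the pixel data into a fresh image, leaving EXIF and other metadata behind.
with Image.open(src) as img:
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save(dst)
```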

If anyone has suggestions for improvements, please let me know.

3
submitted 1 year ago by ofcourse@kbin.social to c/apple@lemmy.ml

There was a post recently about how to bypass paywalls, and one of the suggestions was to prefix 12ft.io to the URL.

So I created an iOS shortcut that does just that and can be used directly in Safari from the share sheet.

Instead of sharing the shortcut through iCloud, I've posted the screenshot here so you can create your own. After creating the shortcut, tap the Share button on any paywalled website, select the shortcut name, and the site will open in a new window without the paywall.
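
The logic the shortcut performs is just a URL prefix; a minimal sketch of the same step in Python (with a placeholder article URL) looks like this.

```python
# Prepend 12ft.io to the current page's URL, as described in the original suggestion.
def paywall_free(url: str) -> str:
    return "https://12ft.io/" + url

print(paywall_free("https://example.com/some-article"))
```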

[-] ofcourse@kbin.social 1 points 1 year ago

I am surprised some of the big ones haven’t been mentioned yet -

  • Radiolab - Not really sure how to describe this podcast; it's superb journalism at its core. They do both short episodes and multi-episode long-form series on a variety of topics, from science to history to current events. For example: how the dinosaurs died when the asteroid hit Earth, the story of a Guantanamo convict with the same name as the host, and how poorly computer databases are designed for names outside the norm.

  • Planet Money - An excellent economics podcast where complex topics are distilled into fairly short episodes. They recently released a completely AI-generated episode, which was scary precisely because of how good it was.

  • More Perfect - Everything about the US Supreme Court.

  • Serial - One multi-episode series at a time about complicated criminal cases.

  • What Roman Mars Can Learn About Con Law - Started during the Trump presidency, when tough questions about the US constitution were being asked given his penchant for pushing legal boundaries and norms.

[-] ofcourse@kbin.social 1 points 1 year ago

Mental Illness Happy Hour by Paul Gilmartin, if you like a podcast that talks honestly about the struggles of mental health.

Paul interviews a different person each week, and they discuss that person's journey with mental health. Paul has also been very open about his own struggles. It helps that he is a comedian with a subtle but dark humor that I enjoy.

I also really like the short surveys that people fill out on his website and that he reads on the show, because they make me feel connected and remind me that I'm not alone.

[-] ofcourse@kbin.social 1 points 1 year ago

Throughline has been my favorite since it launched a few years ago. The hosts take a deep dive into the historical events leading up to topical events of the present, weaving a thread through them, hence the name.

Some of the examples are the history of policing in the US and how capitalism became the dominant economic system.

I cannot recommend this podcast enough!

[-] ofcourse@kbin.social 1 points 1 year ago

Some other factors that I have noticed -

  • Since most democracies determine results through first-past-the-post (FPTP) or a closely related voting system, candidates only need just over 50% of the voting population to agree with them. They focus on populist policies that resonate with at least 50.1% of the population, even if those policies are detrimental to the remaining 49.9%.
  • The opposition is not seen as strong enough to lead the country. This was the case in the recent Turkish elections and has been the case in the last three Indian elections. Erdogan and Modi keep winning because people who don't want to vote for them are not convinced of the other candidates' ability to lead the country. So many opposition supporters don't vote at all, or their votes get fragmented across multiple candidates in FPTP systems. That was, and remains, the concern with Biden in the US.
  • Once these leaders are in power, they actively suppress the voices of minorities, by controlling the media and law enforcement, or by making it harder for minorities to vote and express themselves. This shrinks the effective voting population in a way that favors these leaders, which again helps them get past the 50% mark. Ultimately, we see a vicious cycle of more power consolidation and more authoritarianism over time.

ofcourse

joined 1 year ago