They still don't know what memes are huh?
thefunkycomitatus
I got tired of Hexbear being so goddamn reactionary so I started using reddit, discord, dbzero, twitter, parler, and truth social. Hopefully you all will do some growth and I can start trusting you as a platform again.
He would come home after a long day at the racism factory and say "God, I love Israel. If I ever get shot in the neck at a public event, I want you to know that Bibi would never and that it's the only democracy in the Middle East. On that note, if I do get shot in the neck at a public event by some discord memer, I want you to move on and find love. Someone presidential or, at least, vice presidential." He would say that almost every day, word for word.
They used SpaceX as the filming location for Hammer Industries in Iron Man 2. Which is its own nugget because Musk is more of a Justin Hammer than a Stark, depending on how much credit you want to give Stark as a character.
All I know is that Helvetica is fairly woke.
I would just find the books in .epub to begin with. PDF is an evil format. Just in case anyone needs book sources:
Start doing this.
I haven't seen it but from what I read it is a different cut than the previously released version(s) called "The Whole Bloody Affair." It's even different than the cut Tarantula showed in his own personal theater all these years. The version you posted in the second link is a fan edit, not an official release. The one out in theaters now is Tarantula's version. It will probably get a special anniversary edition physical release at some point.
Being a Charlie Kirk could, potentially, come with more influence and money than being a VP.
I'm a little curious how the AI version works with any real accuracy. If the context window is something like 1M tokens, that sounds like a lot. But it would have to tokenize the whole book, and that book gets fed into the prompt behind the scenes with every question. Every time you ask a question, the LLM isn't just processing your question; it gets fed the entire prompt plus the book. Plus it gets fed all your previous conversation for the session. If you're asking it questions about a Dostoevsky book, you're probably going to fill up the context window pretty fast, if not immediately. Then it will just start hallucinating answers because it can't process all the context.
If they're doing something fancy with tokenization or doing some kind of memory thing, then it seems like it would be suited for a standalone program. But it also says they're using local LLMs on your computer? I mean those are going to have small context windows for sure. It seems like bloat as well. In order to run those models locally, you need to download the tensors. Those are several GB in size. I don't want my e-book library management software to do that.
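Back-of-envelope, the arithmetic above looks something like this. A minimal sketch: the ~4 characters-per-token ratio is a common rule of thumb, and the book length and window sizes are my own assumptions, not anything the software documents.

```python
CHARS_PER_TOKEN = 4  # crude rule of thumb for English text, not exact

def estimate_tokens(num_chars: int) -> int:
    """Ballpark token count for a chunk of text."""
    return num_chars // CHARS_PER_TOKEN

def tokens_for_turn(book_chars: int, history_chars: int, question_chars: int) -> int:
    """Each question re-sends the whole book plus all prior conversation."""
    return estimate_tokens(book_chars + history_chars + question_chars)

# ~1.8M characters is roughly the length of a long Dostoevsky novel (assumed).
book = 1_800_000
window_local = 8_192      # typical small local-model context window (assumed)
window_large = 1_000_000  # the optimistic 1M-token case

first_turn = tokens_for_turn(book, history_chars=0, question_chars=200)
print(first_turn)                 # ~450k tokens before any conversation history
print(first_turn > window_local)  # True: a small local model overflows immediately
print(first_turn > window_large)  # False: a 1M window holds it, at least at first
```

Even on the very first question the book alone eats roughly 450k tokens, so a small local model overflows before the conversation starts, and every follow-up turn only adds history on top.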
I looked up her agent because I was curious. Her agent just left one talent management company for another. Like a few days ago. Sidney is keeping her agent so technically she is under a new agency. I'm curious if this is part of a new PR package the new place created for her.
In which struggle session did you get banned?