batmaniam

joined 2 years ago
[–] batmaniam@lemmy.world 1 points 5 days ago

Maybe? It was/is on an old Win10 machine I keep around (and came with it).

There are a million better options, but I was glad it was there. It's a good way to get a kid fooling around early, the way Paint did. You used to be able to scan things with your Surface and import them into Builder (this was a good while back).

[–] batmaniam@lemmy.world 1 points 5 days ago (2 children)

I'm going to toss out Microsoft 3D Builder, strictly to dip your toe in the water. It's bare-bones, basically the MS Paint of 3D modeling, but when I was getting started I used it for very simple stuff. I still use it if I'm making dead-simple modifications/combinations of existing .STL files.

Microsoft actually had some cool ideas in the early/mid 2010s. Still had all the proprietary bullshit but there was at least nifty stuff going on.

[–] batmaniam@lemmy.world 3 points 2 weeks ago

Small Soldiers

[–] batmaniam@lemmy.world 3 points 2 weeks ago* (last edited 2 weeks ago)

This is a great conversation, because I'm one of those people who's terrible at arithmetic but quite good at math. As in: I can look at a function, visualize it in 3D space, see which maxes, mins, and surfaces are dominated by which terms, etc., but don't ask me to tally a meal check. I'd be useless at applying any math without a calculator.

Similarly, there's a lot of engineers out there that use CAD extensively that would probably not be engineers if they had to do drafting by hand.

The Oatmeal did a comic that distilled this for me, where they talked about why they didn't like AI "art". They made the point that in making a drawing, there are a million little choices made reconciling what's in your head with what you can do on the page. Whether it's the medium, what you're good at drawing, whatever, it's those choices that give the work "soul". Same thing for writing. Those choices are where learning, development, and style happen, and they're what generative AI takes away.

That helped crystallize for me the difference between a tool and autocomplete on steroids.

Edit: to add: your statement "I claim to understand but don't" hits it on the head, and is similar to why you have to be careful about plagiarism when citing academic review papers. If you write YOUR paper in a way that agrees with the review, but discuss the paper the review was referencing, and, even accidentally, skip over the fact that the conclusion you're putting forward is from the review, not the paper you're both citing, that's plagiarism. The notion being that you misrepresented their thoughts as your own. That is basically ALL generative AI.

[–] batmaniam@lemmy.world 2 points 3 weeks ago

This makes me feel much better. I'm debating spooling it up on WiFi after disconnecting it and piecing it back together.

[–] batmaniam@lemmy.world 3 points 3 weeks ago

That makes sense but also it seems like you just like adding extra round characters to the end of things.

[–] batmaniam@lemmy.world 1 points 3 weeks ago

Thank you! I had looked at the documentation but was unable to find that. I think, to be safe, I'm going to follow what @autriyo@feddit.org said as well. There's no reason not to label them.

Which means, sorry, future people stumbling on this: I will not be providing definitive evidence one way or the other.

[–] batmaniam@lemmy.world 10 points 3 weeks ago (2 children)

This. I'm smack in the middle of prepping for this. A friend came out to visit and helped me start boxing things up. I have another who's coming out to drive the truck. I could do it myself, but he's got his CDL and is willing.

Getting movers who drive the truck out of town is expensive, but an in-town move, or just some big dudes to load/unload, is much less. Everyone's financial situation is different, but I was shocked at how different the prices of those two categories were.

[–] batmaniam@lemmy.world 2 points 1 month ago (1 children)

Something like 98% of the galaxies in the observable universe are already out of our reach. If you left right now, traveling at the speed of light, you would never, ever reach them.
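(Back-of-the-envelope, treating the expansion as a constant Hubble rate of roughly $H_0 \approx 70\ \mathrm{km/s/Mpc}$, which is in the ballpark of what it settles into: nothing beyond

$$d \approx \frac{c}{H_0} \approx \frac{3\times 10^5\ \mathrm{km/s}}{70\ \mathrm{km/s/Mpc}} \approx 4300\ \mathrm{Mpc} \approx 14\ \text{billion light-years}$$

can ever be reached, even by a photon leaving today. The exact cutoff in the real universe is a bit different, but that's the flavor of it.)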

Another consequence is that someday the light from those galaxies will also be unable to reach us. They'll still be there, same as the day before, but not one shred of information from them will be attainable.

If you could go to this future, you would have no way of convincing people those galaxies exist, except, say, the ancient texts. To some extent it wouldn't even matter, because again, existing or not, there is no way to interact with them.

That 98% of the universe would just have to be taken on faith.

I'm not advocating religion here; I just always thought there was some poetry in that.

[–] batmaniam@lemmy.world 3 points 1 month ago (1 children)
[–] batmaniam@lemmy.world 5 points 1 month ago

It was actually very low effort! There are a number of image-to-STL converters. I used this one: https://imagetostl.com/

As you can see, it'll flub some stuff. I would have been better off filling in the areas of text and doing the emboss manually, but I just wanted to hit print. At 2% infill, I think it was like 2.5 g of filament and 20 minutes.

It's fun to screw around with that process. I'm tweaking a model of one of my friends' cabins to use for making a mold. My goal is to cast concrete or something similar so I can pound some thin copper around it and be left with a cool wall decoration.

[–] batmaniam@lemmy.world 26 points 1 month ago (5 children)

You don't know me. Everything I've ever printed was critical.

 

Hi All,

This will be difficult to pin down, but getting pointed in the right direction would be helpful.

I purchased a FlashForge AD5X ~5 weeks ago. It worked great, with one-button calibration out of the box, and I proceeded to do what everyone does when learning: print a bunch of stuff with mixed success and stumble over the usual things. I.e.: learned why you clean the bed, learned how supports work, dealt with filament breaks, etc.

About a week ago I had a print fail; it looked like there was a broken filament that wasn't being pushed. I did a cold pull on the nozzle and was able to print successfully for a time (although some small features on some prints seemed sloppy compared to previous prints).

After that, though, ALL my prints started to fail. Even after cleaning the bed and double-checking bed/nozzle temps, I'd get bad adhesion. I'd also get the nozzle dragging through layers, as if the Z offset was wrong (even after running calibration repeatedly and before each print). There was some popping and oozing, which I chalked up to not storing my PLA dry (although ambient was only ~40% RH). However, the problem persisted even with a freshly opened vacuum-sealed roll of PLA (I confirmed the seal was good).

I ordered a replacement nozzle that arrives today, but can anyone give me some insight? I've only run ~2 kg of PLA through it; that seems like really premature wear, so I must have done something wrong.

Thanks for anything pointing me in the right direction.

 

Stumbling through getting a proper backup regime in place. I have an Unraid system running a proper array, and am trying to set up backups for two separate machines (one Windows, one Debian). I've successfully set up a file share and have Duplicati running. Are there disadvantages to just setting the network folder as the destination for the backup? It seems a little ham-fisted (and the data rates are terrible).
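For reference, on the Debian side the "network folder" is just the Unraid share mounted over SMB, roughly like the fstab line below (hostname, share name, mount point, and credentials path are placeholders rather than my real setup):

```
# /etc/fstab: mount the Unraid share so Duplicati can treat it as a local folder
//unraid-host/backups  /mnt/backups  cifs  credentials=/etc/cifs-creds,uid=1000,gid=1000,vers=3.0,_netdev  0  0
```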

It seems like there's probably a better way to do this...

 

Hi All,

I have a somewhat ridiculous setup where I have:

  • 4 monitors, all on fully adjustable arms
  • 2 totally separate PCs (one running Windows, mostly for work, the other Debian for sanity)
  • all monitors going through switches, so any of them can show either machine at the push of a button (in any combination)
  • an M&K switch that swaps my mouse and keyboard from one machine to the other by double-clicking the scroll wheel

As you can imagine, this takes some space. I have both boxes under my desk towards the edge, and a three-section cabinet in front of my desk to neaten everything up/hide cables. It's kind of like the one in the picture, except mine is an isosceles trapezoid from a top view.

It works well, but I don't really use the cabinet as, well, a cabinet. What I'd like to do is mount each computer in the left and right sections of the cabinet. At some point I'd get around to getting an old electric fireplace (preferably a Craigslist or garage-sale one that didn't work as a heater), take the door off the center section, mount the fireplace in there, and tie its brightness to fan speed (that part I can do). As a bonus, I'd put in a mechanical vent switch that lets me direct heat out the front, or behind, where my feet are under the desk.

My question is: what do I need to do to ensure proper grounding? Also, are there any rules of thumb for air circulation? Any other pitfalls you might be able to think of?

I am also considering putting the guts of both PCs in the center section of the cabinet, but I think that would make it a bit crowded for the fireplace insert.

 

Hi All,

Looking to get into HA, but I have some questions on how data is handled.

First, I don't mean the opt-in for the scant analytics. HA is very clear about that, which is great. Awesome, clear policy.

Second, I understand that "integrations", which use a device manufacturer's or service's software/infrastructure, are outside the scope here (although I do have some questions).

My goal is to build and run a system where no one knows when my lights are turning off and on, and which lives only on my hardware. I.e.: if the internet went down, but I was still connected to local WiFi, would my HA still work?

The answer seems like a strong "yes", but I want to double-check. I also want to make sure that if I do use an integration, there isn't an avenue for telemetry beyond that integration. I.e.: I don't want Spotify to gain access to what temperature I keep my house at just because I want to play music.

I also have questions about the mobile app, but if the rest is truly locked down, I can navigate that.

I currently have an automated bog garden, but how I did it isn't really scalable. It's all Modbus components, with values passed to a local server to generate a dashboard. I'd like to expand to more actual "home" automation, and this seems like a great tool!

Thanks for any clarification.

9
submitted 10 months ago* (last edited 10 months ago) by batmaniam@lemmy.world to c/plex@lemm.ee
 

I'm considering spinning up an xTeVe instance to add IPTV to my server, and have some VERY high-level questions. While I may purchase a subscription, my main goal is to implement a workaround I've seen where I can get RTSP fed into xTeVe and made accessible via the Plex app.

I'm looking to do that RTSP workaround for two reasons:

  1. It would be fun to add access to some camera feeds (fish, bird feeders, etc.) for some people who use my Plex.
  2. I occasionally put up broadcasts via Owncast. Half the people who would like to see those broadcasts are capable of using Plex, but stumble around with VLC (and their being able to use Plex is a minor miracle in the first place).
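For context, my (possibly naive) understanding is that xTeVe just takes an M3U playlist, so a "channel" would be an entry along these lines, with the URL pointing at either the raw RTSP feed or something restreaming it over HTTP. The names and address here are made up:

```
#EXTM3U
#EXTINF:-1 tvg-name="Fish Cam" group-title="Cameras",Fish Cam
rtsp://192.168.1.50:554/stream1
```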

So I'm confused about how a few scenarios would be handled:

  1. Owncast broadcasting as a channel on Plex via xTeVe, with ZERO other available channels. How are multiple simultaneous viewers handled (as in, what's the experience like on their end)?
  2. Owncast broadcasting as a channel on Plex via xTeVe, WITH additional channels available through an IPTV provider. If one user puts on the Owncast broadcast and the other puts on some other channel, does it switch for both of them? Boot one out?

Thanks for any input. I'm not really at the point of actually implementing this, just looking to generally understand how it all funnels together.

 

Pretty much the title. I'd like to add it to the archives.

 

Hi All,

I'm screening a large media library (20 TB) in which some files got corrupted when I did a transfer via FileZilla (by my guess, ~10%). The corrupted files display with a green "filter" over every frame (both when played via Plex and in a number of local video players playing the file directly).

I'd like to screen the library, so I want to write a script that gets an average color reading per file.

Are there any libraries that would let me return a value AND specify how many frames I want it to take the average of? Because of how consistent and defined the issue is, it's really not necessary to average the whole file.

It would also be great if it automatically skipped non-video files, but I imagine a simple "try/except" would be fine.

My skill level here is best described as "high-level hobbyist". I'm familiar with what I need to do to iterate over the folder, etc., but would prefer not to learn how to pull specific frames from a video container unless I have to.
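To be concrete, this is roughly what I'm imagining, sketched with OpenCV. The extension list, frame count, and green-dominance threshold are just guesses on my part that would need tuning:

```python
# Sketch: flag files whose sampled frames average out noticeably green.
import os
import cv2

VIDEO_EXTS = {".mkv", ".mp4", ".avi", ".m4v", ".ts"}

def average_color(path, num_frames=10):
    """Sample num_frames evenly spaced frames and return the mean (B, G, R)."""
    cap = cv2.VideoCapture(path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    if total <= 0:  # unreadable, or not actually a video
        cap.release()
        return None
    means = []
    for i in range(num_frames):
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * total / num_frames))
        ok, frame = cap.read()
        if ok:
            means.append(frame.mean(axis=(0, 1)))  # per-channel mean of this frame
    cap.release()
    if not means:
        return None
    return [sum(channel) / len(means) for channel in zip(*means)]

def looks_green(path, ratio=1.4):
    """Guess at 'green filter' corruption: green channel dominating red and blue."""
    avg = average_color(path)
    if avg is None:
        return False
    b, g, r = avg
    return g > ratio * r and g > ratio * b

for root, _, files in os.walk("/path/to/library"):  # placeholder path
    for name in files:
        if os.path.splitext(name)[1].lower() in VIDEO_EXTS:
            full = os.path.join(root, name)
            if looks_green(full):
                print(full)
```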

Thanks for any help!

 

Hi All,

About a year ago I transferred all my files to a new drive. I used FileZilla, which did mostly OK-ish, but I didn't notice that some of the video files got corrupted. Random files have a green tinge to them (like someone put a green filter over the lens).

It seems random, although if it's a series it's usually the whole series.

I've been replacing them as they come up, but I was wondering if anyone had any bright ideas to expedite the process.

Thanks for any help!

 

I was wondering if anyone has bumped into this. I noticed random jumps (1-3 seconds) in playback when playing at original quality. It's definitely not buffering or performance lag, just an actual playback error. The jump was at the same spot any time I loaded the media, regardless of where I started playback from.

Which is curious, because playing the file with a different media player on the box it lives on, there's zero issue whatsoever.

Disabling the direct stream option (under debug) resolved it, and there doesn't seem to be much of a performance hit; I'm just curious what's going on here.

 

Running Bookworm, Plasma DE if that's relevant.

Background: I'm learning here. Decent amount of coding and embedded hardware experience but I'm usually missing one or two key concepts with this stuff.

Getting a box running, and wrestling with NVIDIA drivers. I successfully installed the driver (I think), but now LightDM isn't working. From what I've read, it appears there's a common issue around a race condition where LightDM tries to fire up before the driver is ready, so I need to add the NVIDIA driver modules to the initramfs.

Can anyone give me some pointers? Specifically, while I get the general idea above, I'm unsure of:

  1. What modules need to be added, and whether they're named something specific for Debian vs. other distros
  2. The correct file to modify
  3. The correct format/syntax to add

I've found lots of examples, just none specific to Debian, and screwing around at this level I don't want to bork something badly enough that I need to do a bare reinstall.
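For what it's worth, this is my best guess at the Debian version, pieced together from non-Debian examples. The module names are my assumption, so please correct me if they're wrong:

```
# /etc/initramfs-tools/modules: modules to include in the initramfs, one per line
nvidia
nvidia_modeset
nvidia_uvm
nvidia_drm
```

followed by `sudo update-initramfs -u` and a reboot. I've also seen `nvidia-drm.modeset=1` suggested as a kernel parameter for this, but I don't know whether it's required here.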

Thanks for any help!

 

Can anyone point me in the right direction here? I have a pretty beefy PC I use as a server and HTPC: 24 cores at 2.5 GHz, 64 GB RAM, kind of a crappy video card, Debian 11. I just migrated all my stuff over and stress-tested it supporting 8 different transcoded streams simultaneously (a mix of local and remote). That worked great.

BUT the video playback is choppy (as in frame-skipping) and out of sync when I'm running the HTPC app. Oddly, using the web client on the same machine avoids that issue.

Any thoughts? I'm wondering if it might be the older TV it's plugged into and some issue there. Thing is, like I said, the web client is worlds better. The web client has some issues of its own, but I'm pretty sure those are just due to the TV.

Any pointers are helpful! I'm OK at this stuff but very much learning.

 

Basically the title. I remember reading about it back in like 2018; I even remember a company that would provide crypto based on the amount of traffic you let through. Just curious if that ever saw any growth.

Everything I google keeps bringing up things about the dark web. The goal of this was explicitly to go "ISP-less". Like, they envisioned mesh nets covering giant swathes of space.
