Computer Science

572 readers
1 users here now

A community dedicated to computer science topics; everyone is welcome, from students, lecturers, and teachers to hobbyists!

founded 6 years ago

So, a few things to note: I'm in mechatronics engineering, a rigorous program that blends chemistry and physics, and I have major ADHD. The course load is just too tough, and I've recently realized I may have to devote more time to work to be able to support myself and my partner.

So I guess my general question is: would a computer science degree be worth it? I can focus on mathematics or I can focus on chemistry, but I don't think I can do both. And don't get me wrong: I have some programming experience, and it was quite challenging, but it was less varied; I felt like I was able to keep my head above water a little more. What kinds of focuses does a CS degree entail? How long did you spend at college doing a full-time course load? I think I can extrapolate what I'd be allocating to this each semester.

Also, what is the actual work like once you graduate: is it more freelance or structured? How hard is your day-to-day? What is the learning like after college, and how up to date do you need to stay with new technologies? I think this part would be easier for me, since I learn new technologies for fun. As glamorous as mechatronics seemed to me, I'm just not sure I can keep up with all of this at once.

Any other things I should consider?

I also asked this on Reddit, but I doubt I'll get the kind of meaningful discussion there that I would with good old Lemmy. Bonus points if you have ventured into both degrees and have some insight.

Thanks!


cross-posted from: https://lemmy.ml/post/33339350

Results show that hardware, software, and numerical variability lead to perturbations of similar magnitudes — albeit uncorrelated — suggesting that these three types of variability act as independent sources of numerical noise with similar magnitude.

We define “hardware variability" as the output differences caused by using CPU micro-architectures with or without AVX2 support, “software variability" the variability resulting from different compilation environments in Docker and Guix, mostly influenced by the FSL, gcc, libm, and OpenBlas versions, and “numerical variability" the variability resulting from random rounding.

AVX2 (Advanced Vector Extensions 2) is a CPU extension related to x86/AMD floating point & integer vector processing.
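A minimal illustration (my own, not from the article) of why the instruction set can matter at all: floating-point addition is not associative, so vectorized code paths like AVX2, which regroup partial sums, can produce bit-wise different results from scalar code:

```python
# Floating-point addition is not associative, so changing the
# grouping of operations (as scalar vs. vectorized summation does)
# can change the last bits of the result.
left = (0.1 + 0.2) + 0.3   # summed left to right
right = 0.1 + (0.2 + 0.3)  # same terms, different grouping
assert left != right        # 0.6000000000000001 vs. 0.6
```

The differences are tiny per operation, but they propagate through long pipelines, which is how micro-architecture becomes a source of output variability.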

I only skimmed the article, and I’m not sure what to make of the “random rounding” variability.

Random rounding is a type of Monte Carlo Arithmetic, a stochastic arithmetic technique to empirically evaluate numerical stability by injecting noise into floating point operations and quantifying the resulting error at a given virtual precision. While Monte-Carlo Arithmetic provides different noise injection modes, we only used random rounding (RR).

Perhaps they’re saying that the precision of floating point computations varies slightly depending on the numbers themselves?
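For intuition, here's a toy sketch of random-rounding-style noise injection. This is my own simplification, not the paper's tooling: real Monte Carlo Arithmetic instruments every individual floating-point operation (e.g. via tools like Verificarlo), while this just perturbs a value by up to half a unit in the last place at a chosen virtual precision:

```python
import math
import random

def mca_rr(x, t=24):
    # Toy model of Monte Carlo Arithmetic random rounding (RR):
    # perturb x by uniform noise within one unit in the last place
    # at virtual precision t bits. (Simplified for illustration.)
    if x == 0.0:
        return 0.0
    e = math.floor(math.log2(abs(x)))  # binary exponent of x
    ulp = 2.0 ** (e + 1 - t)           # spacing at precision t
    xi = random.random() - 0.5         # uniform in [-0.5, 0.5)
    return x + xi * ulp

# Re-running the same computation under RR noise yields a
# distribution of outputs; its spread estimates numerical stability.
samples = [mca_rr(0.1) + mca_rr(0.2) for _ in range(1000)]
```

So rather than the precision varying "depending on the numbers themselves", the rounding is deliberately randomized to probe how sensitive the pipeline is to rounding error.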

This was an interesting bit about Guix and Docker.

Docker and Guix, the two packaging solutions used in our experiments are known to mitigate software variability. From a computational bit-wise reproducibility point of view, experiments conducted in this study show that the two packaging solutions lead to similar conclusions: results are bit-wise reproducible when using the same packaged FLIRT executable on equivalent micro-architectures. We note however that, despite using the same version of the FLIRT source code, the two solutions yielded different outputs for most of the input files (but not for all). This is due to the software variability resulting from different compilation environments, mostly influenced by the gcc, libm and OpenBlas versions. The Docker image is a black box providing little or no information on how the executable was built (both on the compilation process, and on software dependency stack). In contrast, the Guix solution enforces full transparency on both compiling and runtime environments, requesting for the description and availability of all dependencies. The Guix package is thus more complex to produce than a Docker image, but, once available, variations can be easily built by modifying compiler options or by using other versions of dependent packages.


cross-posted from: https://lemmy.sdf.org/post/37414239

I've read the old papers proving that fact, but honestly it seems like some of the terminology and notation has changed since the '70s, and I really can't make heads or tails of them. The other sources I can find are in textbooks that I don't own.

Ideally, what I'm hoping for is a segment of pseudocode or some modern language that generates an n-character string from some kind of seed, which then cannot be recognised in linear time.

It's of interest to me just because, coming from other areas of math where inverting a bijective function is routine, it's highly unintuitive that in complexity theory you provably sometimes can't.
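Not the provable-hardness construction the old papers give, but the *conjectured* analogue is easy to state in code: modular exponentiation is fast in the forward direction, while inverting it is the discrete-logarithm problem. (The constants below are arbitrary illustrative choices, and this hardness is by conjecture, not by proof.)

```python
p = 2**127 - 1  # a Mersenne prime, chosen here purely for illustration
g = 3           # base, likewise an arbitrary illustrative choice

def f(seed: int) -> int:
    # Forward direction is fast: O(log seed) modular multiplications.
    return pow(g, seed, p)

y = f(123456789)
# Recovering the seed from y is the discrete logarithm problem:
# no polynomial-time algorithm is known, but none is ruled out,
# so this is conjectured one-wayness, not the provable kind
# obtained from hierarchy theorems.
```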


cross-posted from: https://lemmygrad.ml/post/4649344

kewl


From school to school, it could be in either the engineering college or the science college. Some say it's a math discipline.

Is having it in a different college going to affect how you're taught? I'm going to Cal Poly Pomona, and comp sci there is located in the college of science, but at Cal State Long Beach it's in the college of engineering. Does this matter, and how?

submitted 2 years ago* (last edited 2 years ago) by Benjamin@jlai.lu to c/compsci@lemmy.ml

Hello, I'm trying to understand what has slowed down the progress of CPUs. Is it a strategic/political choice, or a real technical limitation?


I was curious about the difference between these two. I found a Reddit post about it, and I thought I'd summarize it here so that the content is replicated on Lemmy.

Please be aware, this is not from a perspective of a computer scientist. I am very interested in computer science, but have a lot of knowledge gaps.

The premise of the question: both of these seem to be methods of running two or more tasks or sub-tasks without necessarily using multiple cores. The concepts seemed similar to me, but I will explain the difference below.

How does the OS scheduler's multitasking work (besides multi-threading)?

Basically, when the operating system is running two or more tasks, it may be running more tasks than cores available. To deal with this, it switches between these tasks very quickly - so fast it seems they're running concurrently.
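The time-slicing idea can be sketched with a toy round-robin scheduler. (This is a simplification I'm adding for illustration: a real OS preempts tasks via timer interrupts, whereas these tasks cooperatively yield after each unit of work.)

```python
from collections import deque

def task(name, steps):
    # A cooperative task: yields control back to the scheduler
    # after each unit of work (standing in for a timer interrupt).
    for i in range(steps):
        yield f"{name}:{i}"

def round_robin(tasks):
    # Minimal round-robin scheduler: give each task one time slice,
    # rotate, and repeat until all tasks finish. This mimics how an
    # OS interleaves more tasks than cores so quickly that they
    # appear to run concurrently.
    queue = deque(tasks)
    trace = []
    while queue:
        t = queue.popleft()
        try:
            trace.append(next(t))  # run one time slice
            queue.append(t)        # not finished: back of the queue
        except StopIteration:
            pass                   # task finished: drop it
    return trace

trace = round_robin([task("A", 2), task("B", 2)])
# Interleaved execution: ['A:0', 'B:0', 'A:1', 'B:1']
```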

Hyper threading

A hyper-threaded CPU looks a bit different. To the user, it is presented almost like an extra core. Think about a multi-core CPU: every core is almost its own CPU. With hyper-threading, some of those components are replicated, but not enough to make a full extra core. (Remember, a CPU has many components, such as ALUs, floating-point units, load/store units, branch units, etc.) The CPU can then arrange work so that two tasks use different parts of the same core and hence run concurrently on it. This doesn't happen all that often, which is why a separate core is faster than hyper-threading, but it still speeds things up more than just switching between tasks. The operating system scheduler can enhance this by ordering tasks so they are likely to run concurrently on a hyper-threaded CPU; in other words, by scheduling together tasks that will likely use different parts of the CPU and not conflict.
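One rough way to see this "almost an extra core" presentation on a Linux box is to compare the logical CPU count (what the scheduler sees, including hyper-threads) with the physical core count. The helper below is my own sketch: it parses `/proc/cpuinfo`, so it is Linux/x86-specific and returns `None` elsewhere.

```python
import os

def physical_cores():
    # Count unique (physical id, core id) pairs in /proc/cpuinfo.
    # Each pair is one physical core; hyper-threads share a pair.
    cores = set()
    phys_id = core_id = None
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("physical id"):
                    phys_id = line.split(":")[1].strip()
                elif line.startswith("core id"):
                    core_id = line.split(":")[1].strip()
                    cores.add((phys_id, core_id))
    except OSError:
        return None  # no /proc: not Linux
    return len(cores) or None  # None if the fields are absent

logical = os.cpu_count()
physical = physical_cores()
# With hyper-threading enabled, logical is typically 2x physical.
```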

I hope this is a good summary, and I hope computer scientists can correct me if I'm wrong.


Simulation of the Apollo Guidance Computer (AGC)

The Recursive Universe (www.amandaghassaei.com)
submitted 5 years ago by yogthos@lemmy.ml to c/compsci@lemmy.ml

The Gödel Prize awards outstanding papers in the area of theoretical computer science. All of the papers are linked.

Every computer scientist and student should at least have a look at these great papers. They cover multiple CS fields, and there is probably some interesting algorithm or proof you haven't heard of yet.

Have a look and enjoy the rabbit hole!


Who among you has never had the idea of translating a programming language into your mother tongue?

I just discovered this list of non-English-based programming languages. Have fun experiencing a new way of programming without that nasty language barrier!

https://en.wikipedia.org/wiki/Non-English-based_programming_languages

Yours rr