

You'd think Python would shine with data that's slightly large but still fits in memory, but SAS was actually winning there. I was working on a proof of concept to show my team that Python was much better and could replace SAS, and I was struggling. With small data Python was much faster, but who cares if you can do your work in 50ms instead of 200ms? Once the data got bigger, many of the tasks I was trying to replicate in Python were so much slower that SAS finished the whole task before Python could even read the dataset into memory. I couldn't believe it, so I'm turning the comparison into a project I can publish to a public GitHub repo. For all the shit SAS gets, and despite being designed in the 70s, it's still insanely fast at certain statistical tasks on medium-to-large data. I used to work at a job where I used R for 90% of my tasks, and now I'm at one where I use Python for 90% of them.
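To make the "read time dominates" point concrete, here's a hypothetical sketch of the kind of benchmark I mean (the column names, row count, and aggregation are made up for illustration, not taken from my actual project): time the CSV parse separately from the actual analysis step, and watch the parse dwarf the work as rows grow.

```python
import io
import time

import numpy as np
import pandas as pd

# Hypothetical dataset: 500k rows of a group key and a numeric value.
# (Scale n_rows up to see the read step dominate even more.)
n_rows = 500_000
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": rng.integers(0, 100, n_rows),
    "value": rng.random(n_rows),
})

# Round-trip through CSV text, as if loading a flat file from disk.
buf = io.StringIO()
df.to_csv(buf, index=False)
buf.seek(0)

t0 = time.perf_counter()
loaded = pd.read_csv(buf)          # just getting the data into memory
t_read = time.perf_counter() - t0

t0 = time.perf_counter()
summary = loaded.groupby("group")["value"].mean()  # the actual "task"
t_task = time.perf_counter() - t0

print(f"read: {t_read:.3f}s  task: {t_task:.3f}s")
```

In my experience the parse step, not the statistics, is usually where pandas loses to SAS's streaming reads on bigger-than-toy flat files.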

I have a comp sci degree, have experience coding in many different languages, and get frustrated to no end every time I need to use SAS. So don't get me wrong, I hate SAS in general.
