Data science/big data/GPGPU/ML as a career?

iroboto

Hey guys,
Just curious, because I know a few board members who work in this field: how do you find it as a career? I've been mulling it over, and it appears an opportunity is opening for me to switch from marketing into data.

I've completed the Coursera course on ML and have a book on CUDA, but there are a lot of moments in data science that often put me to sleep. I've done one hackathon with big data as the topic, which admittedly was pretty fun, but we were working with tools that were already able to do most of the common things you would want.

How is it from a job perspective? Do you find it more interesting and exciting than other jobs you've worked? Most of my experience is around web development.

Thanks for your responses in advance.
 
Hi, I'm a software developer but I don't work in the areas you mentioned. I did part of my bachelor's in CUDA & Brook back in the day.

My first remark would be that the field you've mentioned is... quite heterogeneous. Functional programming (ML) is not big data (though frameworks & tools may borrow some FP style), and neither has much to do with GPGPU.

I've not coded in ML, but it seems a really nice language, so much better than today's hyped alternatives such as Java, Python, C++, C#, etc. HM type inference and pattern matching should be a prerequisite for any sane software that cares about correctness. FP is super cool in general: you can reason more clearly about the code you write, and in my experience you can abstract over things more easily. And less ad-hoc jargon is involved compared to OOP. If you want jargon, maths is your jargon for FP ;)
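
To give a flavour of the pattern-matching style, here's a tiny sketch in Python rather than ML (Python has no HM type inference, so this is only a rough analogy of the idea, using the 3.10+ match statement):

```python
# A toy expression evaluator using structural pattern matching (Python 3.10+).
from dataclasses import dataclass

@dataclass
class Num:
    value: float

@dataclass
class Add:
    left: object
    right: object

@dataclass
class Mul:
    left: object
    right: object

def evaluate(expr):
    # Dispatch on the shape of the value, roughly like a match in ML or Haskell.
    match expr:
        case Num(value=v):
            return v
        case Add(left=l, right=r):
            return evaluate(l) + evaluate(r)
        case Mul(left=l, right=r):
            return evaluate(l) * evaluate(r)
        case _:
            raise ValueError(f"unknown expression: {expr!r}")

print(evaluate(Add(Num(2), Mul(Num(3), Num(4)))))  # -> 14
```

In ML or Haskell the compiler would also warn you if the match misses a constructor, which is a big part of the correctness story.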

GPGPU, on the other hand, involves a lot of hand tuning, a lot of arranging problems so that they fit the various memories and caches. Much less abstract code, and more knowledge of the underlying hardware is needed. I can see that being satisfying as well: after days and weeks of struggle you finally achieve your (un)expected speed-ups. It should be rewarding.
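
For a rough taste of that hand-arranging, here's a minimal sketch using Numba's CUDA bindings (it assumes a CUDA-capable GPU and the numba package; real tuning goes much further, into shared memory, coalescing, occupancy and so on):

```python
# Minimal GPGPU sketch: a vector add where the launch geometry is chosen by hand.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)          # global thread index
    if i < out.shape[0]:      # guard the threads that fall past the end
        out[i] = a[i] + b[i]

n = 1 << 20
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256                                    # picked by hand
blocks = (n + threads_per_block - 1) // threads_per_block  # enough blocks to cover n
vector_add[blocks, threads_per_block](a, b, out)

assert np.allclose(out, a + b)
```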

I also find big data interesting. I haven't worked in the field, but I'm considering switching jobs towards it in the future, in the hope that, among the many things I like and think I would fare well at, it's the area where I have a chance of finding an employer in the city I live in. Big data is a lot about distributed computing and about tools that handle it nicely and transparently (but are hard to mix and match). So this parallelism would be one of the few (the only?) similarities it has with GPGPU.
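
As a sketch of what the day-to-day code can look like, here's the classic word count in PySpark (the paths below are made up, and it assumes a working Spark setup):

```python
# The "hello world" of distributed data processing: count words across many files.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()

lines = spark.sparkContext.textFile("hdfs:///data/corpus/*.txt")  # hypothetical input path
counts = (lines.flatMap(lambda line: line.split())   # one record per word
               .map(lambda word: (word, 1))          # pair each word with a count of 1
               .reduceByKey(lambda a, b: a + b))     # sum the counts per word across the cluster
counts.saveAsTextFile("hdfs:///data/wordcount-output")             # hypothetical output path

spark.stop()
```

The nice part is that the same few lines run on a laptop or on a cluster; the hard part is everything around them, which is where the "hard to mix and match" bit comes in.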
 
I work with data but probably not the type you're interested in ;)
 
My first remark would be that the field you've mentioned is... quite heterogeneous. Functional programming (ML) is not big data (though frameworks & tools may borrow some FP style), and neither has much to do with GPGPU.
I am pretty sure he means Machine Learning and not ML the language ;)
 
I am pretty sure he means Machine Learning and not ML the language ;)
Lol yes :) but the rest of his post was fairly informative. I just had the introductory interview, and the job looks heavily BI-based, which is a far cry from big data or GPGPU.

But it could one day lean into machine learning or whatnot.
 
Sorry, the ML abbreviation gets automatically expanded in my brain into the language, whatever the technical topic.

Ah OK. I've done a bit of machine learning too. My GPGPU bachelor's work was in fact about trying out a machine learning algo... Now that feels so 2005-ish ;). I also have a friend who's a postdoc, and he's been doing neural networks + CUDA for the past 4-5 years.

If you have a math-friendly stomach, I feel there's nothing to dislike in this field.

Otherwise, I guess it depends on the particular job. Handwritten character recognition, face detection and the like may not sound as appealing now as they did 10 years ago. But there are newer & probably more interesting applications out there. I would have dared to say speech recognition, now that we are able to use NNs for this task (as opposed to Markov chains). But my feeling here is that the language processing side of things is still lacking (I haven't heard of any new tech or papers out there). And ML is not going to solve that, I think.

And yes, I recall there are a few members here who actively work with NNs, so their posts would potentially be more insightful for you ;)
 