Learning programming languages *split*

Discussion in 'General Discussion' started by tuna, Nov 25, 2016.

  1. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,039
    Likes Received:
    299
That is not an object-oriented approach; that is just code separation for readability and/or removing code duplication. But this is OT.
     
    Cyan likes this.
  2. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,129
    Likes Received:
    1,942
Currently I am learning Java; that's what they teach me. Most of the typical programs I am making are things like Floyd's triangle, Pascal's triangle, plus other shapes, arrays, maybe a calendar, etc. But for the most part everything is in the main method, so for now it looks like a structured language, like C.

    Java is a great programming language to learn.



    My real passion is Xamarin Studio combined with C#. I find it so elegant.



    I am learning both at the same time; they are quite similar and C-based. Another super interesting language to me is Haskell, but it might take time to learn all three.
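
    For reference, Floyd's triangle is just the natural numbers written out in rows of growing length. A minimal sketch with everything in Main, as described (shown in C# here; the Java version is nearly identical):

```csharp
// Floyd's triangle, everything in Main (sketch; Java would look almost the same).
using System;

class FloydsTriangle
{
    static void Main()
    {
        int rows = 5;
        int value = 1;
        for (int row = 1; row <= rows; row++)
        {
            for (int col = 0; col < row; col++)
                Console.Write(value++ + " ");
            Console.WriteLine();
        }
        // Output:
        // 1
        // 2 3
        // 4 5 6
        // 7 8 9 10
        // 11 12 13 14 15
    }
}
```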
     
    #2 Cyan, Dec 5, 2016
    Last edited: Dec 5, 2016
  3. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,129
    Likes Received:
    1,942
    Of course you're right that if you want maximum performance on an AAA game or maybe an emulator of a modern machine, C++ is going to be the language.

    But Java is perfectly useful for most non-performance-intensive games, and so is C#. For instance, Minecraft is written in Java, and it was in turn inspired by Infiniminer, which was written in C#.

    https://www.rockpapershotgun.com/2011/01/20/proto-minecraft-abandoned-due-to-epic-error/

    AFAIK, you can use the unsafe keyword to go low level in C# and use pointers, but I don't know exactly how it works TBH.
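
    Roughly, it looks like this (a minimal sketch, not production code; the project has to allow unsafe code via the compiler's unsafe switch):

```csharp
// Sketch: summing an array through a raw pointer inside an unsafe method.
using System;

class UnsafeSum
{
    static unsafe int Sum(int[] values)
    {
        int total = 0;
        // "fixed" pins the array so the GC cannot move it while we hold a pointer to it.
        fixed (int* p = values)
        {
            for (int i = 0; i < values.Length; i++)
                total += *(p + i);   // pointer arithmetic, just like in C
        }
        return total;
    }

    static void Main()
    {
        Console.WriteLine(Sum(new[] { 1, 2, 3, 4 }));  // prints 10
    }
}
```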

    This game was made using C# and represents four years of work; you can see how the graphics look, but you know...

     
  4. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    Functional/structured programming is wayyy better; object-oriented (C++/C#/Java style) is pretty much everything you shouldn't do if you want high performance: working on one item at a time instead of a bunch of them, having a lot of dead data in your cache when you want none...
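
    To illustrate the data-layout point, a rough sketch (my own example with made-up Enemy fields): updating positions in the first version drags every other field through the cache, while the second version touches only the data the loop actually needs.

```csharp
// Object-per-entity layout: updating positions also pulls health, names, etc.
// through the cache, even though the loop never reads them.
class Enemy
{
    public float X, Y;
    public float Health;
    public string Name = "";
}

// Data-oriented layout: positions live in tight parallel arrays, so a position
// update streams through exactly the bytes it needs and nothing else.
class EnemyPositions
{
    public float[] X;
    public float[] Y;

    public EnemyPositions(int count)
    {
        X = new float[count];
        Y = new float[count];
    }

    public void MoveRight(float dx)
    {
        for (int i = 0; i < X.Length; i++)
            X[i] += dx;
    }
}
```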
    Programming is difficult because you need to know the hardware to make the best use of it, and the ins and outs of your programming language so it doesn't get in the way; also, big O notation is less and less relevant because memory accesses are the bottleneck...
    There's so much to say about programming and languages and algorithms and how primitive all of that still is, not even talking about keyboards, which are nothing more than glorified typewriters meant to slow you down!

    Computer science is mostly wrong and backward with occasional bright things.
    [Time too limited to expand atm, sorry.]
     
    Billy Idol, milk, Cyan and 3 others like this.
  5. rcf

    rcf
    Regular Newcomer

    Joined:
    Nov 6, 2013
    Messages:
    323
    Likes Received:
    247
    And we still program computers using 7-bit ASCII glyphs that already existed on old typewriters. Since we now have Unicode, we need keyboards with e-Ink/OLED keys that can be remapped to show any Unicode glyph, and then standardize some keyboard layouts for programming and create some kind of "programming notation" (as was done for math notation).
    It's sad to see languages like APL almost completely forgotten. It has its share of problems, but it is extremely powerful and its creator had a point when he talked about notation as a tool of thought.
     
  6. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    6,761
    Likes Received:
    4,975
    introduction to comp sci is pretty... accessible, but the good stuff happens when you go low level. Compilers, OS, and gate level programming separate the 'just need a job' from 'I like this stuff'
     
  7. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,284
    Location:
    Helsinki, Finland
    Big O is relevant. However, the N in O(N) isn't the number of comparisons or arithmetic instructions, it is the number of cache misses. A cache miss is ~200 cycles, while an ALU op (multiply, add, etc.) is just one.
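
    As a rough illustration (a sketch, not a benchmark): both loops below are O(N), but the array walks contiguous memory while the linked list chases pointers scattered across the heap, so their cache-miss counts, and therefore their real costs, are very different.

```csharp
using System;
using System.Collections.Generic;

class CacheMissDemo
{
    static long SumArray(int[] data)
    {
        long sum = 0;
        foreach (int v in data) sum += v;       // sequential, prefetch-friendly access
        return sum;
    }

    static long SumLinkedList(LinkedList<int> data)
    {
        long sum = 0;
        foreach (int v in data) sum += v;       // pointer chasing, potentially one miss per node
        return sum;
    }

    static void Main()
    {
        var array = new int[1000000];
        var list = new LinkedList<int>();
        for (int i = 0; i < array.Length; i++) { array[i] = i; list.AddLast(i); }
        Console.WriteLine(SumArray(array) == SumLinkedList(list));  // True, but far from equally fast
    }
}
```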

    Branches are worth noting. Modern CPUs are pretty wide and have deep OoO. The CPU must guess which way each branch goes. If the guess is wrong, all the speculative work must be discarded and executed again. Avoid branches that are random. A branch that changes infrequently (like null checks and validation) is, however, fast. Sorting/separating data by branch is a good way to avoid branch mispredictions and also a good way to separate data/code. It leads to better cache line utilization and better code reuse. This is the basis for data-oriented entity/component data models.
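
    A sketch of the separation idea (my own example with a made-up Particle struct): instead of testing a per-item flag inside the hot loop, partition the data once and run branch-free loops over each group.

```csharp
using System.Collections.Generic;

struct Particle
{
    public float X, VelocityX;
    public bool Frozen;
}

static class ParticleUpdate
{
    // Data-dependent branch in the hot loop: the predictor must guess Frozen per item.
    public static void UpdateBranchy(Particle[] particles, float dt)
    {
        for (int i = 0; i < particles.Length; i++)
            if (!particles[i].Frozen)
                particles[i].X += particles[i].VelocityX * dt;
    }

    // Partition once (e.g. whenever the flag changes), keep the active set separate.
    public static Particle[] SelectActive(Particle[] particles)
    {
        var active = new List<Particle>();
        foreach (var p in particles)
            if (!p.Frozen)
                active.Add(p);
        return active.ToArray();
    }

    // Straight-line math over the active set: no per-item branch to mispredict.
    public static void UpdateActive(Particle[] active, float dt)
    {
        for (int i = 0; i < active.Length; i++)
            active[i].X += active[i].VelocityX * dt;
    }
}
```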
     
    Alexko, Cyan, Rodéric and 1 other person like this.
  8. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    Indeed. As I said, I typed that in a hurry; it's a topic I like too much to pass on replying altogether, but I'm under severe time constraints, so I'm not nearly precise or deep enough :(

    When it comes to branching I prefer no branching as much as possible [you can do that for maths code with masking and such]. There are branch prediction tables, and if you use branches sparingly without unnecessary clutter it should run faster [fewer mispredictions]. (nullptr checking, for example, should only be in debug builds IMHO. Depending on your programming language you may not have the problem at all [non-nil pointers in Swift] or have alternatives [references in C++].)
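
    For the masking point, a tiny sketch of what branch-free maths code can look like (my example): clamping negative integers to zero with a mask instead of an if.

```csharp
static class Branchless
{
    public static int ClampNegativeToZero(int v)
    {
        // Arithmetic shift: (v >> 31) is all ones when v is negative, all zeros otherwise.
        int keepMask = ~(v >> 31);
        return v & keepMask;        // same result as (v < 0 ? 0 : v), with no branch
    }
}
```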
    Analysing your data and how you use it (which algorithms you run on it) is the only correct way to decide on your data structures (what you pack together because it's used together)
     
    #8 Rodéric, Dec 6, 2016
    Last edited: Dec 6, 2016
    sebbbi likes this.
  9. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    I'll still hammer on the point that our input devices, especially the keyboard, in its very layout, are just completely wrong; it's a copy of a typewriter, which by design is meant to reduce typing speed, and therefore efficiency :(

    When it comes to programming languages, we are still mostly telling the machine what it must do [imperative programming] rather than what we are trying to achieve, although some languages are more about the latter. (Haskell? Swift?)
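
    A small C# illustration of the difference (my example): the first version spells out how to compute the result, the second states what we want and leaves the how to the library.

```csharp
using System.Linq;

static class Totals
{
    // Imperative: spell out the loop, the test and the accumulator.
    public static int SumOfEvensImperative(int[] numbers)
    {
        int total = 0;
        foreach (int n in numbers)
            if (n % 2 == 0)
                total += n;
        return total;
    }

    // Declarative: state the result we want and let the library decide how.
    public static int SumOfEvensDeclarative(int[] numbers)
        => numbers.Where(n => n % 2 == 0).Sum();
}
```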
    I'm also unsure that using ASCII or even Unicode is a good solution either: first, you cannot notice connections/spaghetti code easily, whereas some kind of visual programming tool could make that apparent. Also, I think pretty much every culture has the same saying: "Un petit dessin vaut mieux qu'un long discours" in French ("a little drawing is worth more than a long speech") and "a picture is worth a thousand words" in English, which should clearly be a hint that writing in a solely textual programming language might not be the best way to convey the information we are trying to pass on.

    I think there is still progress to be made, at least in the primitive textual form, to make things more readable for programmers (because we must all acknowledge that our work will mostly be read by fellow humans, so we must optimise for them), without losing performance for the machine.

    (Also note that the compiler's optimisations are not magic and often rely on things they shouldn't, such as undefined behaviour, and you shouldn't count on your compiler to make your code fast, but rather carefully analyse your data set & algorithms.)
     
    sebbbi likes this.
  10. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,284
    Location:
    Helsinki, Finland
    Agreed. Most code should take parameters as references. Most low-level code can't properly deal with the null case, and it's not even that code's responsibility, so passing a pointer to low-level code is not the correct thing to do. I only use pointers when null is a valid value (this is a very small percentage of the code base). Error checking (including unresolved/missing links to assets -> null pointers) should be done at a higher level. You then pass data as a reference: if (data == nullptr) handleNullCase(); else processData(*data). Data-processing code filled with null checks is less readable. Also, as you said, even predictable branches are not free: they add instruction cache bloat, and too many branches close to each other result in worse branch prediction behavior (as there's a fixed amount of predictor storage per cache line).
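
    A rough C# analogue of the same pattern (a sketch with made-up Mesh/Renderer names): null is handled once at the boundary, and everything below it assumes valid data.

```csharp
using System;

class Mesh
{
    public float[] Vertices = new float[0];
}

static class Renderer
{
    // Boundary code: the only place that knows what a missing mesh means.
    public static void Submit(Mesh mesh)
    {
        if (mesh == null)
        {
            Console.Error.WriteLine("missing mesh asset, skipping draw");
            return;
        }
        Process(mesh);              // from here on, null is impossible
    }

    // Low-level code: no null checks, that is not its responsibility.
    static void Process(Mesh mesh)
    {
        for (int i = 0; i < mesh.Vertices.Length; i++)
        {
            // ... transform vertex i ...
        }
    }
}
```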

    My preference is to assert on code bugs and handle data issues separately (null pointer checks, range checks, etc). Data (asset) issues should never terminate execution, as most engines share a code base between the game and the tools. A tool build crashing (asserting) because a user entered wrong data in a field is unacceptable. Data issues should give good error messages and recover. If corrupt data nevertheless gets loaded (because it is not correctly validated), that is a code bug -> asserts will catch that code bug -> someone will fix the actual bug, which is the missing data validation (+ potentially add a new error message to the tools).
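
    A sketch of that split (my example, with made-up asset types): bad data logs an error and recovers with a placeholder, while a violated internal invariant trips an assert in development builds.

```csharp
using System;
using System.Diagnostics;

class Texture
{
    public int Width, Height;
    public static readonly Texture Placeholder = new Texture { Width = 1, Height = 1 };
}

static class AssetLoader
{
    public static Texture LoadTexture(string path, byte[] bytes)
    {
        if (bytes == null || bytes.Length == 0)
        {
            // Data issue: report it and recover; never crash the tools build.
            Console.Error.WriteLine("Texture '" + path + "' is missing or empty; using placeholder.");
            return Texture.Placeholder;
        }

        Texture tex = Decode(bytes);
        // Code bug if the decoder ever produces an invalid size: assert so it gets fixed.
        Debug.Assert(tex.Width > 0 && tex.Height > 0, "Decode produced an invalid texture");
        return tex;
    }

    // Stand-in decoder, just for the sketch.
    static Texture Decode(byte[] bytes) => new Texture { Width = 1, Height = bytes.Length };
}
```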
     
    BRiT, tuna and Rodéric like this.
  11. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    I think I'll just write a quick summary of the languages I learnt:
    Basic (on Amstrad CPC6128), using line numbers to call subroutines
    Pascal, C++ & Object Pascal, during my computer science degree; pointers were tough back then ;)
    C, because C++ is based on it and it was just difficult to learn w/o its basis
    Java, as a server back-end dev during the 1st internet bubble
    D, D2, because I was (and still am) dissatisfied with C++
    Lisp, mostly primitive stuff, but it looked fun to learn.
    Clay, which is a kind of C with generics, really nice, changed the way I programmed since
    Chapel, mostly "Hello World !" stuff, I really like the approach the language took on concurrency, it's well done and could become the next big language with good runtime/tools
    Swift, just started, more than "Hello World!" but no small engine with it yet; it's still in its infancy but already nice

    I wrote a 2D rendering engine in Pascal, 6 or 7 engines in C++, 1 in D2 (barely finished), 1 in Clay (unfinished though).
    I find it really interesting to rewrite code in a new language with new paradigms; it usually changes the way I code from then on. I use generics a lot more and dislike object-oriented a lot more (because of the way C++ does it, encapsulating functions in a class's namespace, which prevents me from writing a generic version that is called like a regular member function. In D2 there's Universal Function Call Syntax, which means foo( bar* b ) can be called either as b.foo() or foo( b ), so there I can.)
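
    For what it's worth, C# extension methods give a similar effect to UFCS: a free function that can still be called with member syntax on its first argument (a small sketch with made-up names):

```csharp
class Bar
{
    public int Value;
}

static class BarExtensions
{
    // A free function that takes Bar as its first ("this") argument.
    public static int Foo(this Bar b) => b.Value * 2;
}

class Demo
{
    static int Use(Bar b)
    {
        int viaMemberSyntax = b.Foo();              // reads like a member call
        int viaStaticCall = BarExtensions.Foo(b);   // same function, ordinary call
        return viaMemberSyntax + viaStaticCall;
    }
}
```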
    Swift protocols are really interesting; they seem to be (I haven't dived into the implementation yet) about what C++ concepts (still unavailable) are, which are pretty much requirements on template arguments, to the point that Apple started referring to Swift as a protocol-oriented systems programming language, AFAIR. ^^
     
    Cyan likes this.
  12. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    Maybe I should also say something about programming styles.
    Since I started with Basic, I used imperative structured programming, but only with global variables, since that's how the language works. (Globals require special care in heavily concurrent engines, so you use them as little as possible, or with extra care; and rarely in practice, because extra care is tedious and error prone. Basically only things like an AssetManager or a LogFile...)
    Pascal is conceptually similar but has local variables.
    C++ & Object Pascal exposed me to a virus called object-oriented programming (which is such a bad idea it could only have originated in California, according to Dijkstra).
    Java nailed the idea into my brain (unfortunately), although Java is schizophrenic, having both an int primitive and an Integer class.
    D & D2, being spiritual successors to C++, continued with the idea of objects, although D2 went multi-paradigm and a bit more functional (Haskell is a functional programming language; to me that means more math-like), because Alexandrescu came on board [and he's very C++-biased even if he doesn't realise it].
    Lisp was for fun, but everything being a list was definitely interesting, and it taught me how to think completely differently.
    Clay was going back to a kind of C with a dead-easy syntax and generics; I rewrote all the supporting code of my existing engine in about half the number of lines, because there was a lot of code to reuse through templates that was not possible when functions were locked into their own namespace (a class defines its own namespace) and carried a hidden pointer (the "this" pointer). I also had something like concepts already, and that was before C++11 AFAIR.
    I found Chapel while looking for something better at concurrency that could run on heterogeneous architectures (i.e. CPU & GPU); that one is made by Cray Inc., which handles supercomputers ^^ A very well designed language in the making. I really like it and follow every release because I find it to have huge potential.
    Swift is the best alternative to C++ IMO, created by one of the makers of LLVM; the language is simple, straightforward, powerful, clean, and getting fast (already as fast as or faster than C++ in some cases, sometimes a lot slower, but it's very young!). And it's protocol-oriented more than object-oriented, which seems similar to the way I ended up using templates in C++ thanks to Clay, so it might be really nice ^^

    For people learning programming today, I think I'd push them toward Pascal, because it's really well thought out; for better low-level knowledge maybe some C programming, but for more modern stuff I'd point directly at Swift.
    Of course that also depends on whether it's the start of a professional career and in which field: for the web, it seems PHP rules [horrible thing, that language]; for games C++ rules [unfortunately the language is massive, bloated, badly designed and getting more complex with each release, but a huge amount of existing code + plenty of rather good tools = hard to avoid :(]; and for Apple... well, you're the luckiest then, you can use Swift!
     
  13. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,459
    Likes Received:
    1,938
    How do you suggest we enter text, telepathy ?
     
  14. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    What about something like this?
    https://shop.keyboard.io/

    The problem with current keyboards is twofold: first, the keys aren't laid out in a row/column grid, because that was (mechanically) impossible on typewriters, but on electronic devices this is a non-issue and a grid is far more natural; second, the letter placement was meant to minimize typing accidents/interlocks and nothing else (not balancing the writing across both hands, not statistical analysis or comfort or anything, just the way the machines were built).
    So both the physical and logical layout of current computer keyboards are plain wrong...

    (Small illustration for what I call typing accident : upload_2016-12-6_16-45-55.jpeg )
     
    #14 Rodéric, Dec 6, 2016
    Last edited: Dec 6, 2016
  15. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,954
    Likes Received:
    811
    Location:
    Planet Earth.
    After a night's rest, I think Pascal is a good fit, but Swift might also be; it's more modern but unfortunately doesn't run on Windows yet (I think there's a port in progress), though it runs on macOS, iOS & Linux, which isn't negligible.
    I would also advise learning the basics of the hardware, nothing difficult: what a CPU is, what memory is, how they work together, with some rough figures for sizes, speeds and bandwidths to get a sense of scale; then the programming language, then some algorithms and more details about the hardware.

    I wonder how other programmers would advise to proceed, I learnt some things before studying computer science, some during, and an incredible lot after...
    (Including reading the not-so-digestible Intel architecture manuals/PDFs.)


    When it comes to algorithms books, I recommend either "The Algorithm Design Manual" or "Introduction to Algorithms"; you can probably read a number of chapters online to see which one suits you best. They cover the same topics, so one should be enough.
    There are plenty of good books when it comes to 3D graphics, some more detailed than others; reading them in the right order might make things easier.
     
  16. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    17,599
    Likes Received:
    1,148
    Location:
    Maastricht, The Netherlands
    I have a long career of using all sorts of programming languages, and even wrote some training material. But today, I would go about teaching programming completely differently.

    I can't go into it too much now, but basically I would teach two different programming styles: agent-driven and data-driven. Agent-driven would be set up in a very object-oriented way, and would focus on seeing programming as creating agents that do tasks for you. I would bring in (unit) testing very early too, as a way of thinking about what you want to program before you start, and of having quick feedback on whether (and eventually proof that) you succeeded.
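
    As a small sketch of the early-testing idea (assuming an xUnit-style framework here; NUnit or MSTest would look much the same): write the expectation first and get a red/green answer every time the code changes.

```csharp
using Xunit;

public class Calculator
{
    public int Add(int a, int b) => a + b;
}

public class CalculatorTests
{
    [Fact]
    public void Add_ReturnsSumOfItsArguments()
    {
        var calc = new Calculator();
        Assert.Equal(7, calc.Add(3, 4));
    }
}
```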

    The other approach would be to see any programming assignment as a factory line that has resources (data) coming in, and resources (data) coming out, and you are going to try to create as efficient a line as possible.

    Then I would try to bring in DDD and MicroService concepts for high-level organization and tie everything together.

    While the actual language isn't that important, I would be inclined to go for C# for the agent-driven approach and F# for the data-driven approach. I did try Swift on the Mac briefly, and I like that too, especially its sandbox-like nature there (creating visual effects while you type code). But I don't know if I'd want to give up the wealth of unit testing frameworks for C#, which I think can really help with learning to understand code and with keeping it from becoming a giant ball of mud.
     
    Orion likes this.
  17. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,129
    Likes Received:
    1,942
    Wow, that's an impressive number of languages. I wonder how you cope with that. Here you have the code for the last-letter first-letter "game", better called the chain-of-words game (which I had to create in an exam), in a ton of languages. I checked and compared the syntax of the different languages you used, and they are so different!

    https://rosettacode.org/wiki/Last_letter-first_letter

    It's also my intention to learn several languages. My main language is Java, I have extra passion for C# and C++, and I'd like to learn a language or two that almost nobody knows; I have to say Haskell seems to fit that bill, plus it's simple and structured. As for Swift... I only just heard of it, and you seem to like it, so it might be good. For me, until it becomes more platform-agnostic it's not much of an option...
     
  18. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,129
    Likes Received:
    1,942
    A friend of mine who is a professional programmer has a similar keyboard. Where I have a hard time when programming is typing brackets and such -- it also depends on the layout your keyboard is set to, but still.

    I learnt typewriting, I studied it for 4 years, and I could manage a decent number of words per minute -though the women in my class were the best at that, maybe women have nimbler fingers-.

    But when I am programming that speed is gone! Traditional typewriting and programming aren't the best of friends, I think.
     
  19. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,129
    Likes Received:
    1,942
    What do big O and O(N) stand for?

    As I said, Java is my main language (even read Deitel's 9th edition book on it), although I enjoy C# quite a bit more. I tried several Java IDEs, and NetBeans is probably the best, along with Eclipse. Also tried BlueJ and Dr. Java, which are simpler, especially Dr. Java.

    With C# in MonoDevelop (Xamarin Studio), everything works like a charm for me. Visual Studio is very nice, as you might know, but Xamarin Studio is multiplatform, plus it's so polished and easy to use... I am interested in F# because it can be used for GPU programming, and I would like to create a game some day.
     
  20. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,129
    Likes Received:
    1,942
    In regard to learning/teaching, I am currently reading a 2016 book I downloaded in PDF format (although I am going to buy the actual book when I can import it; you can easily see the great effort the author put into it) called Essential C# 6.0, which I downloaded here (the link has a collection of good books for learning programming and game programming):

    https://onedrive.live.com/?authkey=!ACHCoDPWYJdjcRY&id=3974B306D708001E!4064&cid=3974B306D708001E

    As a newbie, structured programming looks easier to me, but I think OOP might be better in the long run, especially if you can get high performance at the same time, like in C++ (C# is not as fast, but the unsafe keyword might be useful if you want to go lower level and use pointers). This is what the book says about it:

    I also like the Foreword of the book. I am not a very critical person (I get easily hyped, and many times the fall is far greater than my expectations; it has happened to me many times and I never learn :/), so I admire the people who are, and part of the book is dedicated to one of those people:


    On a different note, I would like to effectively use LINQ and Lambdas.
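
    A small sketch of what that looks like (my example): a filter, a sort key and a transform, each expressed as a lambda in a single LINQ pipeline.

```csharp
using System;
using System.Linq;

class LinqDemo
{
    static void Main()
    {
        string[] languages = { "Java", "C#", "C++", "Haskell", "Swift" };

        var shortNames = languages
            .Where(name => name.Length <= 4)          // lambda as a filter
            .OrderBy(name => name.Length)             // lambda as a sort key
            .Select(name => name.ToUpperInvariant()); // lambda as a transform

        Console.WriteLine(string.Join(", ", shortNames));  // C#, C++, JAVA
    }
}
```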
     