

  1. > If that's the case [...], shouldn't retaining that skill be the priority for anyone who has already acquired it?

    Indeed I believe that, but in my experience these skills become more and more useless in the job market. In other words: retaining such skills (e.g. low-level coding) is an intensively practised hobby of such people that is (currently) of "no use" in the job market.

  2. > How do you use Google, translator, (even dictionaries!), etc without "degenerating" your own abilities?

    By writing down every foreign word/phrase that I don't know, and adding a card for it to my cramming card box.
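    For illustration, a minimal Leitner-style sketch of how such a cramming card box works (the class and scheduling details are my own simplification, not an exact description of my physical box):

    ```python
    from collections import defaultdict

    class CardBox:
        """Leitner system: cards advance a box on success, drop back on failure.

        Higher-numbered boxes are reviewed less and less frequently, so known
        words fade out of the daily rotation while hard ones keep coming back.
        """

        def __init__(self, levels=5):
            self.levels = levels
            self.boxes = defaultdict(list)  # box number -> list of (front, back)

        def add(self, front, back):
            self.boxes[1].append((front, back))  # new words start in box 1

        def review(self, box, knew_it):
            front, back = self.boxes[box].pop(0)
            target = min(box + 1, self.levels) if knew_it else 1
            self.boxes[target].append((front, back))

    box = CardBox()
    box.add("ubiquitous", "present or found everywhere")
    box.review(1, knew_it=True)   # the card advances to box 2
    ```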

  3. > If you hate reading other people's code, then you'll hate reading llm generated code, then all you'll ever be with ai at best is yet another vibe coder who produces piles of code they never intend to read, so you should have found another career even before llms were a thing.

    I do like to read other people's code if it is of an exceptionally high standard. But otherwise I am very vocal in criticizing it.

  4. This fits my experience: programmers who are very vocal in their hatred of using AI for programming work have, in my opinion, traits that make them great programmers (but I have to admit that such people often do not score very high on the Agreeableness personality trait :-) ).
  5. > Many have the attitude of finding one edge case that it doesn’t work well and dismiss AI as useful tool

    Many programmers work on problems (nearly) *all day* where AI does not work well.

    > AI gives you an approximated answer, it depends on you how to steer it to a good enough answer

    Many programmers work on problems where correctness is of essential importance, i.e. if a code block is only "semi-right", it is of no use - and even having to deal with code blocks where you cannot trust that the respective programmer thought deeply about such questions is a huge time sink.

    > Some people are just not good at constantly learning things

    Rather: some people are just not good at constantly looking beyond their programming bubble to areas where AI might have some use.

  6. > Module 8 is worded in a way that surprised me: “if we could decide the languages in NP efficiently, this would lead to amazing applications.”

    > My understanding of P=?NP is that all intuition points to the answer being “no”. The openness of the question doesn’t give us hope that a magical algorithm might one day arrive. It just means that despite a lot of suspicion that NP cannot be solved in polynomial time, we cannot yet prove this to be the case.

    I admit that I am slightly biased towards believing that P=NP does indeed hold. I know that this is a non-mainstream view, even though I have given arguments to actual experts in the field that (surprisingly) convinced them that it is much less "clear" than they had always assumed that P ≠ NP "must" hold.

    On the other hand, I consider it plausible that the fastest algorithm that exists for a "natural" NP-complete problem (e.g. 3-SAT) could have a runtime of Θ(n^(2^(2^(2^1000000)))).

    It is also possible that a proof of P=NP might be highly non-constructive, and obtaining an actual algorithm from this non-constructive proof might be even harder than proving P=NP. Such a situation actually occurs with the Robertson–Seymour theorem ("every family of graphs that is closed under taking minors can be defined by a finite set of forbidden minors"):

    > https://en.wikipedia.org/w/index.php?title=Robertson%E2%80%9...

    Unluckily, as the same article explains:

    "The Robertson–Seymour theorem has an important consequence in computational complexity, due to the proof by Robertson and Seymour that, for each fixed graph H, there is a polynomial time algorithm for testing whether a graph has H as a minor. [...] As a result, for every minor-closed family F, there is polynomial time algorithm for testing whether a graph belongs to F: check whether the given graph contains H for each forbidden minor H in the obstruction set of F.

    However, this method requires a specific finite obstruction set to work, and the theorem does not provide one. The theorem proves that such a finite obstruction set exists, and therefore the problem is polynomial because of the above algorithm. However, the algorithm can be used in practice only if such a finite obstruction set is provided. As a result, the theorem proves that the problem can be solved in polynomial time, but does not provide a concrete polynomial-time algorithm for solving it."

    So, it is in my opinion a very "religious" belief that even if we could prove P=NP, a practically relevant algorithm would arise from this proof.
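    To make the obstruction-set meta-algorithm above concrete, here is a brute-force sketch (my own illustration with made-up function names; the actual Robertson–Seymour minor test runs in polynomial time and is vastly more sophisticated):

    ```python
    from itertools import product

    def is_minor(h_nodes, h_edges, g_nodes, g_edges):
        """Brute-force test whether H is a minor of G (only viable for tiny graphs).

        H is a minor of G iff G's vertices can be partially assigned to H's
        vertices such that each "branch set" is non-empty and connected, and
        every H-edge is realized by a G-edge between the two branch sets.
        """
        adj = {v: set() for v in g_nodes}
        for x, y in g_edges:
            adj[x].add(y)
            adj[y].add(x)

        def connected(vs):
            vs = set(vs)
            if not vs:
                return False
            seen, stack = {min(vs)}, [min(vs)]
            while stack:
                for w in adj[stack.pop()] & vs:
                    if w not in seen:
                        seen.add(w)
                        stack.append(w)
            return seen == vs

        for labels in product(list(h_nodes) + [None], repeat=len(g_nodes)):
            label = dict(zip(g_nodes, labels))
            if any(not connected([v for v in g_nodes if label[v] == h])
                   for h in h_nodes):
                continue  # some branch set is empty or disconnected
            if all(any({label[x], label[y]} == {u, v} for x, y in g_edges)
                   for u, v in h_edges):
                return True
        return False

    def in_minor_closed_family(g_nodes, g_edges, obstruction_set):
        # The meta-algorithm: G belongs to the family iff none of the
        # forbidden minors from the (finite!) obstruction set occurs in G.
        return not any(is_minor(hn, he, g_nodes, g_edges)
                       for hn, he in obstruction_set)

    # Toy example: the minor-closed family of forests has the single
    # obstruction K3 (a triangle).
    k3 = ({1, 2, 3}, [(1, 2), (2, 3), (1, 3)])
    path = ([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)])           # a tree
    cycle = ([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4), (4, 1)])  # contracts to K3
    print(in_minor_closed_family(*path, [k3]))   # True
    print(in_minor_closed_family(*cycle, [k3]))  # False
    ```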

  7. > Once the bomb threat wasn't a way to get out of school on a nice day there was never another one.

    If I think back to my school days, I would believe that even the fact that a bomb threat would be an annoyance to teachers would be a sufficient reason (of course, in the schools of the country where I live there were other methods than bomb threats to be an annoyance to teachers).

  8. > I wish I still had this level of motivation :)

    It's rather: can you find a company that pays you for having and extending this arcane knowledge (and even writing about it)?

    Even if your job involves such topics, a lot of jobs that require this knowledge are rather "political", like getting the company's wishes into official standards.

  9. I know that in the past (such as 15 years ago, as you mentioned) Intel GPUs did have driver issues.

    > It is quite telling how good their iGPUs are at 3D that no one counts them in.

    I'm not so certain about this: in

    > https://old.reddit.com/r/laptops/comments/1eqyau2/apuigpu_ti...

    APUs/iGPUs are compared, and here Intel's integrated GPUs seem to be very competitive with AMD's APUs.

    ---

    You of course have to compare dedicated graphics cards with each other, and similarly for integrated GPUs, so let's compare (Intel's) dedicated GPUs (Intel Arc), too:

    When I look at

    > https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

    the current Intel Arc generation (Intel Arc B, "Battlemage") seems to be competitive with the entry-level GPUs of NVidia and AMD, i.e. you can get much more powerful GPUs from NVidia and AMD, but for a much higher price. I thus would clearly not call Intel's dedicated GPUs so bad "at 3D that no one counts them in".

  10. > Anyone have a good explanation for why elliptic curves have a 'natural' group law? [...] As far as I've seen, the group law is what makes elliptic curves special. Are they the _only_ flavour of curve that has a nice geometric group law?

    I asked the same question of a professor who works on topics related to algebraic geometry. His answer was very simple: it's because elliptic curves form Abelian varieties

    > https://en.wikipedia.org/wiki/Abelian_variety

    i.e. a projective variety that is also an algebraic group

    > https://en.wikipedia.org/wiki/Algebraic_group

    Being an algebraic group means that the group law on the variety can be defined by regular functions; for elliptic curves, these are the explicit chord-and-tangent formulas (see the code sketch below).

    Basically, he told me to read good textbooks about abelian varieties if one is interested in this topic.

    > Are they the _only_ flavour of curve that has a nice geometric group law?

    The Jacobian of a hyperelliptic curve (hyperelliptic curves generalize elliptic curves) also forms an abelian variety. Its use in cryptography is called "hyperelliptic curve cryptography":

    > https://en.wikipedia.org/wiki/Hyperelliptic_curve_cryptograp...
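    To make the chord-and-tangent group law concrete, here is a minimal sketch over a prime field (the curve parameters and the point below are toy values I picked for illustration):

    ```python
    # Group law on the elliptic curve y^2 = x^3 + a*x + b over F_p (p prime).
    # 'None' represents the point at infinity, i.e. the identity element.

    def ec_add(P, Q, a, p):
        if P is None:
            return Q
        if Q is None:
            return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None                                       # P + (-P) = O
        if P == Q:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
        x3 = (lam * lam - x1 - x2) % p
        y3 = (lam * (x1 - x3) - y1) % p
        return (x3, y3)

    # Toy example on y^2 = x^3 + 2x + 3 over F_97; P = (3, 6) lies on the
    # curve since 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97).
    p, a = 97, 2
    P = (3, 6)
    print(ec_add(P, P, a, p))  # doubling -> (80, 10), again a curve point
    ```

    The point is that these formulas are rational functions of the coordinates, which is exactly the "algebraic group" property mentioned above.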

  11. > Microsoft has largely abandoned DirectX

    What does Microsoft then intend to use to replace the functionality that DirectX provides?

  12. > If you have a GPU capable of cutting edge graphics, you have to have a top notch driver stack. Nobody gets this right except AMD and NVIDIA (and both have their flaws). Apple doesn't even come close, and they are ahead of everyone else except AMD/NVIDIA. AMD seems to do it the best, NVIDIA, a distant second, Apple 3rd, and everyone else 10th.

    What about Intel?

  13. > Pretty much every person in the first (and second) world is using AI now

    This sounds like you live in a huge echo chamber. :-(

  14. > I still can’t give them money, so what’s the point?

    What do you say about the following link, then?

    > https://www.mozillafoundation.org/en/donate/

  15. > Why is it necessary to have a flood of foreign money to operate the university? Universities in the past operated without an influx of wealthy foreign students paying outrageous tuition.

    I guess it is not strictly necessary, but it brings in a lot more money, which the university is of course very eager to take.

  16. > That said, in the worst case a PhD says "this person can work on ill-defined tasks and has the diligence to see them through." Regardless of the industry, that is a pretty useful skill.

    Very few companies and industries want employees who

    - are very conscientious ("has the diligence to see [the tasks] through"), and

    - are much more effective working on their own, i.e. are not "team players" because they don't really need a team ("this person can work on ill-defined tasks").

  17. > The PIs and laboratory heads all know damn well how the system works, they are no better than those bosses of H1B sweatshops, except perhaps they do their exploitation from ivy filled ivory towers rather than in Patagonia vests.

    In my observation, there exist quite a few people among the PIs and laboratory heads who are highly idealistic about research, but have no option other than playing this rigged game of academia.

  18. > If students decide that they are comfortable with a sub 20% job placement rate, then great, nothing needs to change.

    The problem is, in my opinion, not this low job placement rate per se (it is very easy to find out that this is the case for basically every prospective researcher). The problem is rather the "politics" involved in filling these positions, and additionally the fact that positions are commonly filled based on what is currently "fashionable". If you, for some (often good) reason, did good research in an area that simply did not become "fashionable": good luck finding an academic position.

  19. > I consider myself fluent in English, I watch technical talks and casual youtubers on English daily, and this is the first time I encounter this word lol.

    > The only "stride" I know relates to the gap betweeb heterogeneous elements in a contiguous array

    I am also not a native English speaker, but I got to know the verb "to stride" from The Lord of the Rings: Aragorn is originally introduced under the name "Strider":

    > https://en.wikipedia.org/w/index.php?title=Aragorn&oldid=132...

    "Aragorn is a fictional character and a protagonist in J. R. R. Tolkien's The Lord of the Rings. Aragorn is a Ranger of the North, first introduced with the name Strider and later revealed to be the heir of Isildur, an ancient King of Arnor and Gondor."

  20. > I sense that your claims and suggestions here strongly suggest that your school experience is not a recent one

    Of course.

    But nevertheless, I have a feeling that the central difference is not "recent or not", but the fact that older generations were simply much more rebellious: they did not want to accept the restrictions set on the school computers and were willing to do everything imaginable to circumvent them.
