
cashsterling
1,161 karma

  1. I too doubted, from the beginning, that neural networks would be the basis of AGI. As impressive and useful as LLMs are, they are still a long, long way from AGI.
  2. It is not a one-dimensional take... it is a stress test of qubit gate fidelity [across all qubits involved in the circuit], state prep and measurement, coherence lifetime, memory errors, etc.

    Now I agree that there are other great stress tests of quantum computing systems... but most of the industry agreed several years ago that quantum volume was a great metric. As many companies' systems have been unable to hit a decent QV, companies have pivoted away from QV to other metrics... many of which are half baloney.
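
    For anyone who hasn't seen it, the standard (IBM-style) definition can be sketched roughly like this; the exact statistical acceptance criteria are a bit more involved than a one-liner:

    ```latex
    % Quantum volume, roughly: run random "square" circuits of width n and
    % depth n, and find the largest n whose measured heavy-output probability
    % still exceeds 2/3 (with statistical confidence).
    \log_2 V_Q = \max\left\{\, n \;:\; \Pr\big[\text{heavy output on width-}n,\ \text{depth-}n\ \text{random circuits}\big] > \tfrac{2}{3} \,\right\}
    ```

    So a quantum volume of 2^16 corresponds to faithfully running 16-qubit, 16-layer random circuits, which is why it works as a simultaneous stress test of gate fidelity, SPAM error, and coherence.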

  3. The better analogy here, in my opinion, is until someone builds a car that actually drives forward...

    A lot of these quantum computing papers are just loud engine revving.

    But you are right... "zero attention" is hyperbole... I shouldn't have said "I pay zero attention"... I should have said, "I am underwhelmed and don't give a lot of credence to systems until they demonstrate the ability to perform high-fidelity calculations of reasonable complexity and depth."

    I do think superconducting qubit approaches will continue to improve in fidelity and ability... there are just too many brilliant people working on these challenges to count them out.

  4. Until a company can demonstrate a quantum volume of even just 2^16... its computer is just about worthless for any sort of half-real quantum computing.

    I pay zero attention to any technical information and marketing speak coming from a quantum computing company until they demonstrate decent quantum volume.

    Most companies' computers can't even hit 2^16, so they are figuring out ways to distract the market from the poor fidelity of their systems.

    Modern high-end fabs have extremely expensive equipment and are highly automated... so automated that people don't actually handle wafers; it is almost all robotic.

    Thus, salaries and the cost of services do not factor into fab economics as heavily as you might think.

    Data suggests that TSMC's per-wafer costs in Arizona are 10-30% higher than in Taiwan, and that Arizona fab is relatively new. Its economics will probably improve over time, narrowing the margin to 5-15%.

    Looking towards the future, power costs and other global supply chain factors could very easily make TSMC's Arizona fab less expensive and more reliable to operate over time. For one, the US is essentially energy independent... Taiwan is not.

  6. Atlantic Quantum has some brilliant people and I'm sure they have some great ideas.

    That said, I dislike the quantum computing craziness we're in where big claims are made without proof or data.

    "We have the fastest clock speeds, lowest error rates, and most scalable architecture... no data, just take our word for it."

    It will be interesting to see how they do in the DARPA QBI program.

  7. I always try to communicate a strong sense of hope to my teenage kids. Many nations have an inverted population demographic (an aging population with fewer young workers coming up), including the US (especially without immigration).

    I think the US is "as little as 10 years" away from a significant skilled labor shortage.

  8. I think Ada is a great language and it is completely possible that Ada will experience a resurgence in coming years... especially as LLMs are used more and more to generate software. Ada/SPARK can provide very robust guardrails for the correctness of 'AI'-generated code (so can Rust).
  9. Yeah... my intro CS class was in C and Ada 95 (I'm not a CS guy btw, just took the class). I actually preferred Ada over C... but continued to program in C for other classes because of compiler availability; I had to do all my Ada programming on SPARC workstations at school.

    I personally think that AdaCore, and friends, missed an opportunity in the early 2000s to fully embrace open source... they left a big gap that Rust has filled nicely.

    I still think Ada is a great programming language. When a question comes up along the lines of "what's the best programming language nobody's heard of?" or "what's the best programming language that is underused?", Ada is usually my first answer. I think other good answers include F#, <insert LISP flavor>, Odin, Prolog, Haxe, and Futhark (if you're into that sort of thing).

  10. I really hope the best for Oxide and applaud their compensation model.

    I applied to one of their roles, which required me to write about 10 pages of text to answer all their questions... which I think is a big ask, but I did it because "why not".

    They took over 3 months to get back to me, but at least they got back to me (with an apology and a polite "no").

  11. 100% agree with you!

    I have worked in US manufacturing and manufacturing R&D for most of my career: pharmaceuticals, microelectronics, materials, aerospace, etc. The US is awesome at manufacturing when we want to be.

    One problem is that "modern MBA/business philosophy" views manufacturing and manufacturing employees as a cost center, and there is so much emphasis on maximizing gross margin to increase shareholder value.

    So business leaders scrutinize the hell out of anything that increases the cost of their cost centers:

    - Employee training & development? Hell with that.

    - Increasing pay to retain good employees in manufacturing? Why? Isn't everything mostly automated?

    - Manufacturing technology development? Not unless you can show a clear and massive net present value (NPV) on the investment... and even then, the answer is still no for no good reason. I have pitched internal manufacturing development investments where we conservatively estimated a ~50% internal rate of return (IRR) and the projects still didn't get funded (a quick NPV/IRR sketch is included at the end of this comment).

    There is also a belief that outsourcing is easy, and business people are often horrible at predicting and assessing the total cost of outsourcing. I have been on teams doing "insource vs. outsource" trade studies, and the number of costs and risks that MBA decision makers don't think about in these situations really surprised me initially... but now I'm used to it.

    Anyhow... the US (and Europe for that matter) can absolutely increase manufacturing. It is not "difficult"... but it would be a slow process. I think it is important to differentiate between difficulty and speed.
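
    As an aside, since NPV and IRR came up above: a quick sketch of both, with a purely hypothetical project to show what a ~50% IRR looks like (the numbers are illustrative, not from any real pitch):

    ```latex
    % Net present value of cash flows CF_t over T years at discount rate r;
    % the internal rate of return is the rate r* that drives NPV to zero.
    \mathrm{NPV}(r) = \sum_{t=0}^{T} \frac{CF_t}{(1+r)^t},
    \qquad \mathrm{IRR} = r^{*} \ \text{such that} \ \mathrm{NPV}(r^{*}) = 0
    % Hypothetical example: a 1M USD investment returning ~625k USD per year
    % for four years has NPV(0.5) \approx 0, i.e. an IRR of roughly 50%.
    ```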

  12. I graduated in ChemE in Southern California in 1999, when there was a major downturn in the job market. One or two big chemical engineering design firms closed their SoCal offices, flooding the market with qualified ChemEs, aerospace companies were consolidating, etc.

    Some of my school colleagues got good jobs at refineries and whatnot... but they were the fortunate ones. It took me 12 months to land my first "I made it" engineering job with a good salary. In the interim, I worked hourly jobs making between 13 and 18 USD an hour.

    Don't let the current job market deflate you. You are young, intelligent, and you have a degree from MIT... you are going to be fine.

  13. My unpopular opinion, as someone who worked in vaccine development and biotech manufacturing for a lengthy portion of my career, is that:

    - vaccination should be a personal choice... no one has the right to compel anyone to get injected with anything. Full stop.

    AND

    - we need to raise the level of scientific and medical rigor applied to vaccine development, manufacturing, vaccine administration, and ongoing monitoring / pharmacovigilance so that people will feel comfortable taking vaccines.

    The fact is that different quality and risk/benefit rules are applied to vaccines versus other injected drugs, and this is not okay (read up on the National Childhood Vaccine Injury Act passed in 1986, why it was passed, and how vaccines are tested for efficacy and safety versus other injected drugs).

    Under normal circumstances, the COVID mRNA vaccines should have failed the clinical trial requirements for human safety. There were far too many adverse events in the clinical trials and many, many people were harmed by these mRNA vaccines (with quite a few deaths). The NIH & FDA largely swept this under the rug and refused to investigate or fund adequate investigation. Unfortunately, that entire episode has seriously harmed people's faith in the government and has negatively influenced people's opinions on vaccine safety.

    This is very disappointing because there are a lot of highly effective vaccines with excellent safety profiles when administered properly... vaccines almost everyone should receive. But the average citizen can't tell the difference and I don't blame them if they don't trust the government to protect them at the moment.

  14. I work in systems engineering at a quantum computing company... when I read announcements and new papers on quantum computing, qubit technologies, or implementations of quantum computers, I loosely apply these thresholds.

    Threshold one: the authors/company report data on qubit coherence times, single-qubit and two-qubit gate errors, state prep and measurement error, etc. The more rigorous the data, the better. If they report data from different qubits on the same device, or across multiple devices, even better. But without this sort of data, any assertions or forward-looking statements about the utility of a device or quantum computing approach are "pure marketing". To me... the Microsoft paper and announcement do not meet threshold one.

    Threshold two is performing some sort of useful benchmarking calculation that requires repeated use of multiple qubits in concert. A quantum volume calculation is one such benchmark. It is much easier to get great qubit results from a small test device (a hero device) than from a larger system. It is tough to make a blanket statement about all QC technologies, but system noise levels and calculation error/failure modes scale with the number of qubits... so being able to achieve high-fidelity two-qubit gates repeatedly in a deep circuit using 20-50 qubits is much, much more difficult and impressive than demonstrations with 1-4 qubits. To me, the number of qubits is almost irrelevant if those qubits are not useful together... example: if a company reports 100+ qubits on a device but can't pass a quantum volume 12 or 16 calculation, then I will reserve judgement about the utility of that QC approach. There is engineering development value in scaling the number of qubits (like figuring out how to orchestrate massively parallel qubit control at scale) while also working on improving qubit performance metrics... but these two development streams have to converge: lots of qubits AND high fidelity. Demonstrating high fidelity at low qubit counts and high qubit counts separately doesn't mean that high-fidelity, high-qubit-count systems will be achievable.
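
    To put a rough number on why width times depth matters: to first order, circuit fidelity decays like the product of the individual gate fidelities (ignoring SPAM, crosstalk, leakage, and correlated errors). The figures below are illustrative, not from any specific device:

    ```latex
    % First-order estimate of circuit fidelity from two-qubit gate fidelity
    % F_{2q} and the number of two-qubit gates N_{2q} in the circuit.
    F_{\text{circuit}} \approx \left(F_{2q}\right)^{N_{2q}}
    % Illustrative example: a 16-qubit, depth-16 square circuit has roughly
    % 8 two-qubit gates per layer (~128 total); with F_{2q} = 0.995 that gives
    % F_{\text{circuit}} \approx 0.995^{128} \approx 0.53, while the same gates
    % in a 4-qubit, depth-4 circuit (~8 gates) give roughly 0.96.
    ```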

  15. I'd consider Advanced Engineering Mathematics by Zill, 6th Edition, which is available used for as little as 25 USD. There is also a print solution manual for this book, which is great for self-study.
  16. At a startup I worked at, we had our adaptive control pilot software flying our drone designs in X-Plane for testing and demo purposes.
  17. Rust has functionality in the std lib for saturating arithmetic (and other kinds of arithmetic via crates) so that calculations saturate rather than overflow, etc. https://doc.rust-lang.org/nightly/std/?search=saturating
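
    A minimal sketch of what that looks like on the built-in integer types (the saturating_*, wrapping_*, checked_*, and overflowing_* methods are all in std; the specific values are just for illustration):

    ```rust
    fn main() {
        let x: u8 = 250;

        // Saturating arithmetic: clamp at the type's bounds instead of wrapping.
        assert_eq!(x.saturating_add(10), u8::MAX); // 255, not 4
        assert_eq!(0u8.saturating_sub(1), 0);      // stays at 0

        // Other overflow-handling flavors in std, for comparison:
        assert_eq!(x.wrapping_add(10), 4);            // modular wrap-around
        assert_eq!(x.checked_add(10), None);          // Option-based detection
        assert_eq!(x.overflowing_add(10), (4, true)); // wrapped value + overflow flag

        println!("all overflow-handling examples passed");
    }
    ```
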
  18. If your goal is literally to get good at math through Calculus, Differential Equations, and Linear Algebra... I feel like the best approach is working through good books (reading, taking notes, and doing problems out of the book, then checking answers/solutions) and hitting YouTube and Khan Academy for help with certain topics.

    Fortunately, there are a lot of great older-edition, cheap used books on Pre-Algebra, Geometry, College Algebra, Pre-Calc, Calculus, DiffEq and Linear Algebra, etc.

    It will take some time to master all the math you want... but if you structure your studying so that you actually enjoy studying (i.e. enjoy the process), you will 100% get there.

    I'd be happy to list out some book recommendations.

    I will echo the sentiment of others that you don't need to be a math expert to use machine learning libraries effectively in many cases. The problem is, without the math expertise you won't always be able to identify the cases when you are using the wrong approach, etc. And you'll have a harder time applying cutting edge ML to new problems.

  19. Going away from physical book-based learning was possibly well intentioned (though I have my doubts)... but it was really dumb.

    There are clear studies showing that reading a physical book (versus a screen), and using a physical pen or pencil on a piece of paper (versus typing or drawing on a screen), leads to higher comprehension and retention of information... and thus much better overall learning outcomes. This doesn't even consider the fact that YouTube, Discord, and a bunch of other apps are a swipe away on an iPad.

    A common solution to the "carrying books around" problem used to be having a copy you were issued (which mostly stayed at home) and a shared classroom copy.

    Carrying around 2-3 books plus a binder is not a big deal (and is not a 30 lb backpack... more like 10-15 lbs)... we act like this is some sort of massive hardship, yet so many of us did this for over a decade of our childhood with no ill effect.

  20. This is absolutely ridiculous... shame on UW. Their quid pro quo offer is ethically very wrong.

    My wife works at a major public university and fights with antiquated software and business systems all day long. The amount of IT system bloat and 'village-idiot dumb' processes for managing course offerings, course catalogs, class schedules, etc. is pretty bad at a lot of universities.
