Why is it that people who only believe in the "AI bubble" always come with "it only has real value if it replaces all of us and we end up living in a dystopia with AI as our owners, keeping us like dogs"?
I'm starting to see a connection between not fully understanding what Machine Learning actually does and the "bubble" theory.
I hate to tell you, but AI was in use long before LLMs (i.e., ChatGPT) ever came to light; companies have widely used ML/AI for statistics, search optimization, and administration. I understand LLMs are the new hot shit, but ML isn't only about "replacing jobs". What about autonomous systems, robotics, pharma, genetics?
ML is very good at recognizing patterns and producing output based on them (obviously it depends on what data you feed it; no, not every AI hallucinates like ChatGPT, and not every AI is a chatbot).
I'm not saying the current AI hype doesn't include "job replacing", and some of that will happen, but why do all of you stop the AI story at that level? Why not go beyond it?
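For the sake of argument, here's a minimal sketch of the kind of non-chatbot ML that comment is talking about: fit a small classifier on tabular data and get deterministic predictions out, no hallucination involved. The dataset and model choice are purely illustrative.

```python
# Illustrative only: "ML learns patterns from labelled data" without any chatbot involved.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)      # learn patterns from labelled examples
preds = model.predict(X_test)    # deterministic output for new inputs
print(f"held-out accuracy: {accuracy_score(y_test, preds):.3f}")
```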
Google TPUs handle training just as well as Nvidia at a much lower cost. You still need Nvidia for customer workloads that require GPGPU, but you're not reliant on it for AI/ML workloads. Source: I work for GCP
Shit, probably. I did some work there about 15 years ago and they were doing really advanced stuff back then, e.g. massive HPC/ML clusters doing drug discovery work, protein folding, etc. It's the only place I've met dudes with computer science PhDs.
Finished my Master's in AI & ML and passed my Series 65 this weekend.
Now, I am super baked and stacked with dog bones for the dogs, all beef hot dogs, buns, chicken nugs obviously, and mac and cheese. Windows open at a pleasant 68F. This is all I need to be happy.
*But, I would prefer the Casino open
If you’re long ORCL, the real bet is on OCI growth, the MSFT tie-up, and whether they can line up GPUs and power; averaging down without those hitting is how you get bagheld.
What I’d watch this print: OCI growth pace (still >50–60% y/y or slowing), RPO/backlog, Oracle Database@Azure customer logos and new regions, Cerner margin recovery, and any specifics on capex, data center power deals, and Nvidia H200/B200 delivery timing. OCI’s edge is often price/perf on GPUs and cheap egress, but it only matters if they can turn that into capacity and logos.
If you want exposure with less pain: sell cash‑secured puts at levels you’d be happy owning, or wait for the call and write covered calls on any spike; set a hard line where the thesis breaks (e.g., OCI decel + weak backlog).
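To put rough numbers on the cash-secured-put idea (the strike and premium below are made up for illustration, not an ORCL quote):

```python
# Back-of-the-envelope math for a cash-secured put; numbers are placeholders.
strike = 150.0        # hypothetical put strike you'd be happy owning at
premium = 4.50        # hypothetical premium collected per share
shares = 100          # one contract

cash_secured = strike * shares              # cash set aside: $15,000
effective_basis = strike - premium          # cost basis if assigned: $145.50
breakeven = effective_basis                 # below this at expiry you're losing money
max_income = premium * shares               # kept if the put expires worthless: $450
yield_on_cash = max_income / cash_secured   # ~3.0% over the option's lifetime

print(f"basis if assigned: {effective_basis:.2f}, breakeven: {breakeven:.2f}, "
      f"yield on secured cash: {yield_on_cash:.1%}")
```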
On the ecosystem point: we’ve shipped data apps with Snowflake for warehousing and Databricks for ML, and used DreamFactory to quickly stand up REST APIs over Oracle/SQL Server so teams could ship without building gateways.
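On that DreamFactory bit: once a REST layer is auto-generated over a database, consuming it is plain HTTP. A rough sketch is below; the host, API key, service and table names, and even the /api/v2/&lt;service&gt;/_table/&lt;table&gt; path shape and header name are assumptions from memory, so check your own instance's docs rather than trusting this.

```python
# Rough sketch of querying an auto-generated REST endpoint over a database table.
# URL shape, header name, and all identifiers here are assumptions, not verified API details.
import requests

BASE_URL = "https://df.example.internal"   # hypothetical internal host
API_KEY = "YOUR_APP_API_KEY"               # placeholder credential

resp = requests.get(
    f"{BASE_URL}/api/v2/oracle/_table/ORDERS",          # hypothetical service + table
    params={"filter": "STATUS = 'OPEN'", "limit": 25},
    headers={"X-DreamFactory-API-Key": API_KEY},
    timeout=10,
)
resp.raise_for_status()
for row in resp.json().get("resource", []):
    print(row)
```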
Bottom line: ORCL works if OCI + Azure expands and GPU/power ramps show up; otherwise it’s dead money.
>“You’re absolutely right!” I see you’re using pandas with this large dataset, sometimes pandas struggles with large matrices, let’s add 17 log files to find the root of the problem…. I have no doubt this could be done for significantly less computational resources than is currently being reported.
Lmao so true
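To be fair to the pandas complaint in that quote, trimming memory usually doesn't need 17 log files. A sketch of the boring fix, chunked reads plus dtype downcasting; the file name and chunk size are placeholders:

```python
# One mundane way to cut pandas memory on a big CSV: read in chunks and downcast dtypes.
import pandas as pd

chunks = []
for chunk in pd.read_csv("big_file.csv", chunksize=500_000):
    # downcast 64-bit numerics to the smallest type that still fits the values
    for col in chunk.columns:
        if pd.api.types.is_integer_dtype(chunk[col]):
            chunk[col] = pd.to_numeric(chunk[col], downcast="integer")
        elif pd.api.types.is_float_dtype(chunk[col]):
            chunk[col] = pd.to_numeric(chunk[col], downcast="float")
    chunks.append(chunk)

df = pd.concat(chunks, ignore_index=True)
df.info(memory_usage="deep")   # report per-column dtypes and true memory footprint
```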
>ML researcher with econometrics? Sounds like a certain profession I won’t mention here. Any experience with rough Bergomi models and/or using ML for calibration?
No unfortunately, statistical learning theory on time series & nonlinear cointegration tests
Lolzers, I was an AI/ML researcher (applications to econometrics) before becoming a degen, and while Chat never fails to amaze me, that fucker always gets something wrong, and trying to code with it makes stuff unnecessarily complex, unnecessarily fast. The AI economy is definitely a bubble. No reason to lay this many people off.
There's also the possibility of some clever people (likely from East Asia) coming up with a simpler way to do linear algebra that requires less computational resources and drops NVDA down to the Earth's crust.
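For what it's worth, that idea has precedent: Strassen-style algorithms already do matrix multiplication with fewer scalar multiplies than the schoolbook method. A toy NumPy sketch below, power-of-two sizes only and purely illustrative, nothing like what a tuned BLAS actually does:

```python
# Toy illustration of "cheaper linear algebra": Strassen multiplies n x n matrices with
# 7 recursive sub-multiplies instead of 8, roughly O(n^2.81) vs O(n^3).
import numpy as np

def strassen(A, B, cutoff=64):
    n = A.shape[0]
    if n <= cutoff:
        return A @ B                      # fall back to ordinary multiply for small blocks
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])

A = np.random.rand(256, 256)
B = np.random.rand(256, 256)
print(np.allclose(strassen(A, B), A @ B))   # True, up to floating-point error
```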