That it's really hard to run a hardware company when digital hype-coins affect the value of your product.
It's really not Nvidia's fault. They're in a really difficult position here, trying to absorb all this new business without stretching too thin or moving too fast.
But in terms of processing graphics, the hardware is great. Using GPUs to mine digital money hurts people using GPUs for graphics, which was sorta the point in the first place. Maybe not now, but it used to be.
I don't mean to argue that graphics are more important than digital money, but it's like the Bitcoin community has made zero effort to be introspective about the impact it has on people who just want to play games or whatever. Or even about power usage in general.
Bitcoin is such a weird phase of tech history in that it's completely antithetical to altruism. I always imagined the Internet moving towards a better, more utilitarian future that benefits users in a collective sense, but Bitcoin doesn't seem to be about that anymore.
The author is shorting Nvidia stock (as he/she discloses at the end of the article). So, while this article attempts to be objective, I just don't believe it can be.
Totally disagree with you, and agree with Taleb: to write a worthwhile piece like this, one must put their money where their mouth is. The disclosure is all that's required.
I don't trust any writing on stocks/markets where the author has no position. This person is an investor or trader, unlike the forecasters/prophets/charlatans on daytime stock shows. You don't have to believe this author, but you at least know they take their own thesis seriously and will pay for it if they are wrong.
It would be nice if SA put the disclosure at the top of the article, though.
Shorting a stock does not automatically make negative opinions about it invalid, any more than going long makes positive opinions invalid. The whole point of shorting, and the reason why it's legal, is to give people an incentive to find overvalued stocks and thus correct valuations.
It's like people with a negative outlook on a company can't win: if they short, they're accused of being biased. If they don't short, they're accused of tossing out accusations with no skin in the game.
Yeah, this disclosure should really be at the top of the article. Until AMD puts out a new graphics card, NVIDIA is still going to sell at monopoly prices. Prices have barely begun to approach MSRP, and that's two years after the initial release of this series of cards. NVIDIA could just as easily delay the release of their next-gen cards until this inventory clears or AMD forces them to make a move. I wouldn't be too worried about their balance sheet right now.
There’s a point of view that short selling is important & ethical because it creates an incentive to find problems with companies and expose them. Hopefully they’re honest when they do it, but that applies to people pumping a stock too.
The bias should lead to more caution, but the disclosure was made. The disclosure would have been more ethical at the beginning of the article, especially given the likelihood of people not reading to the end of the multi-page format.
Most of what I see on Seeking Alpha is people cherry picking charts and statistics to support a pre-existing hypothesis. Many times authors continually double down on clearly wrong opinions and just say "oh well, the market is crazy".
Surely if we're going to start accusing people of anti-NVidia bias we should include Charlie from SemiAccurate so he doesn't feel left out. He did coin the term "Bumpgate" after all.
It seems he has been tracking the AIB monthly revenue data out of Taiwan, as well as trying to reconcile 2017 numbers with what Nvidia/AMD have reported on mining. And in Nvidia's case, he seems focused on the CEO attributing gaming revenue strength in Q1 entirely to Fortnite/PUBG when gamers had virtually no access to cards. So, the way I read it, he took the SemiAccurate report as evidence that the CEO has gotten it wrong, and that this explains the very vague 'not for a long time' comment at Computex.
Seriously? I find this hard to believe - prices of GPUs have been ridiculously high (much higher than at launch) for so long precisely because the supply was so short.
Pretty much every supplier I've seen would either sell out any high-end Nvidia GPUs almost as soon as they got them or would ramp up the price more and more.
This is either an incorrect rumour or some very weird shitty strategy by Nvidia that backfired hard (I lean towards the former).
Even now OEMs should have very little problems selling cards at (above) MSRP so there must be something more going on.
NVidia probably liked the high prices (because of miner demand), produced accordingly, and kept the price stable. These high prices probably ensure a nice profit margin for longer than normal at this stage of the product's life cycle. However, a side effect might be somewhat slower sales than they planned (for example, because miners will buy anyway but gamers are waiting for the next-gen cards). That means they're left with a lot of stock, which sells, but not at their normal rate. NVidia normally has a very predictable product life cycle, but I think they are now in something that is (to them) uncharted territory.
This could lead to stuff like their accountants saying: "You've got an awful lot of stock which isn't selling as fast as you thought. We think that's a risk". Which might compel them to set money aside for stock risks.
The majority of the price hike happened in the supply chain and didn't flow through to nvidia. The demand itself was a benefit, of course (simply selling more), but nvidia's source pricing didn't change.
Nvidia spoke out against mining because their concern was that gamers would move to the PS4 Pro, Xbox One X, etc., due to the shortage, which could hurt them in the long term.
And I think it was a very valid concern. Last year, when prices were insane, I put off upgrading my gaming tower to 4K for another 2-3 years and instead got an Xbox One X. I know all the games would run better on my PC, but when the marginal cost of upgrading my GPU is that much higher, the console is frankly more than good enough.
Miner demand was about 50% of sales. Admittedly this is because a lot of gamers were turned off by high prices.
That demand has evaporated since the price of Bitcoin alternatives (e.g. Ethereum) has plummeted, and the difficulty of mining is still very high since ASICs can now mine many coins more efficiently than consumer GPUs.
Plus it seems like Nvidia are stuck with a lot of low-end chips. Many miners were buying things like the 1060.
However, Nvidia have no direct competition in the GPU space anymore. They can hold off their next product launch for months and wait for their inventory to clear.
>Nvidia have no direct competition in the GPU space anymore
At what level? AMD does the same thing in GPU they do in CPU: They might not hold the overall performance crown, but their value is often better at several tiers than their competition. I really want an RX 580.
The Vega cards (56 and 64) were definitely priced competitively, as NVidia felt the need to release the 1070 Ti out of nowhere. It sits in a weird price/performance area smack dab between the 1070 and 1080 cards. In addition, the Vega architecture is pretty efficient, and your statement about it being power-hungry and very hot is false.
Perhaps they were when on their discounted launch price, but they quickly jumped up in price (around £100 in a week, even more a few weeks later) and just weren't available for quite some time AFAICT.
> In addition, the Vega architecture is pretty efficient and your statement about being power hungry and very hot is false.
Excuse me, but I speak from personal experience here. It's really not very efficient; I bought a Vega 64 on launch day. It was hot, noisy, and power-hungry compared to nVidia cards at similar performance levels, which is the main reason I didn't keep it for long.
Re the competition in the gpu space: that’s also still the case in the scientific computing / ML fields, where CUDA is still several years ahead in terms of functionality and existing investment.
My understanding is that it's been mostly Ethereum driving demand for GPUs. Bitcoin can be mined on dedicated ASICs and mostly is. Ethereum requires more general purpose hardware with a large pool of RAM so people use GPUs instead to mine it. I presume the recent decline in demand for mining GPUs is due to the decline in price.
"[...] my sense is that there's a fair amount of pent-up demand still. Fortnite is still growing in popularity. PUBG is doing great. And then we've got some amazing titles coming out. And so my sense is that the overall gaming market is just really, is super healthy. And our job is to make sure that we work as hard as we can to get supply out into the marketplace. And hopefully, by doing that, the pricing will normalize and the gamers can buy into their favorite graphics card at a price that we hope they can get it at."
Prices went up because of mining, and now that the craze is over, prices stay up because companies want to pocket the extra money. It's as simple as that.
In my local market, the slowing of the crypto fad is showing. Used GTX 1080 GPUs are no longer hawked at $800, and various ASICs are generally available. Even new retail cards are coming down in price, but they're still way too high for technology that's been available for a couple of years now. Unfortunately, I believe Nvidia is going to keep prices high so they can position the upcoming 11 series at the $700 mark, just because they can.
The article states that oversupply is in low-end GPUs. Did Nvidia incorrectly assume there would be a demand for low-end GPUs somewhere?
My understanding is that the demand for high-end GPUs was driven primarily by the shortage of middle-tier GPUs. High-end GPUs are obviously faster at mining, but their cost and power consumption made them lower profit and higher risk, so miners were buying up all of the middle-tier GPUs. Lower-end GPUs were largely overlooked due to RAM limitations that shortened their useful life as mining cards.
The shortage of middle-tier GPUs meant that gamers could either go for higher- or lower-tier GPUs or just hold out. Thus high-end GPU prices rose as some people chose to upgrade. When cryptocurrency prices peaked last fall, this created further demand for high-end GPUs, limiting their supply and driving prices up further. Now that cryptocurrency prices are falling and mining demand has waned, the artificial demand for high-end GPUs has disappeared.
But was there ever demand for the low-end GPUs? If you absolutely needed a GPU, sure, but the last three generations of GPUs have been all about 4K or VR gaming at 60+ FPS. This isn't a market that low-end GPUs can satisfy. Most people looking to upgrade want an upgrade. A three-year-old middle-tier GPU can hold its own against, or beat, most current low-end GPUs.
It looks like sellers/board-makers were just charging what they thought they could get away with because the consumers weren't price sensitive, not because of a shortage.
E.g., if you normally sell 100 video cards at $500, but now you can sell 100 at $750, you'd be foolish not to raise the price and capture the margins that go along with it.
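Spelled out (the $300 unit cost below is a made-up number, just to show how the extra price is nearly pure margin):

    # The example above, plus a hypothetical $300/unit cost to show that
    # the price increase drops almost straight to the bottom line.
    units, unit_cost = 100, 300
    for price in (500, 750):
        revenue = units * price
        margin = units * (price - unit_cost)
        print(price, revenue, margin)
    # 500 -> $50,000 revenue, $20,000 margin
    # 750 -> $75,000 revenue, $45,000 margin

Revenue goes up 50%, but margin more than doubles.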
I recently upgraded to a 1060 6GB card (at inflated prices, because it didn’t look like there’d ever be a break in the storm). I got to thinking, I doubt I’ll ever feel much need to upgrade the screen I game on from 1080p to 4k, mostly because I sit across the room and my eyes naturally antialias things at distance...
Given that combination of factors, I have to wonder what it’ll take to make me want to upgrade from my modest 1060. I can’t imagine what sort of incremental improvement video hacks will make this card too slow at 1080p.
I have to wonder if 4k gaming and VR are the stopgaps that’ll keep some demand for improved gaming performance until some paradigm shift happens, such as the mainstreaming of realtime raytracing. It’s crazy to think that since my first Voodoo video card until now, it’s just been a steady evolution and layering of graphical hacks. We’ve known since before then that eventually raytracing would be an optimal case. I wonder how far away from a total revamping of 3D graphics we are... two generations of GPUs? Three?
I'm glad you mentioned this, because it reinforced some of my feelings on the state of the industry from the consumer end.
Fill rate is a tremendous part of GPU performance, so barring any big, relevant demand for 4K resolution or above with VR, I just don't see the need.
Game engines have a tremendous opportunity, now available with technology like Vulkan and Metal, to provide high-performance rendering, but I think more people are concerned with games that, frankly, are just fun.
We're at a really exciting intersection in technology right now, no doubt. But as far as the hardware goes, you almost have to check prices like the damn stock market.
>Given that combination of factors, I have to wonder what it’ll take to make me want to upgrade from my modest 1060. I can’t imagine what sort of incremental improvement video hacks will make this card too slow at 1080p.
What made you want to upgrade to your 1060? New, more graphically demanding games. Why are those new games more demanding? Because people bought newer, faster GPUs.
That steady evolution of graphical hacks will march on for as long as hardware improves. Ambient occlusion is a hack, ray tracing is a hack, the rendering equation is a hack. There will be no revolution, just another generation of GPUs and another generation of APIs, with developers trying to squeeze the most out of them.
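For reference, the rendering equation (Kajiya 1986) that all of these hacks are chasing, in its usual form:

    L_o(x, \omega_o) = L_e(x, \omega_o)
        + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

Rasterization tricks, ambient occlusion, ray tracing, and path tracing are just different ways of approximating that integral within a frame budget.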
Nvidia are fairly bullish about real-time raytracing, but that's just another incremental step. It's potentially a backwards step in many cases, because a lot of other tasks are competing for GPU resources. Illumination is only one aspect of simulating a realistic 3D scene.
I've heard about realtime raytracing for decades, and it's regularly shown off in demos, but it never materialized in commercial engines. Even now, when we have realtime graphics that look better than movies from 20 years ago, we are still using rasterization, and it shows no sign of stopping.
Raytracing doesn't mean better. It is just a different solution to the rendering equation, and what it does better than rasterization (mostly refraction) is not worth the additional cost. Especially considering that it won't help with global illumination, which is the big thing right now.
The closest thing we have to a "perfect" rendering technique is pathtracing, but for something that isn't a sphere or 8 cubes in a box, we are far, far from realtime rendering.
I got in on an RX 580 8GB for about $320 and it handles most VR games pretty damn well. I was upgrading from an R7 270 so it was a great value and would only need to be replaced if I decide I'm willing to burn ~$1500 for the Vive pro. Depending on what you had before the 1060, you may have gotten a super bad break on the upgrade.
My point, I guess, is that the VR stopgap is already over. Expensive laptops can now do some VR. I.e., why would I pay $800-1000 for the Vega line or a 1080 Ti for a somewhat mild improvement? Last-gen hardware already runs VR well, so the profit margins are shrinking.
I suppose if the universe is in fact a hologram as some scientists have hypothesized, and we’re just living in a 3d projection then we could say reality is faking it :-)
But the current-gen video hacks and future raytracing are pretty hard to compare. Raytracing can be an accurate simulation of the physical properties of light. I suppose the biggest caveat is whether you're doing forward or backward raytracing. Backward raytracing simulates only the photons relevant to the observer. Forward raytracing simulates photons from various light sources, a small fraction of which are observed by the camera.
Both are still based on the physical path of virtual photons, which as far as models go, is a pretty solid illusion!
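To make the backward-tracing idea concrete, here's a minimal sketch in Python: one ray per pixel, cast from the eye into the scene, tested against a single hard-coded sphere. The scene, camera, and ASCII output are toy assumptions, not how a production tracer works:

    import math

    def ray_sphere(origin, direction, center, radius):
        # Nearest positive hit distance along a normalized ray, or None on miss.
        oc = [o - c for o, c in zip(origin, center)]
        b = 2.0 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * c  # quadratic discriminant, a == 1 (normalized ray)
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0 else None

    # Backward tracing: follow only the light paths that reach the observer.
    for y in range(8):
        row = ""
        for x in range(16):
            d = [(x - 8) / 8.0, (y - 4) / 8.0, 1.0]
            n = math.sqrt(sum(v * v for v in d))
            d = [v / n for v in d]
            row += "#" if ray_sphere([0.0, 0.0, 0.0], d, [0.0, 0.0, 3.0], 1.0) else "."
        print(row)

Forward tracing would instead start rays at the light sources and hope some of them reach the camera, which is why it's so much more expensive.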
We still fake the surface properties. Skin, for example, involves a lot of subsurface scattering, and we're very far from that. So texture maps it will be for a long time.
Maybe there's a term for it, but it's a classic example of how demand suddenly disappears when the price drops, because everyone thinks/knows it's going to drop further. For example, they know the 1080 Ti was going for $400, but due to crypto it shot up to $1000. Now, even at $500, they may as well wait a little longer, and the new Turing or Ampere graphics cards are coming soon anyway.
So the optimal strategy for NVidia is to shred those redundant silicon dies? At some point a new Vega will put pressure on them to release consumer Volta/Turing/Ampere/whatever-they-name-it, and at $1000 there is not much hope of getting rid of Pascals via standard sales channels.
>So the optimal strategy for NVidia is to shred those redundant silicon dies?
Potentially, yes. Dumping their current inventory at the market-clearing price might badly affect demand for their next generation of products and damage their relationship with board partners. Lower GPU prices from Nvidia will barely trickle down to customers due to the rising cost of GDDR memory and the fixed cost of all other aspects of AIB manufacturing. Constraining supply to keep prices close to list might be frustrating for consumers, but it's probably in the long-term interests of Nvidia and their board partners, especially considering the weak competition from AMD.
Just give the market time to digest it organically. It will be a slow process, but it's better than shredding it. After all, Nvidia is in no hurry to introduce a new GPU, as Vega doesn't seem to be competitive.
We are at the tail end of the 10-series GPU life cycle. Everyone is waiting for the 11-series release. Why would anyone buy an overpriced card now that is sure to be made obsolete in just a few months' time?
Not a chance. The next series will give at least a 30% gain over the 1080 Ti. Remember that there is a process shrink, allowing roughly 30% more CUDA cores and maybe faster clock speeds.
I Am So Mad. I've had a new workstation purchase waiting for card prices to calm down, and now I read this? We get no new technology, prices where a 1080 Ti equals the retail price of a Titan, and now they have 300k too much inventory?
NVIDIA needs to be greeted with a Linus salute again. They have been horrible for Linux and horrible for my chances of getting a new workstation at work.
For two-year-old top-tier video cards, which are only top-tier because the manufacturer is holding back newer models to try to clear out the old stock? Probably not.
(I have no evidence that they are not releasing newer cards just to continue to sell the older stock at MSRP).
Prices are determined by what people are willing to pay for things. A video card isn't a requirement to sustain life; it's a luxury. If people think $1000 for one is too much, they simply won't buy one. Obviously a business wants to maximize profit, but it's up to the consumer to decide if it's worth it.
I looked it up: I paid €351.90 for my 1070 last year, and now that same card is €500... I remember you could pick up a 1080 for €560 at that time.
Curiously I have a feeling that diamond jewellery falls further when you just look at the short term. What I mean is that (according to a few things I've read, and neatly summarised at http://diamondssuck.com) immediately after purchase the resale value of your diamond ring hits the floor - whereas you can at least punt a GPU for a short while afterwards.
This is mostly beside the point though - you're right that GPUs are effectively obsolete after a very short period and have very little value in the long term :)
[joke] Plot twist: A cousin of both Jensen Huang and Lisa Su was arrested in Taiwan for propelling the price of BTC over the past year using bots running advanced Deep Learning. [/joke]
[not joke] Jensen and Lisa are family. [/not joke]
I wonder if this has anything to do with ETH not being able to be mined on 3GB cards any more. I may be mistaken, but I recall several months back people with 3GB 1060s no longer being able to mine ETH because of the size of the DAG.
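If I remember the Ethash spec right, the DAG starts at 1 GiB and grows about 8 MiB per epoch (30,000 blocks), so you can roughly estimate when 3 GB cards fall off. A quick sketch using the spec's constants (it skips the spec's prime-adjustment step, so it's approximate):

    # Ethash DAG size constants from the spec; prime adjustment omitted.
    DATASET_BYTES_INIT = 2**30    # 1 GiB at epoch 0
    DATASET_BYTES_GROWTH = 2**23  # ~8 MiB added per epoch
    EPOCH_LENGTH = 30000          # blocks per epoch

    def approx_dag_gib(block_number):
        epoch = block_number // EPOCH_LENGTH
        return (DATASET_BYTES_INIT + epoch * DATASET_BYTES_GROWTH) / 2**30

    print(approx_dag_gib(5_500_000))  # ~2.43 GiB at a mid-2018 block height

A nominal 3 GB card has less than 3 GiB actually free once driver overhead is subtracted, so those cards fell off well before the DAG itself hit 3 GiB.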
What is Nvidia's strategy for "streaming" games, i.e. games that execute on a cloud server and stream to thin-client machines? I assume such a future would see the gamer market for cards disappear.
Take that future, add the possible collapse of cryptocurrencies, and I see a significant existential risk for anyone whose business relies on GPU sales. Not a 100% certain future, not even 25%, but something to think about.
I don't think a majority sees game streaming as a thing now that VR's around. Game streaming was struggling against the speed of light just to deliver a modestly responsive experience. Even then, on the low end, there are plenty of games that'll run fine on Intel integrated graphics, so it becomes a question of who it actually serves.
> The report also cites Nvidia aggressively buying GDDR5 as evidence that they now have an excess stock of lower-end GPUs that need to be made into boards as well as other insiders/sources citing an inventory buildup.
Weren't lower-end Nvidia GPUs unsuitable for mining?
Perhaps Nvidia assumed people who typically buy a new GPU annually would just pick up a low-end unit instead of waiting, so they overproduced?
300,000 GPU dies is maybe $30M worth of inventory to NVIDIA, especially considering that if these were "mining" cards, they were likely GP104s or even GP106s, which are worth less than $100 per die.
NVIDIA made $3.2bn in revenue last quarter, with gaming accounting for more than half of that, so do the math on just how many GPUs they actually sell.
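Back-of-the-envelope, using only the figures claimed above plus a hypothetical blended selling price (the $250 ASP is my assumption, not a reported number):

    dies = 300_000
    usd_per_die = 100                  # upper bound for a GP104/GP106 die, per above
    inventory_usd = dies * usd_per_die # => $30M

    quarterly_rev_usd = 3.2e9          # last quarter's revenue, per above
    gaming_share = 0.5                 # "more than half", rounded down
    assumed_asp_usd = 250              # hypothetical blended price per GPU
    units = quarterly_rev_usd * gaming_share / assumed_asp_usd

    print(f"inventory ~ ${inventory_usd / 1e6:.0f}M, units/quarter ~ {units / 1e6:.1f}M")

That's roughly $30M of inventory against millions of GPUs sold per quarter.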
They were one of the reasons it began inflating, but as their interest cooled off, the prices didn't follow that back down. Nvidia's current model is probably based on that increased cost, so they've committed to a strategy that makes it hard to slip price reductions through channels (or are just unwilling to follow the market back down).
Is it better for them to take back returned inventory, or cut their profit on the ones out there?
What constitutes a good or bad idea has to be judged by how much money you end up making. As it stands, the variety of cryptocurrencies, and the fact that an ASIC's algorithm is set in stone, make it possible to mine some cryptocurrencies on GPUs at a better profit relative to the cost of electricity.
Not to mention, the ASIC I've seen was mounted with two Delta-style fans and was about three times noisier than six Ti cards. Also, power availability is always a limiting factor; the true metric eventually becomes how much hardware you can run on the power you have access to, and how much money that nets you per day/month.
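The comparison boils down to simple arithmetic: daily coin revenue minus daily electricity cost. A minimal sketch (all numbers are hypothetical placeholders, not figures from my setup):

    # Daily mining profit = coin revenue - electricity cost.
    def daily_profit_usd(hashrate_mh, usd_per_mh_per_day, watts, usd_per_kwh):
        revenue = hashrate_mh * usd_per_mh_per_day
        power_cost = watts / 1000.0 * 24.0 * usd_per_kwh
        return revenue - power_cost

    # e.g. a six-GPU rig: ~180 MH/s drawing ~900 W at $0.10/kWh (made-up numbers)
    print(daily_profit_usd(180, 0.02, 900, 0.10))  # 3.60 - 2.16 = $1.44/day

Whichever hardware maximizes that number per watt of available power wins, which is exactly why the choice between GPUs and ASICs shifts with coin prices.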
While GPUs are still used for some mining, a lot of miners have moved past them. Bitmain, the company that makes most ASIC mining chips, is valued at $12bn.
Cryptocurrency miners were a big reason for the insane GPU prices, yes. But since then the price has tanked to around 30-40% of the December highs, and mining profitability has tanked with it. This is probably why prices have come down again.
No. Gamers have always been willing to pay stupidly high prices for the latest and greatest hardware going back decades. Nvidia has just optimized the price point.
nVidia does build their own cards, as well. But yes, the graphics card market is predicated on AMD and nVidia creating the designs, and then other companies license and manufacture them. Some will also alter the designs a bit, like adding more memory, overclocking the cards, etc.
Bitcoin surged and the same card was $1300 for a few months.
Now it's back to $750ish
My FPS stayed pretty stable though