Full NVIDIA GeForce GTX 1650 Specifications and Prices emerge on the web

People better BEWARE of the cards that aren't equipped with a (dedicated) power connector. In a machine with a powerful CPU, such a card can prove to be power-hungry, thanks to an unexpected "boost" of the graphics subsystem. Many years ago, I had a whole lot of trouble because of a card (GeForce 4 4800 SE, Innovision) not having this connector: black screens, the in-game picture trembling and disappearing, OS restarts... Apparently, this was an error by design, since the people at no-vidia estimated that the card won't draw more amperes than the power contacts in the slot can provide. But at the time I had a HEAVILY overclocked CPU/RAM, which, among other things, resulted in the card's power draw being so high that the AGP slot wasn't able to provide enough current for the card. The card was also (moderately) overclocked (via NiBitor).

After a few weeks of research, a temporary solution was found: via Riva Tuner, the card's 8x turbo was disabled, and the aforementioned problems disappeared. It took me 3 or 4 months of additional research to find a true solution: a 4-pin (Molex) connector was soldered to the appropriate pins on the card, and I was again able to run it at 8x turbo.

On a side note: neither no-vidia nor in-no-vision were able (or willing) to help me; they all seemed to be either uninterested in the problem or incapable of understanding it and suggesting solutions. The producers cut off a thing that can be very useful (and which would add less than a penny to the card's price), under the pretense of it "not being needed".
The 4800 SE never needed this connector; you should have made sure your AGP port was compliant with AGP power requirements.
DLD:

People better BEWARE of the cards that aren't equipped with the (dedicated) power connector. […]
Honestly, this seems like the issue doesn't lie with the card. When you overclock older motherboards, it can increase power and voltages to the AGP or PCI slots, and you have to be careful not to. Even if the card wanted to draw more power, it wouldn't crash; it would just get as much as it could. I honestly think there was a much bigger problem at hand.
User runs things out of spec, has issues, blames the manufacturer... yep, sounds about right. Also, it's Nvidia; idk who this No-vidia is. Maybe you should stop buying from them, they sound like a scam copy. /sarcasm
Price looks about right here, which is good: it's slightly less than the 1050 Ti and should perform around the same or even better. Might need to pick one up if they come in low profile.
Ridiric:

User runs things out of spec, has issues, blames the manufacturer... yep, sounds about right. Also, it's Nvidia; idk who this No-vidia is. Maybe you should stop buying from them, they sound like a scam copy. /sarcasm
- Every single AGP card I have ever had was overclocked. In those days it was almost a must-have, very fashionable, to say the least. After this 4800 I had a 5900, which WAS equipped with a connector; that card clocked like craaaazy, got hot as hell, yet was rock-stable.
- If this problem was due to "running things out of spec", the card would not have run smoothly after being properly fed later on. How's that?
- Yeah, it's my sarcasm. What else can one do, confronted with companies whose so-called tech support is next to none and whose policies are pretty much based on an "eat what you've been served or else shut up" formula?
icedman:

Price looks about right here, which is good: it's slightly less than the 1050 Ti and should perform around the same or even better. Might need to pick one up if they come in low profile.
The lack of the Ti moniker bugs me. The moment I buy one out of novelty, or for an HTPC build, they're going to release a 1650 Ti version, and my OCD about having the best of the worst is going to go bonkers.
DLD:

People better BEWARE of the cards that aren't equipped with the (dedicated) power connector. […]
Oh hey look, it's another person who likes to make up names to feel big about themselves! Nice to meet ya! ...not really, though. As for the rest of your statements, it sounds like you were handed a bad set of circumstances, as I have never had this issue across multiple PCs with 75-watt GPUs that don't require additional power plugs, overclocked or not; never one issue. But you're also talking about AGP, which had a maximum power output of around 48 watts and was a little more complicated than PCI Express in terms of how many watts were actually available to the GPU. It seems more likely you had a faulty motherboard, or you were demanding more wattage from the GPU by overclocking it than was available. You're basically warning people not to get low-end, low-power-requirement parts because you tried to treat them as high end and it didn't work out for you. That's a you problem, not a technology problem.
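For anyone curious, the slot limits being argued about here can be turned into a rough back-of-the-envelope check. The rail figures below are approximate, commonly cited spec numbers for illustration only, not datasheet values:

```python
# Rough slot power budgets, built from commonly cited AGP / PCIe rail limits.
# All numbers are approximations for illustration, not exact spec values.

SLOT_RAILS = {
    # slot name -> {rail voltage (V): max current (A)}
    "AGP": {3.3: 6.0, 5.0: 2.0, 12.0: 1.0},  # ~42 W total with these figures
    "PCIe x16": {3.3: 3.0, 12.0: 5.5},       # ~76 W by rails; spec caps it at 75 W
}

def slot_budget_watts(slot: str) -> float:
    """Sum P = V * I over every rail the slot provides."""
    return sum(volts * amps for volts, amps in SLOT_RAILS[slot].items())

def needs_aux_connector(card_draw_watts: float, slot: str) -> bool:
    """True if the card's draw exceeds what the slot alone can deliver."""
    return card_draw_watts > slot_budget_watts(slot)
```

With these figures, a 75 W card (the class the 1650 sits in) just fits within a PCIe x16 slot, while even a ~50 W draw would already exceed an AGP slot's budget, which is the gap the whole argument above turns on.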
Based on the specs, it should sit between the 1050 Ti and the 1060; it will eventually lose to the RX 570, but should be close. The fact that it doesn't even need a PSU cable to power it makes it a very, very good option for low-end builds. I will just ignore the DLD comment, as it makes no sense; I had dozens of friends with low-end cards who had zero issues, actually fewer issues than I had with midrange stuff. XD
TheBigSmooth:

Honestly, this seems like the issue doesn't lie with the card. When you overclock older motherboards, it can increase power and voltages to the AGP or PCI slots, and you have to be careful not to. Even if the card wanted to draw more power, it wouldn't crash; it would just get as much as it could. I honestly think there was a much bigger problem at hand.
Nope, the issue lay with a number of motherboards only supplying the AGP slot with 1.5 V when it required up to 3.3 V with the 8x spec.
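Whatever the exact rail situation was, the arithmetic behind voltage mismatches is simple: for a fixed power draw, the current scales as I = P / V, and slot contacts are rated for current, not power. A tiny sketch, with purely illustrative numbers (not any real card's figures):

```python
def required_current_amps(power_watts: float, rail_volts: float) -> float:
    """Current a rail must carry to deliver a given power: I = P / V."""
    return power_watts / rail_volts

# The same hypothetical 40 W card needs far more current from a lower-voltage rail:
# at 3.3 V it pulls ~12.1 A, but at 1.5 V it pulls ~26.7 A -- over twice as much,
# which is exactly how a lower rail voltage can push slot contacts past their limit.
```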
Astyanax:

Nope, the issue lay with a number of motherboards only supplying the AGP slot with 1.5 V when it required up to 3.3 V with the 8x spec.
Thank you, Astyanax. Yes, it finally turned out that the power drawn through the mobo was not enough for that card; it DEFINITELY needed an extra supply cable, so to speak. Whether or not the mobo itself was unable to deliver the proper current/voltage, I couldn't care less: once the guy (an enthusiast amateur electrician) soldered a 4-pin female Molex onto the card, all issues were resolved. If Nvidia wanted to be sure it was selling an absolutely reliable product, it would have equipped the card with the connector (for "just in case" scenarios). No, they'd rather save a tiny bit (even though it wouldn't significantly affect the price of the card).
DLD:

Thank you, Astyanax. Yes, finally it turned out the power drawn through the mobo was not enough for that card […]
How is a motherboard manufacturer making a board that didn't meet the AGP spec Nvidia's problem? That fault lies directly at the motherboard manufacturer's feet.

And I stand by my previous statement: if you run things in spec and have issues, fair enough; the minute you run things out of spec, there is always a chance you are going to run into problems. This was even more of an issue in the past; as overclocking has become more and more accepted by manufacturers, the chance of things going sideways when running out of spec (assuming you go with sane settings) has decreased. Still, that doesn't change the fact that if it runs fine with the whole system in spec, and then doesn't once even a single part of the system is out of spec, that's not the manufacturer's problem. They make parts to run to a certain spec; if your motherboard doesn't meet that spec, or you run parts out of spec, you are taking things into your own hands.

I'm saying this as someone who currently has their CPU running at 5.2 GHz and their GPU at 2068 MHz with a custom water loop and a delidded processor, so I am well aware of what is typically expected out of a product. It still doesn't change the fact that running something out of spec, like I am, is still running it out of spec.
DLD:

Thank you, Astyanax. Yes, finally it turned out the power drawn through the mobo was not enough for that card […]
So, wait, you're saying that because a motherboard had an issue where it wasn't supplying enough wattage, it's somehow the graphics card's fault for not having external power? ...What? You just don't know when to admit fault, do you? You came in here telling a story about your ancient problem with a GPU that had no external power source, when it had nothing to do with the GPU. You worked around the problem with your MOTHERBOARD by supposedly soldering a Molex connector onto your GPU; you didn't FIX the GPU, you FIXED the issue with the motherboard not supplying what it should have been supplying. What's next, external power sources on the CPU, just in case the motherboard that should be within spec to provide said power doesn't? Maybe our PCI NIC cards should have them too, right? Just in case? Heck, let's just go and put external power sources on everything and not hold the motherboard accountable for its inability to do what it needs to do, why don't we! Sheesh.
The 1650, given its price and performance, will be the worst of the 20 series, along with the RTX 2080 and 2080 Ti.
Look how well the 1050 Ti sells; small + low power is actually very important in this market segment. It'll be one of the top sellers.
Geezus guys..... If you're not interested in buying the card, why complain about it?
BReal85:

The 1650, given its price and performance, will be the worst of the 20 series, along with the RTX 2080 and 2080 Ti.
The 1650 won't be the worst of the 20 series; it's not even part of the "20 series". It's part of the "16 series" according to Nvidia, where, yes, it will be the worst of the three cards in the "16 series".