mersenneforum.org  

2019-03-01, 05:30   #1
axn (Jun 2003)

GTX 1660 Ti

Has anyone bought the new GTX 1660 Ti? Any mfaktc benchmarks? Power usage? Driver/CUDA stability issues?
2019-03-01, 07:27   #2
firejuggler (Apr 2010, Over the rainbow)

At this time I'm thinking of replacing my 750 Ti with either a 1060 or a 1660; both are in the same price range.



https://www.anandtech.com/show/13973...evga-xc-gaming
or
https://www.tomshardware.com/reviews...ring,6002.html
From what I read, it has 5.4 TFLOPS (FP32) compared to 4.4 for the 1060.
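Those figures match the usual peak-FP32 estimate (2 FLOPs per FMA × CUDA cores × boost clock). A quick sanity check; the core counts and boost clocks below are the reference-card specs (assumptions; partner cards such as the EVGA XC clock differently):

```shell
# Peak FP32 TFLOPS = 2 FLOPs/FMA * CUDA cores * boost clock (GHz) / 1000.
# 1536 cores @ 1.770 GHz (1660 Ti) and 1280 cores @ 1.708 GHz (1060 6GB)
# are the NVIDIA reference specs, not measured values.
awk 'BEGIN {
  printf "GTX 1660 Ti: %.1f TFLOPS\n", 2 * 1536 * 1.770 / 1000
  printf "GTX 1060:    %.1f TFLOPS\n", 2 * 1280 * 1.708 / 1000
}'
```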

Last fiddled with by firejuggler on 2019-03-01 at 07:35
2019-03-01, 08:15   #3
nomead ("Sam Laur", Dec 2018, Turku, Finland)

In games etc. the 1660 Ti should be about on par with a GTX 1070 (some are faster, some are slower). Some newer games have reportedly been programmed to take advantage of the concurrent FP32/INT32 execution in Turing, and some others can use FP16, which the Pascal-generation cards weren't particularly good at. Even though the tensor cores were removed, the GTX 1660 Ti has dedicated hardware for FP16. There is some speculation that this is because some AMD-sponsored titles use FP16 extensively, since the Vega cards can run packed FP16 at twice the FP32 rate. So it could be a reaction to one of the few advantages AMD still had.

The one outlier is, again, factoring with mfaktc. The clock speeds seem to be about the same as on the RTX cards, so the core count is all that matters: 20% down from the RTX 2060. I'd still expect it to be a bit faster in mfaktc than a GTX 1080 Ti.

Not planning to buy one, though...
2019-03-01, 13:20   #4
axn (Jun 2003)

Quote:
Originally Posted by nomead View Post
The one outlier is again factoring with mfaktc. The clock speeds seem to be about the same as on RTX cards, so the core count is all that matters. 20% down from RTX 2060... I'd still expect it to be a bit faster in mfaktc than a GTX 1080 Ti.
This is what interests me as well: whether it will perform like an RTX 20-series card or a GTX 10-series card (or somewhere in between).

Doesn't look like Linux drivers are ready yet.
2019-03-01, 15:40   #5
Mark Rose ("/X\(‘-‘)/X\", Jan 2013)

Quote:
Originally Posted by axn View Post
Doesn't look like Linux drivers are ready yet.
Well, Phoronix did Linux benchmarks using driver version 418.43: games and OpenCL.

The INT performance looks promising, at 83% of the RTX 2060's (more than double a GTX 1080's).
2019-03-01, 16:25   #6
axn (Jun 2003)

Quote:
Originally Posted by Mark Rose View Post
Well, Phoronix did Linux benchmarks using driver version 418.43: games and openCl.
OK, looks like a SNAFU on NVIDIA's part. If you go to NVIDIA's driver download page and select the 16 series, it doesn't offer Linux as a platform choice, but if you select the RTX 20 or GTX 10 series, it shows Linux, and on selecting it, it lists the 418 driver, which says it supports the 16 series!
2019-05-04, 19:29   #7
lycorn (Sep 2002, Oeiras, Portugal)

Just wondering if someone has hard figures for the TF (trial factoring) performance of this card. It might be an interesting (read: more affordable while still delivering decent performance) alternative to the 2xxx series.
2019-05-05, 02:32   #8
axn (Jun 2003)

Quote:
Originally Posted by lycorn View Post
Just wondering if someone has hard figures for the TF performance of this card. It might be an interesting (read: more affordable while delivering decent performance... ) alternative to the 2xxx series.
I actually bought this sucker, though I'm not using it for GIMPS TF work. Nonetheless, I can offer some preliminary numbers. These were run on a system with P95 running on all 4 cores, and with the GPU also handling Xorg duties. If you do any screen activity, the numbers will tank. And I'm making no claims that the mfaktc parameters are optimal. YMMV.

barrett76_mul32_gs - 1440 - 1650 - 1690
barrett87_mul32_gs - 1330 - 1530 - 1550

The first number is GHz-days/day at a 70 W power limit, the second at the default 120 W limit, and the third at the max 150 W limit. Personally, I run at the 70 W power limit.
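For anyone wanting to try the same power limits on Linux, nvidia-smi can set them (a sketch; the GPU index 0 and the 70 W value are just this example's assumptions, and the supported range varies by board):

```shell
# Query the board's supported power-limit range first.
nvidia-smi -i 0 -q -d POWER | grep -i 'power limit'

# Cap the board at 70 W (requires root; resets on reboot unless persisted).
sudo nvidia-smi -i 0 -pl 70
```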

GPU is GIGABYTE GEFORCE GTX 1660 TI OC (GV-N166TOC-6GD)
2019-05-05, 04:19   #9
dcheuk (Jan 2019, Pittsburgh, PA)

Quote:
Originally Posted by axn View Post
barrett76_mul32_gs - 1440 - 1650 - 1690
barrett87_mul32_gs - 1330 - 1530 - 1550
Wow, those are pretty impressive numbers given GPU prices nowadays.

Do you guys have any take on NVIDIA's Creator Ready driver vs. the Game Ready driver? My driver's from January; I probably should update, lol.

Last fiddled with by dcheuk on 2019-05-05 at 04:21
2019-05-05, 09:17   #10
lycorn (Sep 2002, Oeiras, Portugal)

@axn: Thanks for your answer. Those are very interesting figures, way higher than I was expecting. Just for completeness, what were the bit levels and exponent sizes used for the benchmark posted?
2019-05-05, 09:32   #11
axn (Jun 2003)

Factor=bla,90027299,75,76 (for barrett76)
Factor=bla,90027299,76,77 (for barrett87)

Note that I did not run these to completion. I just used them as a dummy worktodo for the benchmark, and then eyeballed an average rate from what the program itself was reporting.
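For anyone wanting to repeat the measurement, the setup is roughly this (a sketch: mfaktc reads assignments from worktodo.txt next to the binary; the binary name below is an assumption, so adjust it to however your build is named):

```shell
# Dummy assignments; "bla" stands in for a real assignment key.
cat > worktodo.txt <<'EOF'
Factor=bla,90027299,75,76
Factor=bla,90027299,76,77
EOF

# Run mfaktc and watch the GHz-days/day rate it prints; stop it (Ctrl-C)
# once the rate settles -- no need to run the assignments to completion.
./mfaktc
```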

Some additional info:
OS: Ubuntu 18.04 LTS (4.15.0-48)
Driver: 418.56
CUDA: 10.1
mfaktc: 0.21, compiled for cc 7.5