GPU-Z 2.53.0
Author: e | 2025-04-25
D:\Workloads\winget-pkgs [master ≡ 0 ~1 -0 !]> winget download -m .\manifests\t\TechPowerUp\GPU-Z\2.55.0
Found TechPowerUp GPU-Z [TechPowerUp.GPU-Z] version 2.55.0
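The transcript above downloads the GPU-Z installer from a local manifest in a clone of the winget-pkgs repository; the backslashes in the manifest path were stripped by the page formatting, so the path layout shown below (manifests\t\TechPowerUp\GPU-Z\2.55.0) is an assumption based on the standard winget-pkgs directory structure. TechPowerUp.GPU-Z is the package ID shown in the output. A minimal sketch of the equivalent commands:

# Download from a local manifest in a winget-pkgs checkout (as in the transcript above)
winget download -m .\manifests\t\TechPowerUp\GPU-Z\2.55.0

# Or install the current release straight from the community repository by package ID
winget search "GPU-Z"
winget install --id TechPowerUp.GPU-Z --exact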
Precision OSD doesn't work. Specifically the framerate monitor.

battlelog (New Member, Hollywood, CA) 2019/10/25 20:32:54 (permalink): I just upgraded my GPU to a 2080 Ti FTW3 Ultra today and did a clean install of drivers. I did not uninstall PX1 and reinstall it again; I figured that should be fine. I am also having issues with PX1 ver. 1.0.1: the frame rate graph does not show anything other than 0. Any fix for this? Thanks in advance.
(CPU: Intel i9-10850K | MOBO: ASUS ROG Maximus XII Hero Z490 | GPU: EVGA RTX 2080 Ti FTW3 Ultra, overclocked | RAM: Corsair Dominator Platinum RGB 32 GB 3200 MHz | AIO: Corsair H100i Pro RGB | PSU: Corsair HX1000i | SSD: Samsung V-NAND 970 PRO 2 TB | Case: Cooler Master MC500P | Monitor: ASUS TUF VG27AQ 2560x1440)

bob16314 (CLASSIFIED ULTRA Member, joined 2008/11/07) 2019/10/25 21:33:53 (permalink), replying to battlelog: You have to be running a game/3D app for the framerate to show in the graph; it will show 0 just running on the desktop. Run the render test in GPU-Z and it should register. Open the render test by clicking on the little '?' button to the right of "Bus Interface". See if that works.
(Corsair Obsidian 450D Mid-Tower Airflow Edition | ASUS ROG Maximus X Hero (Wi-Fi AC) | Intel i7-8700K @ 5.0 GHz | 16 GB G.SKILL Trident Z 4133 MHz | Sabrent Rocket 1 TB M.2 SSD | WD Black 500 GB HDD | Seasonic M12 II 750W | Corsair H115i Elite Capellix 280 mm | EVGA GTX 760 SC | Win7 Home / Win10 Home | "Whatever it takes, as long as it works" - Me)

battlelog 2019/10/25 21:36:00 (permalink): I was running the Valley benchmark and it stayed at zero. Will try what you said. Thanks.
This example shows how to use GPU-enabled MATLAB® functions to compute a well-known mathematical construction: the Mandelbrot set. Check your GPU using the gpuDevice function.

Define the parameters. The Mandelbrot algorithm iterates over a grid of real and imaginary parts. The following code defines the number of iterations, grid size, and grid limits.

maxIterations = 500;
gridSize = 1000;
xlim = [-0.748766713922161, -0.748766707771757];
ylim = [ 0.123640844894862,  0.123640851045266];

You can use the gpuArray function to transfer data to the GPU and create a gpuArray, or you can create an array directly on the GPU. gpuArray provides GPU versions of many functions that you can use to create data arrays, such as linspace. For more information, see Create GPU Arrays Directly.

x = gpuArray.linspace(xlim(1),xlim(2),gridSize);
y = gpuArray.linspace(ylim(1),ylim(2),gridSize);
whos x y

  Name      Size         Bytes  Class       Attributes
  x         1x1000        8000  gpuArray
  y         1x1000        8000  gpuArray

Many MATLAB functions support gpuArrays. When you supply a gpuArray argument to any GPU-enabled function, the function runs automatically on the GPU. For more information, see Run MATLAB Functions on a GPU.

Create a complex grid for the algorithm, and create the array count for the results. To create this array directly on the GPU, use the ones function, and specify 'gpuArray'.

[xGrid,yGrid] = meshgrid(x,y);
z0 = complex(xGrid,yGrid);
count = ones(size(z0),'gpuArray');

The following code implements the Mandelbrot algorithm using GPU-enabled functions. Because the code uses gpuArrays, the calculations happen on the GPU.

z = z0;
for n = 0:maxIterations
    z = z.*z + z0;
    inside = abs(z) <= 2;
    count = count + inside;
end
count = log(count);

When computations are done, plot the results.

imagesc(x,y,count)
colormap([jet();flipud(jet());0 0 0]);
axis off

See Also: gpuArray
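The listing above is the stock MathWorks example. As a small, hedged extra (not part of the original listing), you might confirm that a supported GPU is actually visible before running it, and copy the result back to host memory afterwards. gpuDevice, gpuDeviceCount, and gather are standard Parallel Computing Toolbox functions; the variable name count is the one used in the example.

% Sketch only: confirm a GPU is available before running the Mandelbrot example
d = gpuDevice;                                   % errors if no supported GPU is present
fprintf('Using %s (device %d of %d)\n', d.Name, d.Index, gpuDeviceCount);

% ... run the example above to fill the gpuArray "count" ...

% Copy the result back to host (CPU) memory for saving or further processing
countHost = gather(count);
class(countHost)                                 % 'double' rather than 'gpuArray'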
Top Downloads in Video Card Tools

EVGA Precision X1 allows you to make fine-tuned adjustments to your graphics card, including GPU clock offsets, memory clock offsets, fan speed, voltage, and more.

GPU-Z 2.64.0 [2025-02-26 | 11 MB | Freeware | Windows 11 / 10 / 8 / 7]: a lightweight freeware video card utility designed to give you all the information about your video card and GPU. A portable version is also available.

Video Memory Stress Test allows you to test your video RAM thoroughly for errors and faults.

NVIDIA Inspector is a small tool that displays hardware information for NVIDIA-based graphics cards; principal features include hardware monitoring, core/shader/memory clock tuning, and fan speed adjustment.

MSI Afterburner is a handy overclocking utility for graphics cards that includes features like GPU/shader/memory clock adjustment, advanced fan speed control, and GPU voltage control. A beta version is also available.
Overclocking EVGA GeForce GTX 1660 Ti XC Ultra GAMING

nickv0947 (New Member, joined 2015/11/30): Hello, I'm considering overclocking my GPU, and I have a few questions. What does the "XC" stand for in "EVGA GeForce GTX 1660 Ti XC Ultra GAMING"? Does that mean it's already overclocked? Has anyone overclocked this GPU using Precision X1 or MSI Afterburner? Do you know the best settings (memory clock, GPU voltage, GPU clock, GPU temp target, power target, and/or fan speed)? Thank you, Nick. See attached. EVGA Part Number: 06G-P4-1267-KR (post edited by nickv0947, 2020/03/09 17:37:53)

Sajin (EVGA Forum Moderator, Texas, USA) 2020/03/09 19:42:57 (permalink), marked helpful by nickv0947: Yes, the card is overclocked out of the box. Nobody can tell you what the best settings are, as all GPUs will overclock differently.

nickv0947 2020/03/09 21:56:51 (permalink): I set my GPU clock to +140 and memory clock to +1300. Using the Superposition benchmark, my score went up from 11688 to 12718 (a gain of over 1000). I'll keep it like that for a while and test a few games.

gamernut78 (iCX Member) 2020/03/17 09:23:30 (permalink): Why would you overclock this card if it's already overclocked out of the box? How is your gaming performance after overclocking it? What does it compare to when it comes to PC gaming? Does it behave about as well as a 1080 Ti? (Signature: Please help me earn some bucks to save money in the future as a team player and gamer! My Rewards Program Code: 6JD2JTSGVP)

coolmistry (CLASSIFIED Member, Hemel Hempstead, London) replied on 2020/03/17 10:08:56 (permalink), quoting gamernut78's post above.
#7: All the temps are good. This "Auxiliary" must be the same thing I have under another name, which shows -128 °C on my system. Stay around and let's see if someone else uses the same board and can confirm your readings.

OK, thanks... My motherboard is an ASUS P8H61-M Pro.

#8: Nice little board; you could do some overclocking with her.

#9: I'm quite worried about my Temperature #1 reading. And since you were waiting for somebody with a similar system, I thought I'd post (readings appear to be value / min / max in °C):

+- ASUS P8Z68-V LX (/mainboard)
|  +- Nuvoton NCT6776F (/lpc/nct6776f)
|  |  +- Temperature #1 : 78.5  26.5  86   (/lpc/nct6776f/temperature/1)
|  |  +- Temperature #2 : 56    50.5  59   (/lpc/nct6776f/temperature/2)
|  |  +- Temperature #3 : 32    30    32   (/lpc/nct6776f/temperature/3)
+- Intel Core i7-2600K (/intelcpu/0)
|  +- CPU Core #1 : 41  38  65   (/intelcpu/0/temperature/0)
|  +- CPU Core #2 : 41  38  68   (/intelcpu/0/temperature/1)
|  +- CPU Core #3 : 40  37  65   (/intelcpu/0/temperature/2)
|  +- CPU Core #4 : 35  31  58   (/intelcpu/0/temperature/3)
|  +- CPU Package : 41  38  68   (/intelcpu/0/temperature/4)
+- NVIDIA GeForce GTX 570 (/nvidiagpu/0)
|  +- GPU Core : 54  54  57   (/nvidiagpu/0/temperature/0)

Any ideas / opinions, guys?

#10: You're on the stock cooler; I would suggest using an aftermarket one for the CPU, like the Evo 212, with MX-2 or MX-4 as cooling paste (they do not need cure time).

#11: I have similar temperatures, no problems so far:

+- ASUS P8H67-M PRO (/mainboard)
|  +- Nuvoton NCT6776F (/lpc/nct6776f)
|  |  +- CPU Core : 47.5  45.5  50   (/lpc/nct6776f/temperature/0)
|  |  +- Temperature #1 : 73.5  50.5  83   (/lpc/nct6776f/temperature/1)
|  |  +- Temperature #2 : 90.5  88    96   (/lpc/nct6776f/temperature/2)
|  |  +- Temperature #3 : 37    37    38   (/lpc/nct6776f/temperature/3)
+- Intel Core i5-2400 (/intelcpu/0)
|  +- CPU Core #1 : 53  49  59   (/intelcpu/0/temperature/0)
|  +- CPU Core #2 : 54  52  61   (/intelcpu/0/temperature/1)
|  +- CPU Core #3 : 55  51  59   (/intelcpu/0/temperature/2)
|  +- CPU Core #4 : 53  49  58   (/intelcpu/0/temperature/3)
|  +- CPU Package : 58  55  61   (/intelcpu/0/temperature/4)
+- NVIDIA GeForce GTX 570 (/nvidiagpu/0)
|  +- GPU Core : 59  58  60   (/nvidiagpu/0/temperature/0)

I looked up the data sheet of the chip, and the specifications say the operating temperature is T = 0 °C to +70 °C (well, we're 20 degrees over!), with a storage temperature of -55 to +150 °C. Source: page 416.
OldVersion.com: Windows » Utilities » GPU-Z » GPU-Z 0.5.9

GPU-Z 0.5.9 | 7,496 downloads | File size: 0.97 MB | Date released: Feb 13, 2012
Works on: Windows 2000 / XP / XP x64 / Vista / Vista x64 / 7 / 7 x64 / 8 / 8 x64
Doesn't work on: Windows 3.1 / 95 / 98 / ME
Official website: TechPowerUp
Tested: free from spyware, adware and viruses

GPU-Z 0.5.9 change log:
* Added support for AMD Radeon HD 7750 and HD 7770
* Added voltage monitoring for HD 7950 and HD 7970
* Fixed memory size readings for ATI cards with large VRAM
* Improved formula for NVIDIA ASIC quality reading
* Added explanation text to ASIC quality window
* Fixed bug that caused the updater to show up even though no update was available, lagging GPU-Z
* When a multi-GPU setup is detected, the PCIe load test will recommend full screen
* Added board ID to BIOS version readout
* Added option to show a sensor reading in the GPU-Z title (click the arrow next to the sensor name)
* "Refresh sensors in background" now defaults to enabled
* Fixed release date for HD 7950
* Added PCI vendor Packard Bell
* Fix for ATI hardware access breaking on Catalyst 12.1
* Added fan RPM monitoring support on some ATI cards
* Added GF108-based GT 520, GTX 555 (non-mobile), GeForce 305M, 610M

Other GPU-Z 0.x builds: 0.8.5, 0.6.7, 0.6.6, 0.6.5, 0.6.4, 0.6.3, 0.6.2, 0.6.1, 0.6.0, 0.5.8, 0.5.7, 0.5.6, 0.5.5, 0.5.4, 0.5.3, 0.5.2, 0.5.1, 0.5.0, 0.4.9, 0.4.8, 0.4.7, 0.4.6, 0.4.5, 0.4.4, 0.4.3, 0.4.2, 0.4.0, 0.3.9, 0.3.8, 0.3.7, 0.3.6, 0.3.5, 0.3.4, 0.3.3, 0.3.2, 0.3.1, 0.3.0, 0.2.9, 0.2.8, 0.2.7, 0.2.6, 0.2.5, 0.2.4, 0.2.3, 0.2.2, 0.2.1, 0.0.9, 0.0.7

OldVersion.com provides free software downloads for old versions of programs, drivers and games.