
Topic summary

Posted by juan
 - December 24, 2013, 00:42:42
This game did not recognize my AMD Radeon HD 7670M graphics card; instead it ran only on the Intel HD Graphics 4000, which gave me disappointing performance, so I decided to return the game. My recommendation would be to buy this game only once that compatibility problem is fixed, because otherwise it will be a waste of money.
Posted by Nuseeker
 - December 16, 2013, 17:23:03
This is one of the first games specifically designed to take advantage of the new Haswell Intel CPUs (the models whose numbers start with 4). I ordered an Intel Core i7-4700 (Haswell) yesterday and will let you know if it makes a difference. I am currently running a 3.4 GHz Sandy Bridge Core i7, and there is clearly a bottleneck there. I would like to see a retest with a Haswell CPU; I don't think dual GPUs will make any difference.
Posted by EvilJ
 - December 12, 2013, 17:04:51
If you leave "unlimited video memory" unchecked, your selected graphics settings may not be applied when the game loads: it will automatically downgrade certain graphics settings to reach a target frame-rate range. For a test like this, you need to check that box so that your settings are actually applied.
Posted by GilbertRoyAlva
 - October 19, 2013, 18:58:52
I'm playing the game on Intel HD Graphics 3000 at medium settings, getting 5 - 24 fps: textures, unit detail, and terrain at medium or high; shadows and particles still at low.
Posted by David K
 - September 17, 2013, 10:45:25
This game is rather CPU-bound and actually becomes CPU-bottlenecked with anything above a 780M or a 660 Ti (desktop). You can clearly see that because the frame rates at lower resolutions on the i7-3770K + 660 Ti are higher than those on the i7-2700K + 680. The high-resolution difference of 4 fps is also very small considering how much more powerful the 680 is than the 660 Ti in other games. I am guessing that dual-GPU setups will not see very big gains.
Posted by jack smith
 - September 14, 2013, 14:09:09
From what I've heard, I'm not sure Rome II supports dual GPUs yet.
Posted by HF
 - September 14, 2013, 09:46:18
Yeah, I agree, it would be cool to see some dual CrossFire or SLI cards. Also, you guys should run this game on a 4K screen and see how a dual desktop-card setup handles it ^^ If only those cheap 4K screens came with DisplayPort :/
Posted by Paul Anthony Soh
 - September 13, 2013, 03:10:31
Heya NBC, might I suggest you start including dual-GPU notebooks in your benchmarking articles? If one GPU, say the GTX 680M or the GTX 780M, can't run a game fluently, then how about two? Since dual GPUs are increasingly common these days, there shouldn't be any reason not to take your testing to the next level.

My 2 cents!
Posted by Redaktion
 - September 13, 2013, 00:16:25
Fun or just frustrating? Following the current trend, Creative Assembly and SEGA released their new epic strategy game before it was quite ready. Gamers are registering complaints about performance issues, texture problems and other sorts of bugs. You can find out here how well various notebook GPUs get along with Rome II.

http://www.notebookcheck.net/Total-War-Rome-II-Benchmarked.101446.0.html