Eyefinity FAQ
The purpose of this thread is to help clarify the prerequisites and configuration steps needed for ATI Eyefinity Technology. If you have a question that isn't answered in the following list, please post it and we'll try to answer it for you.
But Passive DisplayPort Adapters are in ATI's Validated Dongles list
I'm still a little confused. :nuts:
If passive DisplayPort adapters can't generate the needed timing signal, how did several passive adapters end up on ATI's Validated Dongles list? http://support.amd.com/us/eyefinity/...y-dongles.aspx

It would be lovely to be able to pick up an approved adapter from Amazon for under $25 and then add a little something to get the free shipping... http://www.amazon.com/Accell-B087B-0...dp/B001CXXD2Q/

I was happy to find the list, but how is it to be used? Are the passives only useful for the Eyefinity 6 boards (which supposedly come with passive adapters)? The wording on the dongle page also suggests that you could use two DVI monitors in a dual-display Eyefinity setup; it makes it sound like you only need an active adapter or a native DisplayPort monitor to step up to three screens. (Playing a game with a big swath of bezel right down the center sort of kills the dual-screen option, but it is "technically" an option...)
I was also wondering how those passive adapters got on the list.
I am enlightened, thanks to caveman-jim!
caveman-jim -
I finally understand the DisplayPort situation, thanks to your post here: http://www.rage3d.com/board/showthread.php?p=1336238523

Your explanation was simple and thorough, so I'll quote it here for those in need:

"DVI and HDMI signals require timing signals generated by internal clocks on the card. DVI and HDMI displays need timing signals to display the picture. DisplayPort-capable displays do not need timing signals. To save space, cost and complexity, there are two internal timing signal generators in the ASIC of the AMD ATI Radeon HD 5000 series. This limits the number of outputs with timing signals to two. To enable the third display, you must use the card's DisplayPort connector. If you use an adapter to convert DisplayPort into DVI or HDMI, you must add the timing signals to the output, too. This is done by using an active DisplayPort adapter."

Thanks again for explaining the hardware design and its impact...
FINALLY!!!!
Yes! I finally got a stable working Eyefinity setup a few hours ago. It's been a total of 3 hours now and I have not had even the first flicker/blank screen. Here's how it came about:

I saw that Office Depot had another ViewSonic VX2433wm LCD just like the two I recently got. The first two I was able to get on sale for $199.99 USD plus tax. The one at Office Depot today was the last one they had on hand and was their display model, so I was able to get it for $179.99 USD plus tax. Then I made a trip up to Fry's to pick up a passive DP -> VGA adapter. The model I got is a Monster-brand unit and cost $49.99 USD plus tax, considerably less than the "iffy" active adapters like the one I had to take back for a refund.

Once I got home, it took less than 5 minutes to have Eyefinity up and running. Since all three LCDs are the exact same model, the image color consistency is excellent, AND the Bezel Compensation works without a hitch. I had to spend more time simply getting all three monitors leveled up because the mounts aren't exactly consistent.

So far, these games and a benchmark are working just fine:
1. Dragon Age: Origins
2. Star Trek Online
3. Heaven 2.0 Benchmark

I'm having problems with Far Cry 2 for some reason. It doesn't want to "see" the Eyefinity resolution like it was doing before. I had no trouble previously, but now it just puts the game in 3 instances across the three monitors and the game is 'squished' toward the middle. I would like some suggestions on what needs to be done to fix this, please.
Glad you got it working! :up:
For Far Cry 2, I believe you have to edit the .ini file to get the resolution selected. It works fine with the benchmark tool.
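For anyone else hitting this, here is a minimal sketch of the kind of edit being described, under the assumption that Far Cry 2 keeps its display settings in GamerProfile.xml (under Documents\My Games\Far Cry 2) and that the attribute names below are correct. Treat the file name and attributes as assumptions and verify them against your own install before changing anything.

Code:
<!-- Hypothetical excerpt of GamerProfile.xml; the location and attribute
     names are assumptions to check against your own file. ResolutionX is
     the full Eyefinity width, e.g. 3 x 1920 = 5760, or the bezel-corrected
     width CCC reports if Bezel Compensation is enabled. -->
<RenderProfile ResolutionX="5760" ResolutionY="1080" Fullscreen="1" />

Make a backup copy of the file first, since the game may rewrite it when settings are changed from the in-game menus.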
EDIT: OK... FC2 problems again. I was able to play the previous games with Bezel Compensation working. I disabled that and tried FC2 again and it worked... AFTER changing the resolutions around. So, I went back and enabled Bezel Comp again, and now FC2 refuses to see anything above 1920x1080. More work to be done...

UPDATE: I did get it to work again, with Bezel Comp enabled, too. But now, for some reason, the menu options are stretched across all three screens. When it worked previously for me, the options screen was restricted to the center screen.

AND one more game tested: Mass Effect 2. Although it works, I can't zoom out from the character, nor does my 360 controller work.
Another question:
When I view the new 3DMark 11 video in Full Screen, it makes my two side screens go pure white. Is there any way to configure it so those screens go black instead? http://www.futuremark.com/benchmarks/3dmark11/teaser/
:confused: My additional panels blank to black or go into power save. Don't know why yours are white. What player are you using?
Which "latest" player: the latest general release or the latest GPU-accelerated beta? And if it's the Flash player in a browser, which browser?
"You have version 10,1,53,7 installed" is what Flash Player tells me.
I'm using FF 3.6.3 for the browser.

Edit: Just installed the latest downloadable public release. The side screens still turn white when I use Full Screen for the video. "You have version 10,0,45,2 installed" NOTE: Isn't this a step backwards, though?

Further NOTE: FWIW, I did download and install the latest pre-release. Still white screens on the sides.
Another question:
I'm wondering why the center image is being extended onto the side screens by a few inches. It's not usually a problem, but in some cases, when in a game, the subtitles [which I have to have due to deafness] get extended through the bezels, too. Is there any way to restrict the center screen to the actual 1920x1080? It appears that Eyefinity is actually cutting off some of the vertical portion of the image and then expanding it. Does this make sense?
Huh?
Bezel compensation hides whatever falls behind the bezels, so that would be working normally. If you don't have BC enabled, nothing gets chopped off, but the image is of course stretched wider. I'd imagine that the Windows login screen ignores bezel compensation.
The Windows logon screen does indeed use bezel compensation; what you're seeing there is how large that screen scales. Windows logon maintains the aspect ratio unless you specifically edit the registry to tell it to stretch or fill, leaving best fit as the default option.
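The post above doesn't name the registry location, and I haven't verified one for the logon screen specifically. As a hedged pointer only: the pre-logon environment is often said to read the .DEFAULT profile's desktop settings, so a sketch of that edit would look like the following. Treat the path, and the idea that it affects logon background scaling at all, as assumptions to test carefully; the WallpaperStyle values themselves (0 = center, 2 = stretch, 6 = fit, 10 = fill) are the standard Windows ones.

Code:
Windows Registry Editor Version 5.00

; Assumption: the logon/pre-logon desktop honors the .DEFAULT profile's
; wallpaper settings. Export/back up this key before changing it.
[HKEY_USERS\.DEFAULT\Control Panel\Desktop]
"WallpaperStyle"="2"
"TileWallpaper"="0"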
OK.
So, how would I configure a game to restrict the image to the center screen so that it doesn't "bleed" over onto the side screens? Or is there a way at all? Like I said above, it's not normally a problem except when text/subtitles extend through the bezels. Although I can quickly make heads or tails of what is being said, it can be a distraction in the game. The specific game I encounter this in is "Alpha Protocol," but it happens with any game when the text is long enough to go past the center screen.
The game has to be updated with the Eyefinity SDK to use the driver hooks that locate the bezels and the center display, so it can center, left/right-justify, or float important items.
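To make that concrete, here is a minimal illustrative sketch in C++ of what an Eyefinity-aware game does with that layout information. The struct and function names are hypothetical stand-ins, not the real Eyefinity SDK/ADL calls; the point is only the technique: query the full spanned surface plus the center display's rectangle, then clamp subtitle/HUD placement to the center rectangle instead of the whole surface.

Code:
// Hypothetical sketch: the types and queryEyefinityLayout() below are
// illustrative stand-ins, not actual Eyefinity SDK / ADL API names.
#include <algorithm>
#include <cstdio>

struct Rect { int x, y, w, h; };            // a region in desktop pixels

struct EyefinityLayout {
    Rect fullSurface;                       // the whole 3x1 spanned surface
    Rect centerDisplay;                     // visible area of the middle panel
};

// Stub with example numbers for three 1920x1080 panels side by side;
// a real title would fill this in from the vendor SDK at startup.
EyefinityLayout queryEyefinityLayout()
{
    return { { 0, 0, 5760, 1080 },          // full spanned surface
             { 1920, 0, 1920, 1080 } };     // middle panel only
}

// Place a subtitle of the given size so it stays on the center panel
// instead of spilling across the bezels onto the side screens.
Rect placeSubtitle(const EyefinityLayout& lay, int textW, int textH)
{
    const Rect& c = lay.centerDisplay;
    Rect r;
    r.w = std::min(textW, c.w);             // never wider than the center panel
    r.h = textH;
    r.x = c.x + (c.w - r.w) / 2;            // centered on that panel
    r.y = c.y + c.h - r.h - 40;             // a little above the bottom edge
    return r;
}

int main()
{
    EyefinityLayout lay = queryEyefinityLayout();
    Rect sub = placeSubtitle(lay, 2400, 60); // text wider than one panel
    std::printf("subtitle at %d,%d size %dx%d\n", sub.x, sub.y, sub.w, sub.h);
    return 0;
}

A game without these hooks only sees one big surface, which is why long subtitle lines end up spanning the bezels as described above.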
Added Eyefinity Surround Sight info video and information about HydraGrid with Eyefinity.
|
Why, all of a sudden, does CCC tell me that Bezel Compensation won't be optimized because my screen sizes and resolutions are not being detected as identical?
All three of my screens are the exact same make and model, and this prompt has only started showing up in the last week or so. However, once I click OK, it goes ahead with the Bezel Comp option and I can move along. Is this something AMD has added to allow everyone to use Bezel Comp even if they have non-identical screens?
I'm not aware of any changes; I'll find out. Thanks for bringing it up.
Got some answers for you; check it out here.
I wish AMD supported resolutions higher than the desktop and the ability to map any user-defined region to any output device/monitor...
Example: render a 3-monitor widescreen view, plus an extra monitor for a mirror, another monitor for the instrument panel, and another for the map. Before you say it's not possible: it's been done, and it has been possible for at least 4 or 5 years (and that was by an indie coder via a software tool).
That's explicitly supported by the AMD Eyefinity SDK, I believe. For that scenario you would be running a 3x1 SLS plus two additional monitors; if the game supported it through the SDK, no problem.
What I have in mind is mapping part of the total area to a monitor, etc. AFAIK that wasn't possible with the Eyefinity SDK... but maybe it has been updated since.
Refresh rate lock in Eyefinity setting
Hey, I have been trying to change my desktop refresh rate from 59/60 Hz to 75 Hz, but I haven't found a way to make the change with an Eyefinity setup.
I have tried ATI Tray Tools and PowerStrip, but without any success. It's on my Windows 7 64-bit platform that I would like to enable it. Any good ideas on how to do so?
Do all your panels support 75 Hz at the resolution each is using in the Eyefinity group?
I'm not sure if the problem is that I have 2 types of monitors :confused: