HOWTO: 10 bpc on a laptop running Linux

Updated the initramfs (several times, since its output somehow didn't include the EDID binary file; included in the dump). Also dumping a filtered syslog and the full Xorg.0.log.
Great! I realized that the custom EDID file wasn't being loaded as part of the initramfs. I tried rebuilding the initramfs several times between reboots, hoping it would be picked up (as you did), but it wasn't. As the Arch Wiki suggested, I added a FILES line (and updated the initramfs, GRUB etc., then rebooted several times), but that did nothing.

Later I found out that the FILES array is an Arch Linux-specific feature, not available on Debian (or Ubuntu). Debian-based systems instead use hooks to include files in the initramfs [1]. I set up a hook that includes the EDID firmware file during initramfs creation.

[1] Ubuntu Manpage: initramfs-tools - an introduction to writing scripts for mkinitramfs
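As a concrete illustration, a minimal initramfs-tools hook might look like the sketch below. The hook location (/etc/initramfs-tools/hooks/edid) and the firmware path are assumptions; copy_file is the helper that hook-functions provides for this purpose.

```shell
#!/bin/sh
# Sketch of an initramfs-tools hook, e.g. /etc/initramfs-tools/hooks/edid
# (make it executable). Hook name and EDID path are assumptions.
PREREQ=""
prereqs() { echo "$PREREQ"; }
case "$1" in
    prereqs) prereqs; exit 0 ;;
esac

# hook-functions provides copy_file; DESTDIR is set by mkinitramfs while
# hooks run, so this branch is skipped when the script is run by hand.
if [ -n "${DESTDIR-}" ] && [ -r /usr/share/initramfs-tools/hook-functions ]; then
    . /usr/share/initramfs-tools/hook-functions
    # Copy the blob into the initramfs under the same path, so a kernel
    # parameter like drm.edid_firmware=edid/custom_edid.bin can find it.
    copy_file firmware /lib/firmware/edid/custom_edid.bin
fi
```

After adding a hook like this, rebuild with `update-initramfs -u` and check that the file shows up in the build output.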


The good part: unlike previous attempts, the output during initramfs creation did include the EDID file. The horrible part: the system refused to boot.

Yes, I bricked the computer.

It fails to load the initramfs right after selecting the OS in GRUB. Now I have to get a live system ready and chroot into the installation to rebuild a sane initramfs.
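For the record, the recovery goes roughly as below. Everything here (device names, the hook path) is an assumption to be adapted, so it is wrapped in a function rather than run directly.

```shell
#!/bin/sh
# Sketch of un-bricking from a live system. /dev/sda2 and the hook name
# are assumptions; substitute your actual root partition and faulty hook.
recover_initramfs() {
    mount /dev/sda2 /mnt            # the installed system's root partition
    for fs in dev proc sys; do
        mount --bind "/$fs" "/mnt/$fs"
    done
    chroot /mnt /bin/sh -e -c '
        rm -f /etc/initramfs-tools/hooks/edid   # drop the faulty hook
        update-initramfs -u -k all              # rebuild a sane initramfs
        update-grub
    '
}
```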

To include the EDID file in the initramfs, I needed one of the proper file-inclusion methods. Instead, I had used the helper function prepend_earlyinitramfs mentioned in that man page, which does something else entirely.

Unbricked the laptop.

Injected the EDID file into the initramfs successfully (see the lsinitramfs -l result). Dumping Xorg.0.log, a filtered syslog, parameters of the relevant kernel modules, and PTB output.
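To double-check an injection like this, listing the initramfs contents and grepping for the blob works. The initrd path below follows the Debian naming convention, and the 'edid' substring is an assumption about the file name.

```shell
#!/bin/sh
# Check whether the EDID blob made it into the current kernel's initramfs.
verify_edid_in_initramfs() {
    lsinitramfs -l "/boot/initrd.img-$(uname -r)" | grep -i 'edid'
}
```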
Responding inline.


---In PSYCHTOOLBOX@yahoogroups.com, <mario.kleiner@...> wrote :

XXX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :
I think i'll have to look into a better solution for this. One that bypasses the need to get the EDID into initramfs, e.g., deferring kms driver init by a few seconds. Simply switching the video mode once, e.g. from 60 Hz video refresh to 50 Hz and back to 60 Hz after login, or blanking the display for a second would very likely have achieved the same result from what i could see from the logs. A bit of a hack, but plug & play and much less potential for anything going wrong.

> Since xrandr is easily available, this should be a much easier solution, if it works. Also, since Gnome/Unity calls xrandr internally to control the display, we might even be better off suggesting that users use the System Settings / Display tool to change the resolution/refresh rate and then change it back. Alternatively, we should also be able to do it directly via a system call right from within Psychtoolbox, right? Many games can change resolution back and forth.
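The mode-switch workaround can also be scripted directly with xrandr. The output name (eDP1), mode and rates below are assumptions for this particular panel; check `xrandr -q` for the real ones.

```shell
#!/bin/sh
# Toggle the refresh rate once to force a modeset, then restore it.
# Output name, mode and rates are assumptions; adapt to `xrandr -q`.
toggle_refresh() {
    out=${1:-eDP1}
    xrandr --output "$out" --mode 1920x1080 --rate 50
    sleep 1
    xrandr --output "$out" --mode 1920x1080 --rate 60
}
```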

You would probably get some slight colorization for every 2nd intensity step, when the finer resolved red+green intensity would be 1 step higher than blue -> Some slight yellowish tint away from pure gray. It's a bit like bitstealing, where you increase "grayscale" resolution but pay with some slight colorization where there shouldn't be any. If this really matters in your case, or is perceptible at such small increments, or how it interacts with spatial dithering, i don't know.

> This may be worth trying. Last time, when I was using llvm-based software rendering, I had colorful pixels where there were supposed to be only grayscale pixels, and it was obvious enough that it could ruin an experiment. I'm not sure how salient the colorization would be in the hardware rendering case, though.

-> The "16 bit" mode, which if it works at all would actually be a dithered 12 bit mode, shouldn't have that problem, as all color channels are resolved at the same precision. As i said, the > 10 bpc modes are not tested under realistic conditions as i lacked the neccessary hardware, so this would the first realistic test to validate it. The "help PsychImaging" points you to a special xorg.conf file bundled with PTB that enables that mode, and it describes the conditions that need to be satisfied for that mode to work at all. Procedure is as usual -> copy xorg.conf file into place, logout/login, test. To undo, delete xorg.conf file, logout/login again.

I enabled the xorg.conf file, then logged out. Trying to log back in wasn't possible, since even lightdm couldn't start. From TTY1, I found that even xinit could not start. According to Xorg.0.log, radeon was refusing to use a 30 bpc frame buffer. I tried various values (8, 10, 24, 30, 32 etc.), and the only one with which X starts is 24. The kernel module radeon has deep_color enabled and a vm_block_size of 12, so I am not sure why it refuses a DefaultDepth of 30. Attaching Xorg.0.log right after a failed xinit, the xorg.conf file and the radeon module parameters.

-> Given the unknown spectral color emission profile or other non-linearities of your LCD panel, or view angle dependence of the panel, or how well dithering interacts with the 10 bpc panel, or what dithering strategy the firmware of your Laptop would choose, and how much that strategy was optimized for the specific panel, and all the variability on the receiver side, the best you can do is try. Obviously you would start off verifying if 10 bpc works as expected.

I asked around in our department and wasn't able to find anyone with a colorimeter, so I may not be able to measure the per-channel increments on a custom-calibrated display. We do have a photometer, so I suppose I can measure with a factory-provided ICC profile.

-> I just want that one or two lines of PTB's debug output in octave's command window, which should happen in response to the 'ConfigureDisplay' call. Something with "PTB-INFO: SetDitherMode" in it. All other output doesn't matter anymore, now that we know it is fine. There's a warning('off', '') setting btw. that will suppress that pointless "shadows a core library function" clutter. You can put it at the top of the ~/.octaverc startup file. Again, your message was truncated for exceeding 64 KB before any interesting info showed up.
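The setting Mario refers to would go into ~/.octaverc. The warning ID below is an assumption based on current Octave releases; verify it against the warning your Octave actually prints.

```octave
% In ~/.octaverc: silence "function ... shadows a core library function".
% Warning ID is assumed; check it on your Octave version.
warning('off', 'Octave:shadowed-function');
```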

hyiltiz@HP-ZBook-17-G2 ~> octave --no-gui --eval "w=Screen('Openwindow',0); Screen('ConfigureDisplay', 'Dithering', 0, 1); sca;"
warning: function ./Speak.m shadows a core library function
...
PTB-INFO: SetDitherMode: Trying to enable digital display dithering on display head 0.
PTB-INFO: SetDitherMode: Dithering off. Enabling with userspace provided setting 1. Cross your fingers!
...

-> Btw. If your experiments involve text/letter stimuli, some slight trickery in your script will be needed to take advantage of the precision. All our current text renderers only operate at 8 bpc for text color when drawing into the framebuffer. That's a software limitation of our current text rendering plugin and a hard limitation of the older legacy text renderers.

Alas, sad to know. I looked around, and it seems no text renderer has 10 bpc support. So, I should draw the text, convert it to a binary image matrix, scale it by the desired contrast value (not using color) for the 10 bpc frame buffer, and then present the image with PsychImaging?

-mario

~> 
octave --no-gui --eval "w=Screen('Openwindow',0); Screen('ConfigureDisplay', 'Dithering', 0, 1); sca;"
warning: function ./Speak.m shadows a core library function
warning: function /usr/share/octave/packages/plot-1.1.0/zoom.m shadows a core library function
warning: function /home/hyiltiz/Documents/Psychtoolbox/Quest/QuestTrials.m shadows a core library function


(Message over 64 KB, truncated)
According to xdpyinfo, neither 30 nor 10 is among the available depth values.

root@HP-ZBook-17-G2 /h/hyiltiz# xdpyinfo -display :0
name of display:    :0
version number:    11.0
vendor string:    The X.Org Foundation
vendor release number:    11804000
X.Org version: 1.18.4
maximum request size:  16777212 bytes
motion buffer size:  256
bitmap unit, bit order, padding:    32, LSBFirst, 32
image byte order:    LSBFirst
number of supported pixmap formats:    7
supported pixmap formats:
    depth 1, bits_per_pixel 1, scanline_pad 32
    depth 4, bits_per_pixel 8, scanline_pad 32
    depth 8, bits_per_pixel 8, scanline_pad 32
    depth 15, bits_per_pixel 16, scanline_pad 32
    depth 16, bits_per_pixel 16, scanline_pad 32
    depth 24, bits_per_pixel 32, scanline_pad 32
    depth 32, bits_per_pixel 32, scanline_pad 32
keycode range:    minimum 8, maximum 255
focus:  window 0x4a0000b, revert to Parent
number of extensions:    29
    BIG-REQUESTS
    Composite
    DAMAGE
    DOUBLE-BUFFER
    DPMS
    DRI2
    GLX
    Generic Event Extension
    MIT-SCREEN-SAVER
    MIT-SHM
    Present
    RANDR
    RECORD
    RENDER
    SECURITY
    SGI-GLX
    SHAPE
    SYNC
    X-Resource
    XC-MISC
    XFIXES
    XFree86-DGA
    XFree86-VidModeExtension
    XINERAMA
    XInputExtension
    XKEYBOARD
    XTEST
    XVideo
    XVideo-MotionCompensation
default screen number:    0
number of screens:    1

screen #0:
  dimensions:    1920x1080 pixels (508x285 millimeters)
  resolution:    96x96 dots per inch
  depths (7):    24, 1, 4, 8, 15, 16, 32
  root window id:    0x498
  depth of root window:    24 planes
  number of colormaps:    minimum 1, maximum 1
  default colormap:    0x20
  default number of colormap cells:    256
  preallocated pixels:    black 0, white 16777215
  options:    backing-store WHEN MAPPED, save-unders NO
  largest cursor:    128x128
  current input event mask:    0x7a803f
    KeyPressMask             KeyReleaseMask           ButtonPressMask          
    ButtonReleaseMask        EnterWindowMask          LeaveWindowMask          
    ExposureMask             StructureNotifyMask      SubstructureNotifyMask   
    SubstructureRedirectMask FocusChangeMask          PropertyChangeMask       
  number of visuals:    480
  default visual id:  0x21
  visual:
    visual id:    0x21
    class:    TrueColor
    depth:    24 planes
    available colormap entries:    256 per subfield
    red, green, blue masks:    0xff0000, 0xff00, 0xff
    significant bits in color specification:    8 bits
  visual:
    visual id:    0x22
...


XXX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :

Responding inline.


XXX---In PSYCHTOOLBOX@yahoogroups.com, <mario.kleiner@...> wrote :

XXX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :
I think i'll have to look into a better solution for this. One that bypasses the need to get the EDID into initramfs, e.g., deferring kms driver init by a few seconds. Simply switching the video mode once, e.g. from 60 Hz video refresh to 50 Hz and back to 60 Hz after login, or blanking the display for a second would very likely have achieved the same result from what i could see from the logs. A bit of a hack, but plug & play and much less potential for anything going wrong.

> Since xrandr is easily available, this should be a much easier solution, if it works. Also, since Gnome/Unity calls xrandr internally to control the display, we might even be better off suggesting that users use the System Settings / Display tool to change the resolution/refresh rate and then change it back. Alternatively, we should also be able to do it directly via a system call right from within Psychtoolbox, right? Many games can change resolution back and forth.

--> PTB supports that.

You would probably get some slight colorization for every 2nd intensity step, when the finer resolved red+green intensity would be 1 step higher than blue -> Some slight yellowish tint away from pure gray. It's a bit like bitstealing, where you increase "grayscale" resolution but pay with some slight colorization where there shouldn't be any. If this really matters in your case, or is perceptible at such small increments, or how it interacts with spatial dithering, i don't know.

> This may be worth trying. Last time, when I was using llvm-based software rendering, I had colorful pixels where there were supposed to be only grayscale pixels, and it was obvious enough that it could ruin an experiment. I'm not sure how salient the colorization would be in the hardware rendering case, though.

-> The "16 bit" mode, which if it works at all would actually be a dithered 12 bit mode, shouldn't have that problem, as all color channels are resolved at the same precision. As i said, the > 10 bpc modes are not tested under realistic conditions as i lacked the neccessary hardware, so this would the first realistic test to validate it. The "help PsychImaging" points you to a special xorg.conf file bundled with PTB that enables that mode, and it describes the conditions that need to be satisfied for that mode to work at all. Procedure is as usual -> copy xorg.conf file into place, logout/login, test. To undo, delete xorg.conf file, logout/login again.

I enabled the xorg.conf file, then logged out. Trying to log back in wasn't possible, since even lightdm couldn't start. From TTY1, I found that even xinit could not start. According to Xorg.0.log, radeon was refusing to use a 30 bpc frame buffer. I tried various values (8, 10, 24, 30, 32 etc.), and the only one with which X starts is 24. The kernel module radeon has deep_color enabled and a vm_block_size of 12, so I am not sure why it refuses a DefaultDepth of 30. Attaching Xorg.0.log right after a failed xinit, the xorg.conf file and the radeon module parameters.

--> I didn't look at your log, but it seems you used the wrong config file. You need this one, as described in the 16 bpc section of help PsychImaging: xorg.conf_For_AMD16bpcFramebuffer

--> The currently shipping AMD userspace driver doesn't support 10 bpc natively, although the kernel does. That's why all log output apart from PTB's will claim it is running at 8 bpc / 24 bpp. PTB does the 10/11/16 bpc setup itself, "behind the back" of the OS. What the OS does itself via deep_color support is set up the display engines to get > 8 bpc content out of the framebuffer and to the display, if that display supports it. I had patches for the userspace drivers for native 10 bpc support, but ran out of time around 2014 to upstream them, so they would need quite a bit of rework before being acceptable upstream, and i don't have time for it atm.

-> Given the unknown spectral color emission profile or other non-linearities of your LCD panel, or view angle dependence of the panel, or how well dithering interacts with the 10 bpc panel, or what dithering strategy the firmware of your Laptop would choose, and how much that strategy was optimized for the specific panel, and all the variability on the receiver side, the best you can do is try. Obviously you would start off verifying if 10 bpc works as expected.

I asked around in our department and wasn't able to find anyone with a colorimeter, so I may not be able to measure the per-channel increments on a custom-calibrated display. We do have a photometer, so I suppose I can measure with a factory-provided ICC profile.

--> Not sure what you mean with the icc profile here? You do need to measure whether you get the ~1024 luminance levels you'd expect with PTB after setting up 10 bpc mode with PsychImaging, and do gamma correction as needed via PsychColorCorrection()'s methods, as the hardware gamma tables are disabled while PTB's 10 bpc mode is active.

hyiltiz@HP-ZBook-17-G2 ~> octave --no-gui --eval "w=Screen('Openwindow',0); Screen('ConfigureDisplay', 'Dithering', 0, 1); sca;"
warning: function ./Speak.m shadows a core library function
...
PTB-INFO: SetDitherMode: Trying to enable digital display dithering on display head 0.
PTB-INFO: SetDitherMode: Dithering off. Enabling with userspace provided setting 1. Cross your fingers!
...

--> Interesting. According to this, the firmware completely and unconditionally disables dithering on your 10 bpc panel. So if this info is correct, the display would not show anything more than 10 bpc, even in 11 bpc or 16 bpc PTB modes, as those higher depths can only be simulated by dithering. So we'd need to program something else ourselves if you wanted to try simulated 12 bpc.

--> I suggest you reboot the machine, and first verify with a photometer that you do get 10 bpc with PTB from the laptop, before trying if 12 bpc dithered mode would work and have a benefit.


-> Btw. If your experiments involve text/letter stimuli, some slight trickery in your script will be needed to take advantage of the precision. All our current text renderers only operate at 8 bpc for text color when drawing into the framebuffer. That's a software limitation of our current text rendering plugin and a hard limitation of the older legacy text renderers.

Alas, sad to know. I looked around, and it seems no text renderer has 10 bpc support. So, I should draw the text, convert it to a binary image matrix, scale it by the desired contrast value (not using color) for the 10 bpc frame buffer, and then present the image with PsychImaging?

--> The text renderer plugin doesn't implement > 8 bpc atm. It would be possible to implement, but i'd predict 1-3 days of full-time work from myself to implement it properly. The legacy renderers can't do it at all on any OS.

--> You would render your text into a standard 8 bpc offscreen window, essentially as a text slide, with the offscreen window's background color set to [0 0 0 0]. Text color would be full-intensity white. So your offscreen window would become a standard 8 bpc texture with white text on a black background, and the alpha channel would encode text transparency. The 256 alpha levels would allow for the usual anti-aliasing.

Then, with the onscreen window set up for 10 bit mode and alpha blending properly configured, you could use 'DrawTexture' to draw the slide of text into the high-precision framebuffer, and use the 'globalAlpha' or 'modulateColor' arguments of Screen('DrawTexture', ...) to actually define the contrast or intensity of your text with > 8 bpc precision. It's a bit of an indirection, but it works because DrawTexture and alpha blending operate at maximum precision on the pixel data provided by your "pure 8 bpc" text slide.

-mario

~> 
octave --no-gui --eval "w=Screen('Openwindow',0); Screen('ConfigureDisplay', 'Dithering', 0, 1); sca;"
warning: function ./Speak.m shadows a core library function
warning: function /usr/share/octave/packages/plot-1.1.0/zoom.m shadows a core library function
warning: function /home/hyiltiz/Documents/Psychtoolbox/Quest/QuestTrials.m shadows a core library function


(Message over 64 KB, truncated)
Sorry for using the wrong xorg.conf file. I wanted to test the 10 bpc file first, as I understood that 16 bit frame buffers are more expensive and a somewhat hacky workaround. Anyway, that didn't work.

Enabled xorg.conf_For_AMD16bpcFramebuffer. X started normally. PTB3 said the OS won't support Native16Bit and that it will use a home-made hack. xdpyinfo (as expected) reports an 8-bit color depth.

Now, I think we are quite at the point of seeing whether we can get the 10 bit performance. I wrote the following minimal script to measure the luminance increments with a photometer. However, once I run the script (regardless of bit=8 or bit=10), the screen turns full green, and the color doesn't change when space is pressed. Pressing Escape returns to Octave. Afterwards the screen resolution of everything except the Unity Dash and the Unity Mir/menu bar is broken (e.g. Terminal, Chrome and the desktop render at something like 600x400), and the mouse doesn't click where it points (the mapping is wrong). Logging out and back in fixes it.

Do you have a minimal script that can test and measure a 10 bpc setup? I wrote the following script. Also dumping PTB3 output after running it.

clear all;
bit=8; % change to 10 to test 10bpc setup
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask','General','EnableNative16BitFramebuffer', true, bit); % dithering off, 10 bpc
w=PsychImaging('OpenWindow', 0);
Screen('LoadNormalizedGammaTable', w, linspace(0,1,2^bit)' * [1 1 1]); % a 10 bit Gamma table
Screen('Flip', w);
for i=1:2^bit
    Screen('FillRect', w, i); % increment using Gamma table index
    Screen('Flip', w);
    [~,keyCode] = KbPressWait;
    if strcmp(KbName(keyCode), 'Escape'); break;end
end
sca;

PTB-INFO: This is Psychtoolbox-3 for GNU/Linux X11, under GNU/Octave 64-Bit (Version 3.0.14 - Build date: Dec 22 2016).
PTB-INFO: Support status on this operating system release: Linux 4.4.0-57-generic Supported.
PTB-INFO: Type 'PsychtoolboxVersion' for more detailed version information.
PTB-INFO: Most parts of the Psychtoolbox distribution are licensed to you under terms of the MIT License, with
PTB-INFO: some restrictions. See file 'License.txt' in the Psychtoolbox root folder for the exact licensing conditions.

PTB-INFO: Advanced Micro Devices, Inc. [AMD/ATI] - Bonaire XT [Radeon R9 M280X] GPU found. Trying to establish low-level access...
PTB-INFO: Connected to Advanced Micro Devices, Inc. [AMD/ATI] Bonaire XT [Radeon R9 M280X] GPU with DCE-8.0 display engine [2 heads]. Beamposition timestamping enabled.
PTB-INFO: Trying to enable at least 16 bpc fixed point framebuffer.
PTB-INFO: Native 16 bit per color framebuffer requested, but the OS doesn't allow it. It only provides 8 bpc.
PTB-INFO: Will now try to use our own high bit depth setup code as an alternative approach to fullfill your needs.
PTB-INFO: Assuming kernel driver provided color resolution of the GPU framebuffer will be 16 bits per RGB color component.

PTB-WARNING: Flip for window 10 didn't use pageflipping for flip. Visual presentation timing and timestamps are likely unreliable!
PTB-WARNING: Something is misconfigured on your system, otherwise pageflipping would have been used by the graphics driver for reliable timing.
PTB-WARNING: Read the Linux specific section of 'help SyncTrouble' for some common causes and fixes for this problem.

...


XX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :

Sorry for using the wrong xorg.conf file. I wanted to test the 10 bpc file first, as I understood that 16 bit frame buffers are more expensive and a somewhat hacky workaround. Anyway, that didn't work.

Enabled xorg.conf_For_AMD16bpcFramebuffer. X started normally. PTB3 said the OS won't support Native16Bit and that it will use a home-made hack. xdpyinfo (as expected) reports an 8-bit color depth.

Now, I think we are quite at the point of seeing whether we can get the 10 bit performance. I wrote the following minimal script to measure the luminance increments with a photometer. However, once I run the script (regardless of bit=8 or bit=10), the screen turns full green, and the color doesn't change when space is pressed. Pressing Escape returns to Octave. Afterwards the screen resolution of everything except the Unity Dash and the Unity Mir/menu bar is broken (e.g. Terminal, Chrome and the desktop render at something like 600x400), and the mouse doesn't click where it points (the mapping is wrong). Logging out and back in fixes it.

-> The 16 bpc support is a gigantic hack with an unknown chance of actually working for real. I'll need to retest this when i'm on an AMD card in the lab, to see if it is still remotely functional on current distributions. I think i did the basic testing, which at least worked without obvious malfunctions, on Ubuntu 14.04, maybe with KDE, maybe with Unity, can't remember, it's been multiple years.

-> That's why i told you to *first test the basic 10 bpc support* with a photometer. There is absolutely no point in trying this advanced and super-experimental stuff before you know for sure that the basics, which should work, actually do.

-> Remove all xorg.conf files, you *don't* need them for 10 bpc or 11 bpc support. Reboot the machine, just to be safe the 16 bpc test didn't screw up anything. You don't need to send any log files etc. as there isn't anything new to be learned. Only the PTB output of your test script to verify basic setup.

-> Then you can test 10 bpc support, following the way it is done in AdditiveBlending... Tutorial et al.

Do you have a minimal script that can test and measure a 10 bpc setup? I wrote the following script. Also dumping PTB3 output after running it.

-> This script can't work; it isn't written like the demo scripts for 10 bpc etc. that we have.

-> Use the 'EnableNative10BitFramebuffer' task, with no extra arguments whatsoever, just the defaults.

-> As i said before, the hardware gamma tables are disabled during 10 bpc framebuffer mode, so 'LoadNormalizedGammatable' ideally will do nothing at all, worst case will screw up the display in some way. You would need to use the PsychColorCorrection functionality to use PTB's own color management and gamma correction etc., but that isn't needed for this basic photometer test.

-> In > 8 bpc modes we use a normalized 0-1 color range afaik, so your 'FillRect' command would need to run through i = linspace(0, 1, 1024) for 1024 increments in the 0.0 to 1.0 range.

-> What should also work, if you *don't* enable a Psychtoolbox 10 bpc framebuffer, is using 'LoadNormalizedGammatable' to twiddle gamma table values in 1/1024 increments: with the normal standard 8 bpc framebuffer, the 256-slot, 10-bit-wide hardware gamma tables are active, and they translate 8 bpc framebuffer values into 10 bpc display output values. We likely have some scripts for such measurements in the PsychCal folder, but that's ultimately not what you are interested in.

-> Even the 11 bpc mode would require us to manually enable and configure display dithering in the hope that we can override the decision of the laptop firmware to completely disable dithering. Something that may or may not work. But first you need to make sure 10 bpc mode works solid.
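Putting those corrections together, a rough, untested sketch of the measurement script could look like this. The screen number and the 'Escape' key name are carried over from the original script; treat this as an illustration of the advice above, not a verified demo.

```octave
% Untested sketch: 10 bpc photometer test following the corrections above.
PsychImaging('PrepareConfiguration');
% Use the 10 bpc task with defaults only; no extra arguments.
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
w = PsychImaging('OpenWindow', 0);
% No LoadNormalizedGammaTable here: the hardware gamma tables are disabled
% in 10 bpc mode; use PsychColorCorrection for gamma correction later.
for i = linspace(0, 1, 1024)        % normalized 0.0-1.0 color range
    Screen('FillRect', w, [i i i]); % one of 1024 gray increments
    Screen('Flip', w);
    [~, keyCode] = KbPressWait;
    if strcmp(KbName(keyCode), 'Escape'); break; end
end
sca;
```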

-mario

clear all;
bit=8; % change to 10 to test 10bpc setup
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask','General','EnableNative16BitFramebuffer', true, bit); % dithering off, 10 bpc
w=PsychImaging('OpenWindow', 0);
Screen('LoadNormalizedGammaTable', w, linspace(0,1,2^bit)' * [1 1 1]); % a 10 bit Gamma table
Screen('Flip', w);
for i=1:2^bit
    Screen('FillRect', w, i); % increment using Gamma table index
    Screen('Flip', w);
    [~,keyCode] = KbPressWait;
    if strcmp(KbName(keyCode), 'Escape'); break;end
end
sca;

PTB-INFO: This is Psychtoolbox-3 for GNU/Linux X11, under GNU/Octave 64-Bit (Version 3.0.14 - Build date: Dec 22 2016).
PTB-INFO: Support status on this operating system release: Linux 4.4.0-57-generic Supported.
PTB-INFO: Type 'PsychtoolboxVersion' for more detailed version information.
PTB-INFO: Most parts of the Psychtoolbox distribution are licensed to you under terms of the MIT License, with
PTB-INFO: some restrictions. See file 'License.txt' in the Psychtoolbox root folder for the exact licensing conditions.

PTB-INFO: Advanced Micro Devices, Inc. [AMD/ATI] - Bonaire XT [Radeon R9 M280X] GPU found. Trying to establish low-level access...
PTB-INFO: Connected to Advanced Micro Devices, Inc. [AMD/ATI] Bonaire XT [Radeon R9 M280X] GPU with DCE-8.0 display engine [2 heads]. Beamposition timestamping enabled.
PTB-INFO: Trying to enable at least 16 bpc fixed point framebuffer.
PTB-INFO: Native 16 bit per color framebuffer requested, but the OS doesn't allow it. It only provides 8 bpc.
PTB-INFO: Will now try to use our own high bit depth setup code as an alternative approach to fullfill your needs.
PTB-INFO: Assuming kernel driver provided color resolution of the GPU framebuffer will be 16 bits per RGB color component.

PTB-WARNING: Flip for window 10 didn't use pageflipping for flip. Visual presentation timing and timestamps are likely unreliable!
PTB-WARNING: Something is misconfigured on your system, otherwise pageflipping would have been used by the graphics driver for reliable timing.
PTB-WARNING: Read the Linux specific section of 'help SyncTrouble' for some common causes and fixes for this problem.

...


Ping? Does this work in 10 bpc now?
-mario

---In PSYCHTOOLBOX@yahoogroups.com, <mario.kleiner@...> wrote :

XX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :

Sorry for using the wrong xorg.conf file. I was trying to first test the 10bpc file, as I understood that 16 bit frame buffers are more expensive and a bit hacky workaround. Anyway, that didn't work.

Enabled xorg.conf_For_AMD16bpcFramebuffer. X started normally. PTB3 said it OS won't support Native16Bit and will use home made hack. xdpyinfo (as expected) reports an 8 bits color depth.

Now, I think we are quite at the point to see whether we can get the 10 bit performance. I wrote the following minimal script that I tried to use to measure the luminance increase with a photometer. However, once I run the script (regardless of bit=8 or bit=10), the screen turns to full green, and this color won't change if space is pressed. Pressing escape returns to Octave. The screen resolution except Unity Dash and Unity Mir/Menubar is screwed (e.g. Terminal, Chrome, Desktop etc. has something like 600x400 resolution), and mouse won't click on where it points (mapping is wrong). Logging out then back in will fix it. 

-> The 16bpc support is a gigantic hack with unknown chance of actually working for real. I'll need to retest this when i'm on a AMD card in the lab, to see if it is still remotely functional on current distributions. I think i did the basic testing, which at least worked without obvious malfunctions, on Ubuntu 14.04, maybe with KDE, maybe with Unity, can't remember, it's been multiple years.

-> That's why i told you to *first test the basic 10 bpc support* with a photometer. There is absolutely no point in trying this advanced and super-experimental stuff before you know for sure that the basics work which should work.

-> Remove all xorg.conf files, you *don't* need them for 10 bpc or 11 bpc support. Reboot the machine, just to be safe the 16 bpc test didn't screw up anything. You don't need to send any log files etc. as there isn't anything new to be learned. Only the PTB output of your test script to verify basic setup.

-> Then you can test 10 bpc support, following the way it is done in AdditiveBlending... Tutorial et al.

Do you have a minimal script that can test and measure a 10 bpc setup? I wrote the following script. Also dumping PTB3 output after running it.

-> This script can't work, it isn't written like the demo scripts for 10 bpc etc. that we have.

-> Use the 'EnableNative10BitFramebuffer' task, with no extra arguments whatsoever, just the defaults.

-> As i said before, the hardware gamma tables are disabled during 10 bpc framebuffer mode, so 'LoadNormalizedGammatable' ideally will do nothing at all, worst case will screw up the display in some way. You would need to use the PsychColorCorrection functionality to use PTB's own color management and gamma correction etc., but that isn't needed for this basic photometer test.

-> in > 8 bpc modes we use a normalized 0-1 color range afaik, so your 'FillRect' command would need to run through a range of i = linspace(0,1, 1024) for 1024 increments in the 0.0 to 1.0 range.

-> What should also work if you *don't* enable a Psychtoolbox 10 bpc framebuffer, is using 'LoadNormalizedGammatable' to twiddle gamma table values in 1/1024 increments, as with the normal standard 8 bpc framebuffer the 256-slots 10 bit wide hardware gamma tables will be active and they would translate from 8 bpc framebuffer values into 10 bpc display output values. We likely have some scripts for such measurements as part of the PsychCal folder, but that's ultimately not what you are interested in.

-> Even the 11 bpc mode would require us to manually enable and configure display dithering in the hope that we can override the decision of the laptop firmware to completely disable dithering. Something that may or may not work. But first you need to make sure 10 bpc mode works solid.

-mario

clear all;
bit=8; % change to 10 to test 10bpc setup
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask','General','EnableNative16BitFramebuffer', true, bit); % dithering off, 10 bpc
w=PsychImaging('OpenWindow', 0);
Screen('LoadNormalizedGammaTable', w, linspace(0,1,2^bit)' * [1 1 1]); % a 10 bit Gamma table
Screen('Flip', w);
for i=1:2^bit
    Screen('FillRect', w, i); % increment using Gamma table index
    Screen('Flip', w);
    [~,keyCode] = KbPressWait;
    if strcmp(KbName(keyCode), 'Escape'); break;end
end
sca;
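
For reference, a minimal corrected sketch along the lines of the inline comments above (untested; it assumes the defaults of the 'EnableNative10BitFramebuffer' task and the normalized 0-1 color range mentioned above, with no gamma table manipulation):

```matlab
% Hedged sketch per the corrections above: default 10 bpc task,
% normalized 0.0-1.0 colors, no LoadNormalizedGammaTable call.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'EnableNative10BitFramebuffer');
w = PsychImaging('OpenWindow', 0);
levels = linspace(0, 1, 1024);           % 1024 increments across 0.0-1.0
for i = 1:numel(levels)
    Screen('FillRect', w, levels(i));    % uniform gray patch at this level
    Screen('Flip', w);
    [~, keyCode] = KbPressWait;          % key press advances; Escape aborts
    if strcmp(KbName(keyCode), 'Escape'); break; end
end
sca;
```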

PTB-INFO: This is Psychtoolbox-3 for GNU/Linux X11, under GNU/Octave 64-Bit (Version 3.0.14 - Build date: Dec 22 2016).
PTB-INFO: Support status on this operating system release: Linux 4.4.0-57-generic Supported.
PTB-INFO: Type 'PsychtoolboxVersion' for more detailed version information.
PTB-INFO: Most parts of the Psychtoolbox distribution are licensed to you under terms of the MIT License, with
PTB-INFO: some restrictions. See file 'License.txt' in the Psychtoolbox root folder for the exact licensing conditions.

PTB-INFO: Advanced Micro Devices, Inc. [AMD/ATI] - Bonaire XT [Radeon R9 M280X] GPU found. Trying to establish low-level access...
PTB-INFO: Connected to Advanced Micro Devices, Inc. [AMD/ATI] Bonaire XT [Radeon R9 M280X] GPU with DCE-8.0 display engine [2 heads]. Beamposition timestamping enabled.
PTB-INFO: Trying to enable at least 16 bpc fixed point framebuffer.
PTB-INFO: Native 16 bit per color framebuffer requested, but the OS doesn't allow it. It only provides 8 bpc.
PTB-INFO: Will now try to use our own high bit depth setup code as an alternative approach to fullfill your needs.
PTB-INFO: Assuming kernel driver provided color resolution of the GPU framebuffer will be 16 bits per RGB color component.

PTB-WARNING: Flip for window 10 didn't use pageflipping for flip. Visual presentation timing and timestamps are likely unreliable!
PTB-WARNING: Something is misconfigured on your system, otherwise pageflipping would have been used by the graphics driver for reliable timing.
PTB-WARNING: Read the Linux specific section of 'help SyncTrouble' for some common causes and fixes for this problem.

...


I created the following minimal script (attached) to test the 10 bpc performance with a photometer. I didn't try the whole 2^10 range, but I tried roughly the first 100 "color" levels. The change in gray level measured by the photometer is strong enough to also be detected with the naked eye when viewed peripherally. One in every 4 levels did NOT produce any change in gray level, but all the rest did.

Although one level in every four didn't produce a measurable change in luminance, the others did. If we were getting only 8 bpc performance, just one in every four levels should have produced a measurable change, with the other three producing none. So it seems to be working, though in a somewhat odd way, no?

I am sorry for not being able to respond in a timely manner. MATLAB stopped starting without any output/warning/error, and I just finished fixing it. Mathworks and Ubuntu forum posts pointed to standard C++ library dynamic loading issues. It turned out that Mathworks had renewed their license file and decided not to provide any output whatsoever.
thanks hormet



Denis Pelli
Professor of Psychology & Neural Science, New York University
+1-646-258-7524 | denis.pelli@... | http://psych.nyu.edu/pelli/ | Skype: denispelli | http://denispelli.com

On Wed, Feb 8, 2017 at 8:51 PM, hormet.yiltiz@... [PSYCHTOOLBOX] <PSYCHTOOLBOX@yahoogroups.com> wrote:


I adjusted the script a bit so it can display either a uniform color/gray patch, or a gradient consisting of long rectangles, each 10 pixels wide.

Using the gradient, it seems that as we approach the mid-gray level, the steps become more uniform. I noticed that the 10 bpc code only works (i.e. I get around 2^10 grays instead of around 2^8) when I explicitly set a gamma value:
    PsychColorCorrection('SetEncodingGamma', w, 1 / 2.0);

I found that this gamma value can affect the previous "3 levels out of 4 steps" effect; setting it around 1/0.8 gives far fewer gray levels (fewer than ~2^8), while 1/2.0 gives "3 levels of colors out of 4 steps". So I set it to 1/2.4, which produced one distinct gray level for each step. However, this 1/2.4 value is quite arbitrary. Is there any way to systematically optimize it?
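
One crude way to narrow it down (my own hypothetical sketch, not anything shipped with PTB; it assumes the window w and the gradient-drawing code from the attached script) would be to redisplay the gradient under several candidate gammas and count distinct photometer steps for each:

```matlab
% Hypothetical gamma sweep: show the same gradient under each candidate
% encoding gamma and note with the photometer how many adjacent steps
% produce distinct luminances. The candidate values are assumptions.
for g = [1/1.8, 1/2.0, 1/2.2, 1/2.4]
    PsychColorCorrection('SetEncodingGamma', w, g);
    % ...redraw the gradient and Screen('Flip', w) here, then take readings...
    KbPressWait;   % key press advances to the next candidate gamma
end
```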

Now, about 11 bpc. The display was strongly tinted red, cyan, etc. (the code and a photo taken with my phone are attached). Below is the output:


PTB-INFO: This is Psychtoolbox-3 for GNU/Linux X11, under GNU/Octave 64-Bit (Version 3.0.14 - Build date: Dec 22 2016).
PTB-INFO: Support status on this operating system release: Linux 4.4.0-62-generic Supported.
PTB-INFO: Type 'PsychtoolboxVersion' for more detailed version information.
PTB-INFO: Most parts of the Psychtoolbox distribution are licensed to you under terms of the MIT License, with
PTB-INFO: some restrictions. See file 'License.txt' in the Psychtoolbox root folder for the exact licensing conditions.

PTB-INFO: Advanced Micro Devices, Inc. [AMD/ATI] - Bonaire XT [Radeon R9 M280X] GPU found. Trying to establish low-level access...
PTB-INFO: Connected to Advanced Micro Devices, Inc. [AMD/ATI] Bonaire XT [Radeon R9 M280X] GPU with DCE-8.0 display engine [2 heads]. Beamposition timestamping enabled.
PTB-INFO: Trying to enable at least 11 bpc fixed point framebuffer.
PTB-INFO: Native 11 bit per color framebuffer requested, but the OS doesn't allow it. It only provides 8 bpc.
PTB-INFO: Will now try to use our own high bit depth setup code as an alternative approach to fullfill your needs.
PTB-INFO: Assuming kernel driver provided color resolution of the GPU framebuffer will be 11 bits per RGB color component.


PTB-INFO: OpenGL-Renderer is X.Org :: Gallium 0.4 on AMD BONAIRE (DRM 2.43.0 / 4.4.0-62-generic, LLVM 3.8.0) :: 3.0 Mesa 12.0.6
PTB-INFO: VBL startline = 1080 , VBL Endline = 1109
PTB-INFO: Measured monitor refresh interval from beamposition = 16.660121 ms [60.023574 Hz].
PTB-INFO: Will try to use OS-Builtin OpenML sync control support for accurate Flip timestamping.
PTB-INFO: Measured monitor refresh interval from VBLsync = 16.660151 ms [60.023464 Hz]. (50 valid samples taken, stddev=0.000557 ms.)
PTB-INFO: Reported monitor refresh interval from operating system = 16.661390 ms [60.019001 Hz].
PTB-INFO: Small deviations between reported values are normal and no reason to worry.
PTB-INFO: System framebuffer switched to BGR101111 mode for screen 0 [head 0].
PTB-INFO: Psychtoolbox imaging pipeline starting up for window with requested imagingmode 1061 ...
PTB-INFO: Will use 32 bits per color component floating point framebuffer for stimulus drawing. Alpha blending should work correctly.
PTB-INFO: Will use 32 bits per color component floating point framebuffer for stimulus post-processing (if any).
Building a fragment shader:Reading shader from file /usr/local/share/Psychtoolbox/PsychOpenGL/PsychGLSLShaders/ICMSimpleGammaCorrectionShader.frag.txt ...
Compiling all shaders matching RGBMultiLUTLookupCombine_FormattingShader * into a GLSL program.
Building a fragment shader:Reading shader from file /usr/local/share/Psychtoolbox/PsychOpenGL/PsychGLSLShaders/RGBMultiLUTLookupCombine_FormattingShader.frag.txt ...
LoadIdentityClut: Info: Used GPU low-level setup code to configure (hopefully) perfect identity pixel passthrough.
PTB-INFO: SetDitherMode: Trying to enable digital display dithering on display head 0.
PTB-INFO: SetDitherMode: Setting dithering mode to userspace provided setting f100. Cross your fingers!
PTB-INFO: System framebuffer switched to standard ARGB8888 mode for screen 0 [head 0].



I tried the 10bpc, 11bpc via dithering, 12bpc using 3 different xorg.conf file.

Definition of "worked": running the script in the requested bits-per-channel mode displayed a visible/measurable grayscale change at each step (10 pixels) horizontally.
1. Using no xorg.conf file: 
10bpc mode: worked
11bpc mode: didn't work; seems to fallback to 8bpc mode
12bpc mode: didn't test

2. Using xorg.conf_For_10bpcDepth30Framebuffer (ships with PTB-3) as the xorg.conf file (see contents below):
The display manager (lightdm) crashed and couldn't start, so I can't test.

root@ZBook /e/X/xorg.conf.d# cat xorg.conf_For_10bpcDepth30Framebuffer 
# xorg.conf for enabling a depth 30 bit, 10 bpc native
# framebuffer on Linux + X11 on any graphics card and
# driver which supports 10 bpc display without special
# Psychtoolbox hacks.
#
# Just copy this file to /etc/X11/xorg.conf
# as sudo root, then logout - login again to make
# the changes effective.
#

Section "Screen"
    Identifier      "Screen0"
    DefaultDepth    30
EndSection

3. Using xorg.conf_For_AMD16bpcFramebuffer (ships with PTB-3) as xorg.conf:
1. 10bpc mode: worked
2. 11bpc mode: worked (I realized that the previous 10/11bpc tests I was running were using this xorg.conf)
3. 12bpc mode: the PTB screen turned fully saturated green, and pressing space (which should have moved the color gradients) did nothing. Pressing ESC to abort worked (so it didn't crash). However, it left the terminal (Octave window) at a resolution much smaller than native (something like 800x600). Running `xrandr --output eDP --mode 1920x1080` fixed this.

I found no observable difference in desktop applications when using this xorg.conf file.

# xorg.conf file for configuring an AMD graphics card
# to allow for up to 16 bpc, 64 bpp video display.
# For use on Linux + X11 + Open-Source AMD graphics driver with:
# PsychImaging('AddTask','General','EnableNative16BitFramebuffer')
#
# Copy this file to /etc/X11/xorg.conf and logout-login again
# to prepare the system for this mode of operation.
#

Section "Device"
  Identifier  "AMD16bpcFramebuffer"
  Driver      "radeon"
  Option      "ColorTiling"    "off"
  Option      "ColorTiling2D"  "off"
EndSection


For my study, 10bpc is sufficient, but 11bpc is even better. I will post the photometer measurements soon once I have them.



XX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :

I tried the 10bpc, 11bpc via dithering, 12bpc using 3 different xorg.conf file.

Definition of "worked": running the script in the requested bits-per-channel mode displayed a visible/measurable grayscale change at each step (10 pixels) horizontally.
1. Using no xorg.conf file: 
10bpc mode: worked
11bpc mode: didn't work; seems to fallback to 8bpc mode
12bpc mode: didn't test

-> 11 bpc mode should also work? Can't think of a reason it wouldn't, assuming you applied those 'ConfigureDisplays' magic dither settings.
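
For anyone following along, the dither override presumably amounts to something like the following (hedged: I'm inferring the subcommand from the SetDitherMode lines in the log dump above, and the f100 value is taken from that log as an example only, not a recommendation):

```matlab
% Assumed dither override via Screen('ConfigureDisplay', 'Dithering', ...):
% request a driver-specific dither setting on screen 0. The value 0xf100
% appears in the SetDitherMode log output earlier in this thread.
Screen('ConfigureDisplay', 'Dithering', 0, hex2dec('f100'));
```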

2. Using xorg.conf_For_10bpcDepth30Framebuffer (ships with PTB-3) as the xorg.conf file (see contents below):
The display manager (lightdm) crashed and couldn't start, so I can't test.

-> This one was pointless, as i told you before. That file is not meant at all for use by you. It is only for some development and testing i do with prototype drivers, so complete failure of the GUI is expected. That's why i told you which exact file to use.

3. Using xorg.conf_For_AMD16bpcFramebuffer (ships with PTB-3) as xorg.conf:
1. 10bpc mode: worked
2. 11bpc mode: worked (I realized that the previous 10/11bpc tests I was running were using this xorg.conf)
3. 12bpc mode: the PTB screen turned fully saturated green, and pressing space (which should have moved the color gradients) did nothing. Pressing ESC to abort worked (so it didn't crash). However, it left the terminal (Octave window) at a resolution much smaller than native (something like 800x600). Running `xrandr --output eDP --mode 1920x1080` fixed this.

I found no observable difference in desktop applications when using this xorg.conf file.

-> So it doesn't work with either KDE or (surprisingly) Unity desktop GUI. Can you try with GNOME-3? I think installing the gnome-desktop package will give you that as a choice at login. It "worked" for me with GNOME-3, showing a valid picture, although i can't test the true precision with my too old 10 bit only hardware and 8 bpc only display.

-> Slowdowns are probably not affecting regular desktop apps. Complex stimuli under PTB or video games may push it enough to skip frames under that xorg.conf. "ColorTiling" is a performance optimization on modern GPUs, but it has to be disabled for the 12 bpc hack to work.

-> The hack involves setting the size of the virtual desktop to twice the width and height of the true display resolution, so your monitor only displays the top-left quarter of the desktop. PTB then reprograms the display hardware to use the full area, so PTB stimuli should look correct. Some desktop environments don't like this resizing of the desktop, so one gets all kinds of weird effects. You can also use SetResolution(0, 1920, 1080) to reset the screen 0 desktop size at the end of a script, iirc.
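
In script form, that cleanup step would be something like this (assuming screen 0 and the 1920x1080 native mode from the xrandr command quoted earlier in the thread):

```matlab
% Restore the native desktop size at the end of a script, after a
% 12 bpc session that doubled the virtual desktop to 3840x2160.
SetResolution(0, 1920, 1080);
```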

# xorg.conf file for configuring an AMD graphics card
# to allow for up to 16 bpc, 64 bpp video display.
# For use on Linux + X11 + Open-Source AMD graphics driver with:
# PsychImaging('AddTask','General','EnableNative16BitFramebuffer')
#
# Copy this file to /etc/X11/xorg.conf and logout-login again
# to prepare the system for this mode of operation.
#

Section "Device"
  Identifier  "AMD16bpcFramebuffer"
  Driver      "radeon"
  Option      "ColorTiling"    "off"
  Option      "ColorTiling2D"  "off"
EndSection


For my study, 10bpc is sufficient, but 11bpc is even better. I will post the photometer measurements soon once I have them.

-> 12 bpc would be even better and give identical resolution in all color channels. The 11 bpc mode will resolve blue values at 10 bpc only. So you will effectively get 11 bpc luminance resolution, but at every 2nd step (because red and green increase by 1 unit, but blue doesn't - it only increases every 2nd step) you'd get a slightly higher red+green than blue --> A small tint of yellow'ish instead of pure grey. I don't know if that would be perceptible, especially at higher luminance levels like around 50% grey, etc., but something to keep in mind. The 12 bpc mode gives true 12 bpc in all channels so doesn't have that problem if it is one.

-mario

Forgot to attach the script, though it barely changed.
12 bpc setup worked after explicitly setting bpc to 16 in EnableNative16BitFramebuffer, on both GNOME-3 and GNOME-3 Classic. What was its default value?
Cool, so you get a proper picture now. 12 bpc was the default setting. On my old hw i had to use 10 bpc, the true output width of the display engine, so i assumed 12 bpc would be the right one for your 12 bit engine. Apparently something in the hardware design has changed and it now always wants the data formatted as 16 bpc and cuts off the least significant 4 bits (16 - 12 = 4) itself. So you probably had a picture, just a very dark one. Grab the fixed PsychImaging.m file from here:

https://raw.githubusercontent.com/kleinerm/Psychtoolbox-3/master/Psychtoolbox/PsychGLImageProcessing/PsychImaging.m

Now you just need to verify whether you really get the 12 bpc ~1/4096th luminance step size, or at least somewhat more than the max 2048 steps of 11 bpc. Hopefully the dithering can fake that; dithering usually gives 2 bpc of extra simulated precision. And hopefully no other dithering-induced artifacts...

Exciting!
-mario
The new file cannot find a global var, I think: kPsychNeedTripleWidthWindow.


PTB-INFO: Advanced Micro Devices, Inc. [AMD/ATI] - Bonaire XT [Radeon R9 M280X] GPU found. Trying to establish low-level access...
PTB-INFO: Connected to Advanced Micro Devices, Inc. [AMD/ATI] Bonaire XT [Radeon R9 M280X] GPU with DCE-8.0 display engine [2 heads]. Beamposition timestamping enabled.


PTB-INFO: OpenGL-Renderer is X.Org :: Gallium 0.4 on AMD BONAIRE (DRM 2.43.0 / 4.4.0-64-generic, LLVM 3.8.0) :: 3.0 Mesa 12.0.6
PTB-INFO: VBL startline = 1080 , VBL Endline = 1109
PTB-INFO: Measured monitor refresh interval from beamposition = 16.660997 ms [60.020418 Hz].
PTB-INFO: Will try to use OS-Builtin OpenML sync control support for accurate Flip timestamping.
PTB-INFO: Measured monitor refresh interval from VBLsync = 16.661024 ms [60.020320 Hz]. (50 valid samples taken, stddev=0.000534 ms.)
PTB-INFO: Reported monitor refresh interval from operating system = 16.661390 ms [60.019001 Hz].
PTB-INFO: Small deviations between reported values are normal and no reason to worry.
PTB-INFO: Psychtoolbox imaging pipeline starting up for window with requested imagingmode 1029 ...
PTB-INFO: Will use 8 bits per color component framebuffer for stimulus drawing.
PTB-INFO: Will use 8 bits per color component framebuffer for stimulus post-processing (if any).
'kPsychNeedTripleWidthWindow' undefined near line 4991 column 35
error: called from 'PsychImaging>InterBufferRect' in file /home/hyiltiz/Public/NoiseDiscrimination/lib/PsychImaging.m near line 4991, column 5

Yep. Get it from the Psychtoolbox/PsychGLImageProcessing/kPsychNeedTripleWidthWindow.m file on my GitHub master branch.

-mario
XX---In PSYCHTOOLBOX@yahoogroups.com, <hormet.yiltiz@...> wrote :

The PTB-3 window opens with a full green background and a small region with my gray-level gradient at the top left, sized a quarter of the screen. The gradient looks OK. I used the same script for the test, along with the two files provided.

-> So that problem again? My fix to PsychImaging can't cause that - it only changed that default from 12 to 16. I assume this is under GNOME-3, where it worked fine before? Can you retry, also after a "clear all" or even after restarting octave? What does Screen('Rect', w, 1) report as size if it goes wrong? A correct reading should be [0 0 3840 2160] and incorrect one [0 0 1920 1080]. I wonder if there is some race-condition somewhere in the setup code, where PTB changes the screen size to twice the display size (3840 x 2160), but then that info about the change of desktop size doesn't get transmitted back to the display code in time and it still reports an outdated 1920x1080. This would be a timing-sensitive effect, which would cause the thing to randomly work correct or not. Also something that might get modulated by the execution timing of the desktop environment in use.
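
As a quick check, the diagnostic suggested above boils down to this snippet (assuming the onscreen window handle w from the test script):

```matlab
% Query the real backing framebuffer rect of the onscreen window (the
% trailing 1 requests the true framebuffer size). Per the note above,
% [0 0 3840 2160] is correct for the doubled-desktop hack, while
% [0 0 1920 1080] indicates the suspected race condition.
rect = Screen('Rect', w, 1)
```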

-> Btw. what two files? There are no files in your message.
-mario