NVidia PerfHud ES, Take 2 (Linux)

April 12th, 2014

PerfHud ES is the best OpenGL ES debugger I’ve used, but it can be a bit tricky to set up. And now that I’m on Linux, even more so.

NVidia’s Page – Troubleshooting Notes on Forum #1

The latest PerfHud ES is now part of the Tegra Android Development Pack (2.2 as of this writing); an older version (2.1) is still available standalone. The pack itself is actually a download manager, so it's a lightweight download if you uncheck the features you don't need.

To get it working, I had to do the following:

1. Add adb to my path, by adding it to ~/.profile

Something like:
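    # Appended to ~/.profile. The SDK path is illustrative; point it at your install.
    export PATH="$PATH:$HOME/android-sdk-linux/platform-tools"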

Which just happens to be where my Android SDK is, and adb is found in the platform-tools folder.

In context of my whole ~/.profile file.
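(Roughly; everything above the last line is just the stock Ubuntu ~/.profile.)

    # ~/.profile: executed by the command interpreter for login shells.

    # if running bash
    if [ -n "$BASH_VERSION" ]; then
        # include .bashrc if it exists
        if [ -f "$HOME/.bashrc" ]; then
            . "$HOME/.bashrc"
        fi
    fi

    # set PATH so it includes user's private bin if it exists
    if [ -d "$HOME/bin" ] ; then
        PATH="$HOME/bin:$PATH"
    fi

    export PATH="$PATH:$HOME/android-sdk-linux/platform-tools"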

Again, that last line is all I added.

2. Log out of Ubuntu, log back in.

You should now be able to run 'adb devices' from a terminal to see all connected Android devices.
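Something like this (the serial number is illustrative):

    $ adb devices
    List of devices attached
    0123456789ABCDEF    device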

3. Connect an NVidia device, and do the following to enable PerfHUD.
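If I remember right, it's a single system property; double-check the exact property name against NVidia's page:

    adb shell setprop debug.perfhudes 1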

This needs to be done every time the device reboots.

4. Start the app you want to debug (Mine is an SDL App). **IMPORTANT** DO THIS FIRST!

5. Start PerfHUD ES. **NOTE** DO THIS SECOND!

The order of these 2 steps matters. I was not able to get PerfHUD ES to connect to my NVidia Shield unless I started the app first. And afterwards, if I shut down the app, I need to exit PerfHUD ES and repeat both steps to reconnect.

6. Connect to the running session. You’ll first see a message in the lower right corner of PerfHUD ES that you are not connected. Click that.


If everything worked correctly, you’ll now be able to click on the currently running instance of the app.


Click that, and click connect. Moments later the profiling should begin.


Click over to the Frame Debugger. That's where all the fun stuff is. Scrub through each draw command; view geometry, textures, and shaders; and so on.

Code Changes?

I'm not sure if this is still required or not (probably). If the frame profiling stuff is acting up, you may need to add the following snippet to your code.

The 'USES_EGL' is a #define of mine. Feel free to omit it. You're going to need the "egl.h" header though.

Hosted on Ubuntu 14.04 (prerelease), with an Intel HD 3000 GPU. No NVidia card (or OpenCL) required. :)

Linux Setup Notes, Part 2

January 28th, 2014

I've been using Linux for a month now, and all the while I've been adding notes to this post over here:

http://www.toonormal.com/2013/12/29/linux-setup-notes/

Well, that post is extremely long now, and I ended up having to reinstall anyway (a video driver fight), so it's time for a fresh post!

I had some complaints about Ubuntu 13.10, so I decided to go ahead and install the Alpha version of Ubuntu 14.04 (2 months before release).

So far:
- Nautilus (File browser) is now fixed! I can type stuff, and it jumps to files instead of invoking a tree search! Huzzah!
- It has Mesa 10.0.1 with OpenGL 3 drivers stock! And they work!

The way things are going, I think Ubuntu 14.04 LTS is going to be a really good long term update to Ubuntu. Out of the box, these annoyances were solved for me (video driver tomfoolery is what ruined my last install. No more!).

So far, no regrets.

Fun new keys and commands to remember

ProTip: If using Ubuntu, and it says Dpkg, just don’t do it!

It's so easy to break a Linux install. Apt and Dpkg (Synaptic) are package managers available to you. Both can be run, but these 2 managers don't communicate with each other at all, so if you're not careful you can break stuff (like I did).

So if you’re using Ubuntu, prefer:
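    sudo apt-get install <package-name>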

For all your installation needs.

Or alternatively, make sure to NEVER use Dpkg on “drivers”. Let Apt manage this stuff, and you will save yourself a world of pain.

Backups and Restoring

Be sure to copy your entire user folder, especially the hidden files at its root! Re-setting up applications becomes extremely simple, as you can just copy the .hidden folders back into your new home folder (~).

I got scared that I would have to waste an UltraEdit activation (ugh DRM), but nope, all I had to do was copy my .idm folder. Huzzah!

Need Flash? Use Chrome

Download it directly from Google.

I've been a loyal Firefox user for a very long time. That said, as someone involved in games, Flash content is still kind of important to me. There was a debacle some time back, with Mozilla refusing to support Chrome's PPAPI: basically a new sandboxed plugin API, replacing the legacy NPAPI used by all the other browsers. PPAPI has another advantage in that it's not only sandboxed from the browser, but sandboxed from the OS. It's the foundation for Native Client (NaCl).

Adobe declared that it will not support Linux anymore; the last version of Flash for Linux is 11.2. They just didn't want to maintain the old NPAPI version on Linux. Curiously though, to better support the Chrome browser, Adobe makes a PPAPI port of Flash.

Recently, Google announced that they will be discontinuing NPAPI support in Chrome. To a lot of folks, this sounded like “no more Flash on Linux”, but most people seemed to have missed the memo that Flash supports PPAPI.

And since PPAPI is really only supported by Chrome, PPAPI Flash comes bundled with Chrome. Easy.

As of this writing, the Flash bundled in Chrome is version 12, the same as on Windows.

So really, the hardest part about Flash on Linux is switching away from Firefox.

VMware Player on Ubuntu 14.04 Alpha

The latest Ubuntu has a brand new version of the Linux kernel, which broke the network bridge driver code that VMware Player 6.0.1 ships with. The solution, a patch, can be found here:

http://dandar3.blogspot.ca/2014/01/vmware-player-601-on-ubuntu-1404-alpha.html

Remapping keys on Ubuntu 14.04

Well it looks like the keyboard symbol files moved in the latest Ubuntu. Now they’re under /usr/share/X11/xkb/symbols/. Below is the modified snippet from my original post.

The Lenovo X220 has Web Forward/Back keys beside the arrow keys. I prefer that they act like alternative PageUp and PageDown keys.

Open up /usr/share/X11/xkb/symbols/inet (i.e. sudo gedit /usr/share/X11/xkb/symbols/inet)

Find a key named <I166>. Change "XF86Back" to "Prior".

Find a key named <I167>. Change "XF86Forward" to "Next".

Delete the *.xkm files in /var/lib/xkb/. You need to do this to force a keyboard code refresh.
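i.e.:

    sudo rm /var/lib/xkb/*.xkm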

Logout, then Login to refresh. (Or reboot)

Power user GUI config tools

Install CompizConfig Settings Manager.

Install Unity Tweak Tool.

TODO: figure out the name of the tool I installed that made the taskbar work again. No, it wasn’t dconf-tools / dconf-editor.

Setting default file Associations to any program

Source.

Emit Keypresses (when being clever)

Use XDoTool.
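For example, faking a PageDown keypress from a script (assuming the xdotool package is installed):

    xdotool key Page_Down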

On Windows, use nircmd.

Notable Patches

Minimize on Click (out of date, but a simple edit)
Disable Middle Button Paste
Taskbar Whitelisting. *sigh*… or a fix for Xchat.

FUIbuntu/FUXbuntu

TODO: Start a repos/ppa that undoes/fixes some of the annoying UI "fixes" Mr Shuttleworth has introduced into Ubuntu and Unity. The F stands for what you think it stands for, the UI is to polite'n up the U, and to be specific that the tweaks are UI related. Alternatively, FUXbuntu if feeling angry.

I want to still use a stock Ubuntu, but holy hell there are some frustrating UI decisions in Ubuntu.

UPDATE: Ubuntu 14.04 does fix the file browser.

SOURCE YOUR SHELL SCRIPTS!

Very important thing I *just* learned: Source!

If you write a shell script that (for example) sets environment variables, it will set the variables for any commands run inside that script, but they will not propagate outside of the script and into the shell. The solution is to run the script with the shell's source builtin, like so:
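    source ./myscript.sh    # script name illustrative; its exports now stick in your shell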

or
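    . ./myscript.sh         # '.' is the portable spelling of source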

Startup and Environment Variables

To run things on startup, add them to ~/.bashrc or ~/.profile.

Simply log-out and log back in again for the changes to take effect.

More details.

C++ typeof vs decltype

January 25th, 2014

GCC and compatible (Clang) compilers implement a feature known as typeof (or __typeof__).

http://gcc.gnu.org/onlinedocs/gcc/Typeof.html

This is a pre-C++11 feature, omitted from the standard, and unavailable in Visual C++.

For the most part it does what you’d expect. Given a variable, it returns the type. This lets you create another instance of a type without having to use its full name. This is helpful if you happen to be using templates.
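A minimal sketch of the usual usage (GCC/Clang only):

    #include <vector>

    int main() {
        std::vector<int> Numbers;
        typeof(Numbers) MoreNumbers;    // another std::vector<int>, without the full name
        MoreNumbers.push_back(10);
        return 0;
    }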

Regrettably, typeof seems to be somewhat flawed.

Generally speaking, there is no way using typeof to read type members (typedefs, structs, unions, classes).
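A sketch of the sort of thing that fails (at least on the older GCCs I mean here):

    #include <vector>

    int main() {
        std::vector<int> Numbers;
        // Reaching the member typedef 'iterator' through typeof fails to compile
        typeof(Numbers)::iterator It = Numbers.begin();
        return 0;
    }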

As of C++11, a new keyword decltype was introduced. It is functionally the same as typeof, but the case shown above works. It has been available since GCC 4.3 (2008) and Visual C++ 2010.

In practice, you’re often using this in conjunction with an assignment. So if you have C++11 available, you may as well just use an auto.
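For example:

    #include <vector>

    int main() {
        std::vector<int> Numbers(3);
        auto It = Numbers.begin();  // deduces std::vector<int>::iterator for us
        ++It;
        return 0;
    }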

*sigh*… I wish a certain company didn’t use GHS Multi, so I could … you know, use decltype and auto. ;)

4K, HDMI, and Deep Color

January 10th, 2014

As of this writing (January 2014), there are 2 HDMI specifications that support 4K Video (3840×2160 16:9). HDMI 1.4 and HDMI 2.0. As far as I know, there are currently no HDMI 2.0 capable TVs available in the market (though many were announced at CES this week).

Comparing HDMI 1.4 (Black text) and 2.0 (Orange). Source: HDMI 2.0 FAQ on HDMI.org

A detail that tends to be neglected in all this 4K buzz is Chroma Subsampling. If you've ever compared an HDMI signal against something else (DVI, VGA) and the quality looked worse, one of the reasons is Chroma Subsampling (for the other reason, see xvYCC at the bottom of this post).

Chroma Subsampling is extremely common. Practically every video you’ve ever watched on a computer or other digital video player uses it. As does the JPEG file format. That’s why we GameDevs prefer formats like PNG that don’t subsample. We like our source data raw and pristine. We can ruin it later with subsampling or other forms of texture compression (DXT/S3TC).

In the land of Subsampling, a descriptor like 4:4:4 or 4:2:2 is used. Images are broken up into 4×2 pixel cells, and the descriptor says how much color (chroma) data is lost. 4:4:4 is the perfect case: no chroma data is lost at all. Chroma Subsampling uses the YCbCr color space (sometimes called YCC), as opposed to the standard RGB color space.

Great subsampling diagram from Wikipedia, showing what the different encodings mean

Occasionally the term “4:4:4 RGB” or just “RGB” is used to describe the standard RGB color space. Also note, Component Video cables, though they are colored red, green, and blue, are actually YPbPr encoded (the Analog version of YCbCr).

Looking at the first diagram again, we can make a little more sense of it.

Comparing HDMI 1.4 (Black text) and 2.0 (Orange). Source: HDMI 2.0 FAQ on HDMI.org

In other words:

  • HDMI 1.4 supports 8bit RGB, 8bit 4:4:4 YCbCr, and 12bit 4:2:2 YCbCr, all at 24-30 FPS
  • HDMI 2.0 supports RGB and 4:4:4 in all color depths (8bit-16bit) at 24-30 FPS
  • HDMI 2.0 only supports 8bit RGB and 8bit 4:4:4 at 60 FPS
  • All other color depths require Chroma Subsampling at 60 FPS in HDMI 2.0
  • Peter Jackson’s 48 FPS (The Hobbit’s “High Frame Rate” HFR) is notably absent from the spec

Also worth noting, the most well supported color depths are 8bit and 12bit. The 12 bit over HDMI is referred to as Deep Color (as opposed to High Color).

The HDMI spec has supported only 4:4:4 and 4:2:2 since HDMI 1.0. As of HDMI 2.0, it also supports 4:2:0, which is available in HDMI 2.0's 60 FPS framerates. Blu-ray movies are encoded in 4:2:0, so I'd assume this is why they added this.

All this video signal butchering does beg the question: which is the better trade off? More color depth per pixel, or full color data for every pixel?

I have no idea.

If I were to guess though, because TVs aren't right in front of your face like a computer monitor, I'd expect 4K 4:2:2 might actually be better: greater luminance precision, with a bit of chroma fringing.

Some Plasma and LCD screens use something called a PenTile matrix arrangement of their red, green, and blue pixels.

The AMOLED screen of the Nexus One. A green for every pixel, but every other pixel is a blue, switching red/blue order every line. Not all AMOLED screens are PenTile; the Super AMOLED Plus screen found in the PS Vita uses a standard RGB layout.

So even if we wanted more color fidelity per individual pixel, it may not be physically there.

Deep Color

Me, my latest graphics fascination is Deep Color. Deep Color is the marketing name for more than 8 bits per color channel. It isn't necessarily something we need in asset creation (not me, but some do want full 16bit color channels). But as we start running filters/shaders on our assets, stuff like HDR (but more than that), we end up losing the quality of the original assets as they are re-sampled to fit into an 8bit RGB color space.

This can result in banding, especially in near flat color gradients.

From Wikipedia, though it’s possible the banding shown may be exaggerated

Photographers have RAW and HDR file formats for dealing with this stuff. We have Deep Color, in all its 30bit (10 bits per channel), 36bit (12 bits per channel), and 48bit (16 bits per channel) glory. Or really, just 36bit, though 48bit could be used as a RAW format if we wanted.

So the point of this nerding: An ideal display would be 4K, support 12bit RGB or 12bit YCbCr, at 60 FPS.

The thing is, HDMI 2.0 doesn’t support it!

Perhaps that’s fine though. Again, HDMI is a television spec. Most television viewers are watching video, and practically all video is 4:2:0 encoded anyway (which is supported by the HDMI 2.0 spec). The problem is gaming, where our framerates can reach 60FPS.

The HDMI 2.0 spec isn’t up to spec. ;)

Again, this is probably fine. Nobody is really pushing the now-current generation of consoles as 4K machines anyway. Sony may have 4K video playback support, but most high end games are still targeting 1080p and even 720p. 4K is 4x the pixels of 1080p. I suppose it's an advantage that 4K is limited to 30FPS right now, meaning you only need to push 2x the data to be a "4K game", but still.

HDMI Bandwidth is rated in Gigabits per second.

  • HDMI 1.0->1.2: ~4 Gb
  • HDMI 1.3->1.4: ~8 Gb
  • HDMI 2.0: ~14 Gb (NEW)

Not surprisingly, 4K 8bit 60FPS is ~12 Gb of data, and 30FPS is ~6 Gb of data. Our good friend 4K 12bit 60FPS though is ~18 Gb of data, well above the limits of HDMI 2.0.
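The math, if you want to check me (3 channels × bit depth per pixel):

    3840 × 2160 × (3 × 8 bits)  × 60 FPS ≈ 11.9 Gb/s
    3840 × 2160 × (3 × 12 bits) × 60 FPS ≈ 17.9 Gb/s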

To compare, DisplayPort:

  • DisplayPort 1.0 and 1.1: ~8 Gb
  • DisplayPort 1.2: ~17 Gb
  • DisplayPort 1.3: ~32 Gb (NEW)

They're claiming 8K and 4K@120Hz (FPS) support with the latest standard, but 18 doesn't divide that well into 32, so somebody has to have their numbers wrong (admittedly, I divided mine by 1000, not 1024). Also, since 8K is 4x the resolution of 4K and the bandwidth only roughly doubled, practically speaking DisplayPort 1.3 can only support 8K 8bit 30FPS. And that 4K@120Hz is 4K 8bit 120FPS. Still, if you don't want 120FPS, that leaves room for 4K 16bit 60FPS, which is more than needed (12bit would do). I wonder if anybody will support 4K 12bit 90FPS over DisplayPort?

And that’s 4K.

1080p and 2K Deep Color

Today 1080p is the dominant high resolution: 1920×1080. To the film guys, true 2K is 2048×1080, but there are a wide variety of devices in the same range, such as 2560×1600 and 2560×1440 (4x 720p). These, including 1080p, are often grouped under the label 2K.

A second of 1080p 8bit 60FPS data requires ~3 Gb of bandwidth, well within the range supported by the original HDMI 1.0 spec (though why we even had to deal with 1080i is a good question; probably an inability to even meet the HDMI spec).

To compare, a second of 1080p 12bit 60FPS data requires ~4.5 Gb of bandwidth. Even 1080p 16bit 60FPS needs only ~6 Gb, well within the range supported by HDMI 1.3 (where Deep Color was introduced). Plenty of headroom still. Only when we push 2560×1440 12bit 60FPS (~8 Gb) do we hit the limits of HDMI 1.3.

So from a specs perspective, I just wanted to note this because Deep Color and 1080p are reasonable to support on the now-current generation of game consoles. Even the PlayStation 3, by specs, supported this. High end games probably didn't have enough processing to spare for it, but it's definitely something to consider supporting on PlayStation 4 and Xbox One. As for PC, many current GPUs support Deep Color in full-screen modes. Again, full-screen, not necessarily on your Desktop (i.e. windowed). From what I briefly read, Deep Color on the Desktop is only supported by specialty cards (FirePro, etc).

One more thing: YCbCr (YCC) and xvYCC

You may have noticed watching a video file that the blacks don't look very black.

Due to a horrible legacy thing (CRT displays), data encoded as YCbCr uses values from 16->235 (16->240 for the chroma channels) instead of 0->255. That's quite the loss, roughly 14% of the available data range, effectively lowering the precision below 8bit. The only reason it's still done is because of old CRT televisions, which can be really tough to find these days. Regrettably, that does mean both of the original DVD and Blu-ray movie standards were forced to comply with this.

Sony proposed x.v.Color (xvYCC) as a way of finally forgetting this stupid limitation of old CRT displays, and using the full 0->255 range. As of HDMI 1.3 (June 2006), xvYCC and Deep Color are part of the HDMI spec.

Several months later (November 2006), the PlayStation 3 launched. So as a rule of thumb, only HDMI devices newer than the PlayStation 3 could potentially support xvYCC. This means televisions, audio receivers, other set top boxes, etc. It's worth noting that some audio receivers may actually clip video signals to the 16->235 range, ruining the picture quality of an xvYCC source. Also, the PS3 was eventually updated to HDMI 1.4 via a software update, but the only 1.4 feature supported is Stereoscopic 3D.

Source: Wikipedia.

The point of bringing this up is to further emphasize the potential for color banding and terrible color reproduction over HDMI. An 8bit RGB framebuffer is potentially being compressed to fit within the YCbCr 16->235 range before it gets sent over HDMI. The PlayStation 3 has a setting for enabling the full color range (I forget the name used), and other new devices probably do too (though it's unlikely to be named xvYCC).

According to Wikipedia, all of the Deep Color modes supported by HDMI 1.3 are xvYCC, as they should be.

Linux Device Input Notes

January 8th, 2014

I can use the following to list all attached USB devices.
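    lsusb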

The output will be something like the following.
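(Abbreviated, and the IDs shown are illustrative.)

    Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 001 Device 008: ID 1bad:f902 Logic3
    Bus 001 Device 009: ID 045e:02d1 Microsoft Corp.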

Take note of Device 009, i.e. "Microsoft Corp.". That's an Xbox One controller plugged in to a USB port.

(Also FYI, the device known as "Logic3" is a RockCandy brand Xbox 360 controller)

I can retrieve some data about the controller as follows:
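Something like this, reading the udev-provided device node (the bus/device numbers come from the listing above):

    sudo hexdump -C /dev/bus/usb/001/009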

Bus 001, Device 009.

It is not controller data though. Pressing buttons does not update any of the values.

Using 'lsusb -v' provides an interpretation of all attached devices (i.e. if you follow along, you'll note the data above is the same as the data below).

An article on creating Linux Drivers (kernel modules).

lsmod can be used to list all currently installed Kernel Modules.

The other option is communicating using usbfs, but usbfs is legacy and no longer enabled on Ubuntu. Instead, Ubuntu uses udev. udev is what's going on in the /dev/ folder: a dynamically generated file system of attached hardware.

If I do the following:
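(The device name is illustrative; poke around /dev/input/ for yours.)

    cat /dev/input/js0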

I’ll get a realtime stream of raw input data.

That’s all. Just thought this was interesting.

Quadruped Animation Test

January 6th, 2014

Yeah, the list of things needed to make that animation better is extensive (it's so floaty). Still, it's the first time I've ever animated a quadruped, and the first time I've ever seriously animated something in Spine.