{{FULL_COURSE}} Homework 4 - OpenGL Fun


Overview
-----
You will practice working with basic OpenGL constructs and functions by writing code to construct a cube out of vertex buffer objects. You will also program portions of OpenGL's graphics pipeline by writing a few different vertex and fragment shaders to apply different coloration effects to the surfaces of 3D models.

Supplied Code
---------
Click here to access the homework's Github repository. We will provide you with a basic Qt GUI that draws scenes using OpenGL, along with a user interface that allows you to switch between different shaders with which to color the surface of the object.

__Check your terminal window output when you run your code in order to see any shader program compilation errors. They will not appear in your C++ compiler output because the shaders are not compiled until the program is run.__

Conceptual Questions (Due Monday, October 5 at 11:59 PM)
-------------
Before you begin the programming portion of this homework assignment, read and answer the following conceptual questions. Your answers should be submitted on Canvas.

* (5 pts) In the OpenGL Shading Language (GLSL), what is a uniform variable? What is an "in" variable? What is an "out" variable? How does a vertex shader pass data to a fragment shader?
* (5 pts) The SurfaceShader class has several member variables of type int, such as attrPos and unifModel. What do these variables represent? How are they given values in the first place?

Help Log (5 points)
-------
Maintain a log of all help you receive and resources you use. Make sure to record the date and time, the names of everyone you work with or get help from, and every URL you use, except as noted in the collaboration policy. Also briefly log your question, bug, or the topic you were looking up or discussing. Ideally, you should also log the answer to your question or the solution to your bug. This will help you learn and provide a useful reference for future assignments and exams.
This also helps us know if there is a topic that people are finding difficult. If you did not use external resources or otherwise receive help, please submit a help log that states you did not receive external help. You may submit your help log as an ASCII (plain) text file or as a PDF. Refer to the Policies section of the course web site for more specifications.

Code Requirements (Due Wednesday, October 7 at 11:59 PM)
-------
### Surface Shader Programs
For the sections that require you to modify surface shader files, you can find the shaders under `Resources/glsl.qrc/glsl/surface` in Qt Creator's project file browser. Note that we have provided you with a working Lambertian shader filled with comments explaining each shader program element. You can find this program in `lambert.vert.glsl` and `lambert.frag.glsl`.

### Cube Vertex Buffers (12 points) ###
In the `Mesh` class, which can be found in `scene/mesh.h`, a function called `createCube` has been declared and partially defined. In the body of this function, write code that fills `std::vector`s with vertex data for the following attributes of a cube that spans the range [-1, 1] in the X, Y, and Z axes (in world space):

* Position (`glm::vec4`)
* Normal (`glm::vec4`)
* UV (`glm::vec2`)

To make coding your cube a little easier, we recommend writing the vertex attributes on a per-face basis, i.e. front quad, right quad, back quad, etc. Once you have made your collections of vertex data, you will have to triangulate your cube by making a `std::vector` of `GLuint`s (unsigned integers that are guaranteed to be 32 bits) and storing triangle indices in it. Make sure your cube's faces' surface normals are correct; this means you will need to create duplicate vertex positions in order to maintain a 1:1 ratio of position:normal. You may set the UVs of each cube face to span the range (0,0) to (1,1), like the UVs of the example square.
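To illustrate the per-face approach, here is a rough C++ sketch. The stand-in `Vec4`/`Vec2` structs and the helper name `appendFace` are illustrative only (your code will use `glm::vec4`, `glm::vec2`, and whatever structure fits the provided `createCube`):

```cpp
#include <array>
#include <vector>

// Minimal stand-ins for glm::vec4 / glm::vec2, for illustration only.
struct Vec4 { float x, y, z, w; };
struct Vec2 { float u, v; };

// Appends one cube face: four corner positions, one shared normal
// repeated four times (preserving the 1:1 position:normal ratio),
// UVs spanning (0,0)-(1,1), and two triangles indexed relative to
// the number of vertices already stored.
void appendFace(const std::array<Vec4, 4>& corners, const Vec4& normal,
                std::vector<Vec4>& pos, std::vector<Vec4>& nor,
                std::vector<Vec2>& uv, std::vector<unsigned int>& idx) {
    unsigned int base = static_cast<unsigned int>(pos.size());
    const Vec2 uvs[4] = {{0.f, 0.f}, {1.f, 0.f}, {1.f, 1.f}, {0.f, 1.f}};
    for (int i = 0; i < 4; ++i) {
        pos.push_back(corners[i]);
        nor.push_back(normal);  // duplicated for every vertex of this face
        uv.push_back(uvs[i]);
    }
    // Fan-triangulate the quad: (0,1,2) and (0,2,3), offset by base.
    const unsigned int tri[6] = {0, 1, 2, 0, 2, 3};
    for (unsigned int t : tri) {
        idx.push_back(base + t);
    }
}
```

Calling a helper like this once per face yields 24 vertices and 36 indices for the whole cube.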
We have already written code in `createCube` that sets up the vertex buffer objects for a square. Take this code and expand it to create a cube with the specifications above. You can display your cube by changing the Model dropdown menu to `Cube`.

### `SurfaceShader` Uniform Handle (5 points) ###
Add an integer member variable to the `SurfaceShader` class to act as a handle to a `uniform vec3` in a surface shader that represents your camera's position in world space. Use this variable in at least the Blinn-Phong shader to compute the lighting on the surface. Set this handle to your shader variable in `SurfaceShader::setupMemberVariables`, and update its value in `MyGL::paintGL`, just before the call to `Render3DScene()`. You might want to write a helper function in the style of `SurfaceShader::setModelMatrix` to update your camera position variable on the GPU. We want to use this uniform instead of the current implementation, which involves the inverse of our view matrix, because invoking the inverse function is costly.

### Blinn-Phong Reflection Shader (5 points) ###
Add code to the shader files `blinnPhong.vert.glsl` and `blinnPhong.frag.glsl` in order to implement Blinn-Phong reflection on the surface of the models provided. Blinn-Phong reflection is a method of emulating the specular highlights one sees on materials like plastic. To compute this shading effect, one needs to know the following information for a given pixel fragment:

* The vector from the fragment's world-space position to the camera, i.e. the view vector
* The vector from the fragment's world-space position to the light source (assuming a point light source)
* The surface normal at the fragment

Given these values, one can compute the intensity of the specular highlight for the current fragment using the following formula: `specularIntensity = max(pow(dot(H, N), exp), 0)`, where `H` is the average of the view vector and the light vector and `N` is the surface normal.
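Here is a CPU-side C++ sketch of that formula, using a minimal stand-in `Vec3` type in place of GLSL's built-in vectors (the real computation belongs in your fragment shader, where `normalize`, `dot`, `pow`, and `max` are built in):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Specular term from the formula above: H is the normalized average of
// the view and light vectors, N the surface normal, exponent the shininess.
float blinnPhongSpecular(Vec3 view, Vec3 light, Vec3 normal, float exponent) {
    view = normalize(view);
    light = normalize(light);
    Vec3 h = normalize({view.x + light.x, view.y + light.y, view.z + light.z});
    return std::fmax(std::pow(dot(h, normal), exponent), 0.0f);
}
```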
As always, both vectors should be normalized. The `exp` value is any number greater than 1; the higher the `exp`, the smaller and brighter the specular highlight. Once you have the specular highlight intensity, you can add it to a basic Lambertian shading calculation to achieve an effect like this:

Drawing

### Matcap Reflection Shader (5 points) ###
Add code to the shader files `matcap.vert.glsl` and `matcap.frag.glsl` in order to implement the "matcap" method of shading surfaces on the models provided. Matcap shading is most commonly used to give 3D models the appearance of a complex material with dynamic lighting without having to perform expensive lighting calculations. The implementation of the matcap technique is actually quite simple, yet with the right textures it can look photorealistic. Given a 2D texture like this:

Drawing

you can make an entire model appear to be made of reddish clay:

Drawing

All that needs to be done to achieve this effect is to map the (x, y) coordinates of the surface normal of a fragment to a point in the circular matcap texture. For example, if (x, y) is (-1, 0), that maps to the leftmost point of the circle. Sample the texture at this point to find the color of your surface.

### Iridescent Shader (5 points) ###
Add code to the shader files `gradient.vert.glsl` and `gradient.frag.glsl` in order to color your model using a procedurally-generated color palette based on cosine curves with different periods. Following the example discussed [here](http://www.iquilezles.org/www/articles/palettes/palettes.htm), create your own custom color palette. Then, map this palette to the surface of the provided models by using the Lambertian dot product as the `t` value in this formula: `color(t) = a + b * cos(2 * PI * (c * t + d))`.
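As a sketch, the palette formula can be evaluated once per channel. The `a`, `b`, `c`, `d` values below are just one illustrative palette, not a required one; in GLSL you would express the same thing with `vec3`s and a single `cos` call:

```cpp
#include <cmath>

struct Color { float r, g, b; };

// Cosine palette from the formula above, evaluated per channel:
// color(t) = a + b * cos(2 * PI * (c * t + d)).
Color cosinePalette(float t) {
    const float PI = 3.14159265358979f;
    // Illustrative palette parameters; pick your own.
    const Color a = {0.5f, 0.5f, 0.5f};
    const Color b = {0.5f, 0.5f, 0.5f};
    const Color c = {1.0f, 1.0f, 1.0f};
    const Color d = {0.0f, 0.33f, 0.67f};
    return {a.r + b.r * std::cos(2.f * PI * (c.r * t + d.r)),
            a.g + b.g * std::cos(2.f * PI * (c.g * t + d.g)),
            a.b + b.b * std::cos(2.f * PI * (c.b * t + d.b))};
}
```

Feeding the Lambertian dot product in as `t` then sweeps the surface through the palette as the surface turns away from the light.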
Your models will look similar to this image (but may be different due to your chosen palette or `t` mapping):

Drawing

### Custom vertex deformation shader (15 points) ###
Modify `deform.vert.glsl` and `deform.frag.glsl` so that the shader non-uniformly modifies the positions of the provided models' vertices as a function of time, and also non-uniformly modifies the fragments' colors as a function of time. This means the model should not simply transition between one overall color and another; different fragments should have different colors in the same frame. As shown in class, one inspiration for your shader might be to use a cosine-curve palette as a color basis, and alter your fragment color based on world-space surface normal orientation. Then, you could modify the vertex shader to interpolate between some "fixed" form, like a sphere, and the model's normal form.

You can access the current time through the `u_Time` uniform, which is passed an ever-increasing value in `paintGL`. We encourage you to look up the different functions available to you in GLSL; there are more convenience functions than you might think. For instance, there is a function called `mix`, which linearly interpolates between two values. There is also a similar function called `smoothstep`, which interpolates between two values along a Hermite curve (effectively, it "eases in" and "eases out" near the extremes of the interpolation). Have fun playing around with different combinations of functions; we want you to be creative!

For a more "procedural" appearance, you could even use a deterministic noise function as the basis for color or vertex offset. For example, the following code produces a "random" float with values between 0 and 1 given an input vec3, but always produces the same output for the same input:
```glsl
float mod289(float x){return x - floor(x * (1.0 / 289.0)) * 289.0;}
vec4 mod289(vec4 x){return x - floor(x * (1.0 / 289.0)) * 289.0;}
vec4 perm(vec4 x){return mod289(((x * 34.0) + 1.0) * x);}

float noise(vec3 p) {
    vec3 a = floor(p);
    vec3 d = p - a;
    d = d * d * (3.0 - 2.0 * d);

    vec4 b = a.xxyy + vec4(0.0, 1.0, 0.0, 1.0);
    vec4 k1 = perm(b.xyxy);
    vec4 k2 = perm(k1.xyxy + b.zzww);

    vec4 c = k2 + a.zzzz;
    vec4 k3 = perm(c);
    vec4 k4 = perm(c + 1.0);

    vec4 o1 = fract(k3 * (1.0 / 41.0));
    vec4 o2 = fract(k4 * (1.0 / 41.0));

    vec4 o3 = o2 * d.z + o1 * (1.0 - d.z);
    vec2 o4 = o3.yw * d.x + o3.xz * (1.0 - d.x);

    return o4.y * d.y + o4.x * (1.0 - d.y);
}
```
[Code source](https://gist.github.com/patriciogonzalezvivo/670c22f3966e662d2f83)

Here is a recording of what we created by playing around with functions:

Drawing

### Post-Process Shader Programs
For the sections that require you to modify post-process shader files, you can find the shaders under `Resources/glsl.qrc/glsl/post` in Qt Creator's project file browser.

### Greyscale and vignette shader (7 points) ###
Add code to the shader file `greyscale.frag.glsl` in order to implement a post-process effect that converts your 3D scene to greyscale and applies a vignette effect to the screen edges. Converting a color to greyscale is actually quite simple; you just take a weighted average of the red, green, and blue channels of the original color using this formula: `grey = 0.21 * red + 0.72 * green + 0.07 * blue`. You'll notice that the green channel has the highest contribution of the three by a large margin; this is because the human eye is most sensitive to green light, so we perceive changes in green much more easily. If you remember the CIE color gamut of light perceived by humans, you'll notice that the green area is far larger than the red or blue areas, and that the blue area is the smallest:

![](CIExy1931.png)

Applying a vignette effect is also fairly simple. You want to find the distance at which a fragment lies from the screen's center, and use that as the input to a function that alters the brightness of the image. We'll leave the exact implementation up to you to determine, but here is a render we produced using greyscale conversion and vignetting:

Drawing

### Gaussian Blur Shader (7 points) ###
Add code to the shader file `gaussian.frag.glsl` in order to implement a post-process effect that blurs your 3D scene. A Gaussian blur effectively performs a weighted average of NxN pixels and stores the result in the pixel at the center of that NxN box (this means N must always be odd). The larger the blur radius, the smoother the blur will be.
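Here is a CPU-side C++ sketch of that weighted average. The flat, row-major `kernel` array and the edge-clamping behavior are illustrative assumptions; in your shader you will sample the rendered texture rather than index an image array:

```cpp
#include <algorithm>
#include <vector>

// Weighted average of the N x N neighborhood around pixel (px, py),
// with N odd. The kernel is a flat array of N*N weights indexed
// row-major, and `image` is a W x H single-channel image, also stored
// row-major. Out-of-bounds samples clamp to the nearest edge pixel.
float blurAt(const std::vector<float>& image, int w, int h,
             int px, int py, const std::vector<float>& kernel, int n) {
    int half = n / 2;
    float sum = 0.f;
    for (int ky = 0; ky < n; ++ky) {
        for (int kx = 0; kx < n; ++kx) {
            int sx = std::min(std::max(px + kx - half, 0), w - 1);
            int sy = std::min(std::max(py + ky - half, 0), h - 1);
            sum += kernel[ky * n + kx] * image[sy * w + sx];
        }
    }
    return sum;
}
```

The `kernel[ky * n + kx]` lookup is the same flat-array indexing described below for the 11x11 kernel.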
Additionally, altering the weighting of the blur will increase or decrease its intensity. If you take a look at slides 23 - 24 in the procedural color slides, you'll have a better idea of how a Gaussian blur works. In order to achieve the image below, create a const array of 121 floats just __above__ `main()`, hard-coding into it the kernel values from the Gaussian kernel generator [here](http://dev.theomader.com/gaussian-kernel-calculator/). Index into this array the same way you did your Z-buffer in hw03 to treat it like an 11x11 array. We used a sigma value of 9 in the generator linked above.

Drawing

### Sobel Filter Shader (7 points) ###
Add code to the shader file `sobel.frag.glsl` in order to implement a post-process effect that detects and enhances the edges of shapes in your 3D scene. A Sobel filter effectively computes the approximate gradient (i.e. slope) of the color at each pixel; where the color changes abruptly it returns a high value, and otherwise it returns roughly black, i.e. a slope of zero. To compute a Sobel filter, you need two kernels: one for computing the horizontal gradient, and one for computing the vertical gradient:

```
Horizontal = |  3   0   -3 |
             | 10   0  -10 |
             |  3   0   -3 |

Vertical =   |  3  10    3 |
             |  0   0    0 |
             | -3 -10   -3 |
```

Multiply each of these kernels element-wise with the 3x3 set of pixels surrounding a pixel and sum the products to compute its gradients. Each gradient will be a color represented as a vec3. Once you have your horizontal and vertical color gradients, square them, sum them, and set the output of your shader to the square root of that sum. This is what we rendered using the Sobel kernels above:

Drawing

### Fake Bloom Shader (7 points) ###
Add code to the shader file `bloom.frag.glsl` in order to implement a post-process effect that applies a "bloom" effect to your 3D scene.
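Stepping back to the Sobel filter for a moment, here is a CPU-side C++ sketch of the gradient computation on a single-channel image (your shader applies the same kernels to vec3 colors; the edge clamping here is an illustrative assumption):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Applies the horizontal and vertical kernels from the Sobel section
// to the 3x3 neighborhood of pixel (px, py) in a W x H single-channel,
// row-major image, then returns the gradient magnitude
// sqrt(gx*gx + gy*gy). Out-of-bounds samples clamp to the edges.
float sobelAt(const std::vector<float>& image, int w, int h, int px, int py) {
    const float horiz[9] = {3, 0, -3, 10, 0, -10, 3, 0, -3};
    const float vert[9]  = {3, 10, 3, 0, 0, 0, -3, -10, -3};
    float gx = 0.f, gy = 0.f;
    for (int ky = 0; ky < 3; ++ky) {
        for (int kx = 0; kx < 3; ++kx) {
            int sx = std::min(std::max(px + kx - 1, 0), w - 1);
            int sy = std::min(std::max(py + ky - 1, 0), h - 1);
            float v = image[sy * w + sx];
            gx += horiz[ky * 3 + kx] * v;
            gy += vert[ky * 3 + kx] * v;
        }
    }
    return std::sqrt(gx * gx + gy * gy);
}
```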
Normally, one would actually use two post-process passes to properly generate a bloom effect, but we're going to fake it for this assignment. To implement bloom, one needs to take a weighted average of all the pixels surrounding the current pixel, but only factor a pixel into the average if its luminance is above a given threshold. The higher the threshold, the smaller the bloom effect. Once one has acquired this weighted average, one just adds it to the base color of the current pixel. If you'd like to read more about bloom, take a look at slide 25 of the procedural color slide deck. Here is the fake bloom we achieved using an 11x11 Gaussian kernel with a sigma of 9 and a luminance threshold of 0.6:

Drawing

### Custom noise-based post-process shader (15 points) ###
Modify `worleywarp.frag.glsl` so that the shader uses Worley noise to modify the 3D scene by warping the UV coordinates of the screen-space render and by modifying the image color. We encourage you to be creative with your post-process effect; your score in this section is partially dependent on how complex your effect is. The more visually interesting (and perhaps aesthetically pleasing) your images, the more points you'll receive. For inspiration, here are some effects we achieved using Worley noise as our base:

Drawing Drawing Drawing Drawing

Here are some ideas for how to use noise to modify your image:

* Take the gradient of the noise (i.e. sample the noise at +/-1 X and +/-1 Y and take the difference) and use it as a directional vector for UV displacement
* Treat the gradient as a surface normal (i.e. use the X and Y gradients as the X and Y of a normal, and set the normal's Z to `sqrt(1 - x*x - y*y)`) and use the normal in a surface reflection function (e.g. Blinn-Phong)
* Use the noise as the `t` input to a cosine-curve gradient and modify the image's original color with the palette color
* Sample the image's original color at each of the random points' locations and use that color to influence the color of all fragments that belong to a given random point

Additionally, here are some general post-process effect ideas:

* Remember, you can re-use and combine effects from your other shaders!
* Chromatic aberration, which reads from the red, green, and blue channels of the original image at slightly different UV coordinates. This produces three different images overlaid on each other, one for each channel.
* You can extend chromatic aberration to Hue/Saturation/Value space if you wish. You can convert an RGB color to HSV, but we'll let you look up the formula if you want to use it.
* Use a sine or cosine curve to apply a CRT-TV effect to the entire screen; darken or brighten rows of pixels based on the value of the curve. The higher the frequency of your curve, the smaller and more frequent the lines become.
* You can find additional inspiration from [ShaderToy](https://www.shadertoy.com/).

Coding Style (10 points)
-------
We will provide you with feedback on the organization and clarity of the code you have written for this assignment. Please refer to our course style guide as you implement your shaders and VBOs.

Extra Credit (Maximum 25 points)
---------
We will grant you extra credit if we deem your custom surface deformation shader to be particularly complex or interesting. Take time to experiment with different functions, both in your vertex shader and fragment shader, to produce visually interesting results. If you want to spend the time, you could even hard-code some sort of artistically-driven animation of your mesh based on the `u_Time` variable that all your shaders support. You may also create additional surface or post-process effect shaders for extra credit.
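As one example of the chromatic aberration idea mentioned above, here is a CPU-side C++ sketch that reads each channel at a slightly different horizontal position. The per-channel offset direction is an arbitrary choice, and in your shader you would offset UV coordinates into the rendered texture rather than pixel indices:

```cpp
#include <vector>

struct Rgb { float r, g, b; };

// Samples each channel of a W x H row-major image at a slightly
// different horizontal position: red is shifted left, blue right,
// green stays put. Out-of-range reads clamp to the image edges.
Rgb aberrateAt(const std::vector<Rgb>& image, int w, int h,
               int px, int py, int offset) {
    auto clampX = [w](int x) { return x < 0 ? 0 : (x >= w ? w - 1 : x); };
    float r = image[py * w + clampX(px - offset)].r;
    float g = image[py * w + px].g;
    float b = image[py * w + clampX(px + offset)].b;
    return {r, g, b};
}
```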
The more complex or visually interesting the effect, the more points you'll receive. We suggest playing with signed distance functions and modifications of various noise functions if you don't have any immediate ideas. To add new shaders, you will have to add more `.glsl` files to your project. Add additional options to the shader drop-down menus in `shadercontrols.ui`, add your new `.glsl` files to `glsl.qrc`, and add code to `MyGL::createShaders` to add your new shader to `MyGL`'s list of shader programs. If you are interested in playing with signed distance functions, [here is a useful resource](http://iquilezles.org/www/articles/raymarchingdf/raymarchingdf.htm). Here is a shader we created by rendering the red, green, and blue channels of the image at different offsets:

Drawing

Submission
--------
We will grade the code you have submitted to Canvas, so make sure you zip up your entire project! Also make sure to commit all of your files to Github, and add a comment on your Canvas submission with a link to your repository.