diff --git a/README.md b/README.md index 7f170cdb2..2038d7b8a 100755 --- a/README.md +++ b/README.md @@ -1,72 +1,49 @@ Assignment 4 - Creative Coding: Interactive Multimedia Experiences === -Due: September 27th, by 11:59 PM. - For this assignment we will focus on client-side development using popular audio/graphics/visualization technologies; the server requirements are minimal. The goal of this assignment is to refine our JavaScript knowledge while exploring the multimedia capabilities of the browser. -Baseline Requirements --- - -Your application is required to implement the following functionalities: - -- A server created using Express (you can also use an alternative server framework such as Koa) for basic file delivery and middleware. Your middleware stack should include the `compression` and `helmet` [middleware](https://expressjs.com/en/resources/middleware.html) by default. You are not required to use Glitch for this assignment (but using Glitch is fine!); [Heroku](https://www.heroku.com) is another excellent option to explore. The course staff can't be responsible for helping with all other hosting options outside of Glitch, but some of us do have experience with other systems. It also never hurts to ask on Slack, as there are 99 other classmates who might have the experience you're looking for! -- A client-side interactive experience using at least one of the web technologies/frameworks we discussed in class over the past week. - - [Three.js](https://threejs.org/): A library for 3D graphics / VR experiences - - [D3.js](https://d3js.org): A library that is primarily used for interactive data visualizations - - [Canvas](https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API): A 2D raster drawing API included in all modern browsers - - [SVG](https://developer.mozilla.org/en-US/docs/Web/SVG): A 2D vector drawing framework that enables shapes to be defined via XML.
- - [Web Audio API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API): An API for audio synthesis, analysis, processing, and file playback. -- A user interface for interaction with your project, which must expose at least six parameters for user control. [dat.gui](https://workshop.chromeexperiments.com/examples/gui/#1--Basic-Usage) is highly recommended for this. You might also explore interaction by tracking mouse movement via the `window.onmousemove` event handler in tandem with the `event.clientX` and `event.clientY` properties. Consider using the [Pointer Events API](https://developer.mozilla.org/en-US/docs/Web/API/Pointer_events) to ensure that mouse and touch events will both be supported in your app. -- Your application should display basic documentation for the user interface when the application first loads. This documentation should be dismissible; however, users should be able to redisplay it via either a help button (this could, for example, be inside a dat.gui interface) or via a keyboard shortcut (commonly the question mark). -- Your application should feature at least two different ES6 modules that you write ([read about ES6 modules](https://www.sitepoint.com/understanding-es6-modules/)) and import into a main JavaScript file. This means that you will need to author *at least three JavaScript files* (an `app.js` or `main.js` file and two modules). We'll discuss modules in class on Monday 9/23; for this assignment modules should contain at least two functions. -- You are required to use a linter for your JavaScript. There are plugins for most IDEs; however, it will be difficult to run the linter directly in Glitch. If you haven't moved to developing on your personal laptop and then uploading to Glitch when your project is completed, this is the assignment to do so! -- Your HTML and CSS should validate. There are options/plugins for most IDEs to check validation.
-The interactive experience should possess a reasonable level of complexity. Some examples: -### Three.js -- A generative algorithm creates simple agents that move through a virtual world. Your interface controls the behavior / appearance of these agents. -- A simple 3D game -- A 3D audio visualization of a song of your choosing. User interaction should control aspects of the visualization. -### Canvas -- Implement a generative algorithm such as [Conway's Game of Life](https://bitstorm.org/gameoflife/) (or 1D cellular automata) and provide interactive controls. Note that the Game of Life has been implemented by hundreds of people; we'll be checking to ensure that your implementation is not a copy of these. -- Design a 2D audio visualizer of a song of your choosing. User interaction should control visual aspects of the experience. -### Web Audio API -- Create a screen-based musical instrument using the Web Audio API. You can use projects such as [Interface.js](http://charlie-roberts.com/interface/) or [Nexus UI](https://nexus-js.github.io/ui/api/#Piano) to provide common musical interface elements, or use dat.GUI in combination with mouse/touch events (use the Pointer Events API). Your GUI should enable users to control aspects of sound synthesis. -### D3.js -- Create visualizations using the datasets found at [Awesome JSON Datasets](https://github.com/jdorfman/Awesome-JSON-Datasets). Experiment with providing different visualizations of the same data set, and providing users interactive control over visualization parameters and/or data filtering. Alternatively, create a single visualization using one of the more complicated techniques shown at [d3js.org](https://d3js.org) and provide meaningful points of interaction for users. +## A Digital Visualization Portfolio +Link: http://a4-petrakumi12.glitch.me -Deliverables --- +Include a very brief summary of your project here. Images are encouraged, along with concise, high-level text.
Be sure to include: -Do the following to complete this assignment: +- The goal of this application is to provide a single source for multiple types of data visualization, to serve as a personal portfolio. +- This application contains both adaptations of code snippets I found online, which I modified to fit my needs (the main page background and the WebGL sound visualization), and visualizations I made from scratch (the fractal and the canvas sound visualization). In both cases, I made sure to understand every part of the code that I included. +- I included the online snippets because they helped me understand how these libraries work, making it easier to implement my own functionality afterwards. +- The biggest challenge I faced was getting started with the visualization libraries, as I did not have any prior experience with any of them. -1. Implement your project with the above requirements. -3. Test your project to make sure that when someone goes to your main page on Glitch/Heroku/etc., it displays correctly. -4. Ensure that your project has the proper naming scheme `a4-yourname` so we can find it. -5. Fork this repository and modify the README to the specifications below. *NOTE: If you don't use Glitch for hosting (where we can see the files) then you must include all project files that you author in your repo for this assignment*. -6. Create and submit a Pull Request to the original repo. Name the pull request using the following template: `a4-gitname-firstname-lastname`. +- I used ESLint through WebStorm to lint my JavaScript, using the automatic configuration. I used WebStorm's built-in validator for HTML and CSS validation. -Sample Readme (delete the above when you're ready to submit, and modify the below with your links and descriptions) --- + +**Note**: I added two sample songs to the Songs folder that you can download to try the audio visualizers.
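The audio visualizers mentioned above are driven by amplitude analysis. As a rough, hedged sketch of the idea (illustrative code only, not the project's actual modules), each frame of audio samples can be reduced to a single RMS value that then drives a visual parameter:

```javascript
// Illustrative sketch: reduce one frame of audio samples (values in [-1, 1])
// to an RMS amplitude in [0, 1]. In a real app this frame would come from
// an AnalyserNode each animation tick.
function rmsAmplitude(samples) {
  let sumOfSquares = 0;
  for (const s of samples) sumOfSquares += s * s;
  return Math.sqrt(sumOfSquares / samples.length);
}

// Map the amplitude onto a visual parameter, e.g. a circle radius
// between minR and maxR (clamped so loud frames don't overshoot).
function amplitudeToRadius(amp, minR, maxR) {
  return minR + (maxR - minR) * Math.min(1, amp);
}

const frame = [0, 0.5, -0.5, 0.5, -0.5, 0]; // fake samples for demonstration
console.log(amplitudeToRadius(rmsAmplitude(frame), 10, 100));
```

In the browser, the same mapping would simply run once per `requestAnimationFrame` with fresh analyser data.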
-## Your Web Application Title +## Technical Achievements +- **Tech Achievement 1**: Used Canvas, WebGL, D3, and Three.js in my visualizations. +- **Tech Achievement 2**: My audio visualizers use amplitude analysis to drive the visualization. +- **Tech Achievement 3**: I experimented with custom fragment shaders for color generation based on sound amplitude. +- **Tech Achievement 4**: I visualized a dataset of characters in Thor that appear in the same scene together, using a correlation circle built with D3; hovering over the diagram shows more information about the data. +- **Tech Achievement 5**: I added start/stop functionality to the fractal generator, which users can press to pause the generation. +- **Tech Achievement 6**: I added dat.gui to the fractal generator, allowing the user to change the max radius, min radius, and color of the circles. A color change applies on top of the current image, while new radii clear the context and restart the visualization from scratch. +- **Tech Achievement 7**: I used dat.gui to control characteristics of the Three.js TorusKnotGeometry. +- **Tech Achievement 8**: The Three.js renderer on the main page automatically resizes when the window size changes. +- **Tech Achievement 9**: Used ES6 modules to generate the D3 graph and to set up audio analysis for the input file; I then reused the latter in both audio visualizers. +- **Tech Achievement 10**: Allowed the user to click anywhere to dismiss the help text, and to click back on the Typed cursor to regenerate the animation. +- **Tech Achievement 11**: Generated color gradients for the linear sound visualizer, which change color based on the amplitude of the sound. -your hosting link e.g. http://a4-charlieroberts.glitch.me -Include a very brief summary of your project here. Images are encouraged, along with concise, high-level text.
Be sure to include: +### Design/Evaluation Achievements +- **Design Achievement 1**: Used the Typed library on the main page for quick help on how to navigate the website. +- **Design Achievement 2**: Used Bootstrap, with some modifications, for the main page layout, including a parallax-style effect. -- the goal of the application -- challenges you faced in realizing the application -- a brief description of the JS linter you used and what rules it follows (we'll be looking at your JS files for consistency) -## Technical Achievements -- **Tech Achievement 1**: I wrote my own custom GLSL shaders to use as a material for my Three.js objects. -- **Tech Achievement 2**: My audio visualizer uses both FFT and amplitude analysis to drive visualization. -- **Tech Achievement 3**: I optimized the efficiency of my reaction-diffusion algorithm by... -- **Tech Achievement 4**: I visualized the dataset X using three different visualization techniques provided by D3, and provided -### Design/Evaluation Achievements -- **Design Achievement 1**: I ensured that my application would run on both desktops / mobile devices by changing X -- **Design Achievement 2**: I followed best practices for accessibility, including providing alt attributes for images and using semantic HTML. There are no `
` or `` elements in my document. -- **Design Achievement 3**: We tested the application with n=X users, finding that... diff --git a/Songs/Louder but still cool - Tom Odell - Magnetised (Official Video).mp3 b/Songs/Louder but still cool - Tom Odell - Magnetised (Official Video).mp3 new file mode 100644 index 000000000..7335fbc00 Binary files /dev/null and b/Songs/Louder but still cool - Tom Odell - Magnetised (Official Video).mp3 differ diff --git a/Songs/More variety of amplitudes - Tokio Myers - Bloodstream (Official Audio).mp3 b/Songs/More variety of amplitudes - Tokio Myers - Bloodstream (Official Audio).mp3 new file mode 100644 index 000000000..2535c7aec Binary files /dev/null and b/Songs/More variety of amplitudes - Tokio Myers - Bloodstream (Official Audio).mp3 differ diff --git a/app.js b/app.js new file mode 100644 index 000000000..3140d6c34 --- /dev/null +++ b/app.js @@ -0,0 +1,23 @@ +const express = require('express'), + bodyParser = require('body-parser'), + helmet = require('helmet'), + compression = require('compression'); + + +const app = express(); +app.use(helmet()); // protects server +app.use(compression()); // will compress all responses + + +app.use(bodyParser.json()); +app.use(bodyParser.urlencoded({ extended: true })); +app.use(express.static("public")); +app.set('view engine', 'html'); + +app.get('/', function (req, res) { + res.sendFile(__dirname + '/index.html'); +}); + + + +app.listen(3000, () => console.log('Listening on port 3000')); \ No newline at end of file diff --git a/index.html b/index.html new file mode 100644 index 000000000..94c14aed2 --- /dev/null +++ b/index.html @@ -0,0 +1,91 @@ + + + + + + + + Digital Visualization Portfolio + + + + + + + + +
+
+
+
This is a
+
+ Digital Visualization Portfolio +
+
+ +
+
+ +
+

Click on a Card to Learn More

+
+
+
+
+

Data Visualization With D3

+ + +
+
+
+
+
+
+

Sound Visualization With Canvas

+ + +
+
+
+
+ +
+
+
+
+

Sound Visualization With WebGL

+
+
+
+
+
+
+

Generating Fractals With Canvas

+
+
+
+
+ + + +
+ + + + + + + + + + + \ No newline at end of file diff --git a/public/2DVizPage.html b/public/2DVizPage.html new file mode 100644 index 000000000..65832bbb3 --- /dev/null +++ b/public/2DVizPage.html @@ -0,0 +1,26 @@ + + + + + MusicViz + + + + + +
+ +
+ +
+ +
+ + + + + + + \ No newline at end of file diff --git a/public/3DVizPage.html b/public/3DVizPage.html new file mode 100644 index 000000000..1f496b876 --- /dev/null +++ b/public/3DVizPage.html @@ -0,0 +1,59 @@ + + + + + 3D Music Visualization + + + + + + + + + + + + +
+ +
+ +
+ +
+ + + + + \ No newline at end of file diff --git a/public/css/3DVizStyle.css b/public/css/3DVizStyle.css new file mode 100644 index 000000000..559a6e015 --- /dev/null +++ b/public/css/3DVizStyle.css @@ -0,0 +1,9 @@ +canvas { + position: absolute; + top: 0; + left: 0; + width: 100%; + height: 100%; + background-color: black; + z-index: -1; +} diff --git a/public/css/allStyles.css b/public/css/allStyles.css new file mode 100644 index 000000000..c0a872c93 --- /dev/null +++ b/public/css/allStyles.css @@ -0,0 +1,5 @@ +@import url('https://fonts.googleapis.com/css?family=Roboto:300'); +body { + padding: 100px; + font-family: 'Roboto', monospace !important; +} \ No newline at end of file diff --git a/public/css/d3Style.css b/public/css/d3Style.css new file mode 100644 index 000000000..b1e0f62cb --- /dev/null +++ b/public/css/d3Style.css @@ -0,0 +1,57 @@ +/*div {*/ +/* width: 100px;*/ +/* height: 100px;*/ +/* !*position: absolute;*!*/ +/* !*overflow: visible;*!*/ +/*}*/ + +/*svg {*/ +/* width: 50vw;*/ +/* height: 20vh;*/ +/*}*/ + +h1 { + text-align: center; + font-style: oblique; + padding-bottom: 70px; +} + +div { + justify-content: center; + align-content: center; +} + +td { + height: 600px; +} + +svg { + height: 600px; +} + + +.vizTable { + width: 100%; + text-align: center; + vertical-align: middle; +} + +.visTitle { + padding-left: 10px; + padding-right: 10px; + max-width: 30vw; + background-color: gold; +} + +.blue { + background-color: #5fbdf6; +} + + +#my_dataviz { + justify-self: right; + align-self: center; + text-align: right; + vertical-align: middle; + min-width: 60vw; +} \ No newline at end of file diff --git a/public/css/fractalStyle.css b/public/css/fractalStyle.css new file mode 100644 index 000000000..a083ba4ca --- /dev/null +++ b/public/css/fractalStyle.css @@ -0,0 +1,46 @@ +@import url('https://fonts.googleapis.com/css?family=Roboto:300'); + +body{ + margin: 2px; + padding: 0; + font-family: 'Roboto', monospace !important; +} + +#fractalHolder { +
width: 99%; + height: 99%; +} + +#tempText { + color: ghostwhite; + position: absolute; + font-family:"Roboto", monospace !important; + left: 60px; +} + +canvas { + background-color: black; +} + +button { + top: 0; + left: 0; + padding: 15px; + position: absolute; + overflow: visible; + background-color: rgba(0,0,0,0); + border: none; + color: lightgrey; +} + +button:active { + border: none; + box-shadow: none; + color: gold; +} + +.fas { + font-size: 200px; + color: lightgrey; + text-outline: 2px lightgrey; +} \ No newline at end of file diff --git a/public/css/homeStyle.css b/public/css/homeStyle.css new file mode 100644 index 000000000..5cf23f37b --- /dev/null +++ b/public/css/homeStyle.css @@ -0,0 +1,47 @@ +@import url('https://fonts.googleapis.com/css?family=Roboto:300'); + +.homeContentSection { + background-color: ghostwhite; + margin-top: 50vh; + padding: 8vh 3vh; + /*height: 50vh;*/ + font-family: Roboto, monospace; + align-content: center; + text-align: center; +} + +h2 { + padding-bottom: 5vh; +} + +canvas { + width: 100vw; + height: 100vh; + display: block; + position: fixed; + top: 0; + left: 0; + z-index: -9999; +} + +.card{ + padding-top: 5vh; + margin: 0.5vh; + list-style: none; + position: relative; + min-height: 20vh; + filter: brightness(0.75) saturate(1.2) contrast(0.85); + transform-origin: center; + transition: + filter 100ms linear, + transform 100ms linear; + background-color: rgba(114, 126, 203, 0.8); + align-content: center; + text-align: center; + vertical-align: center; +} + +.card:hover { + transform: scale(1.03) translateZ(0); +} + diff --git a/public/css/musicVizStyle.css b/public/css/musicVizStyle.css new file mode 100644 index 000000000..e4e3fc63b --- /dev/null +++ b/public/css/musicVizStyle.css @@ -0,0 +1,40 @@ + + +#out { + position: absolute; + /*z-index: 1;*/ + top: 0; + left: 0; + width: 100%; + height:100%; + display: flex; + /*align-items: center;*/ + /*justify-content: center;*/ + opacity: 1; + background-color: #000000; + 
color: #ffffff; +} + +canvas { + position: absolute; + top: 0; + left: 0; + /*z-index: -1;*/ +} + +/*#out > div {*/ +/* text-align: center;*/ +/*}*/ +/*#out > div > button {*/ +/* height: 20px;*/ +/* width: 100px;*/ +/* background: transparent;*/ +/* color: #ffffff;*/ +/* outline: 1px solid #ffffff;*/ +/* border: 0px;*/ +/* cursor: pointer;*/ +/*}*/ +/*#out > div > p {*/ +/* color: #777777;*/ +/* font-size: 12px;*/ +/*}*/ \ No newline at end of file diff --git a/public/css/titleStyle.css b/public/css/titleStyle.css new file mode 100644 index 000000000..9ddeab95e --- /dev/null +++ b/public/css/titleStyle.css @@ -0,0 +1,59 @@ +.titleSection { + text-align:center; + color:ghostwhite; + font-family:"Roboto", monospace; + font-weight:300; + font-size:32px; + padding-top:40vh; + /*height:100vh;*/ + /*overflow:hidden;*/ + -webkit-backface-visibility: hidden; + -webkit-transform: translate3d(0,0,0); +} + +.titleDiv{ + display:inline-block; + overflow:hidden; + white-space:nowrap; +} + + +.titleDiv:last-of-type span { + margin-left:0; + animation: slidein 3s; +} + +#typed { + font-size: 24px; +} + +@keyframes showup { + 0% {opacity:0;} + 20% {opacity:1;} + 80% {opacity:1;} + 100% {opacity:0;} +} + +@keyframes slidein { + from { + margin-left:-800px; + } + to { + margin-left:0; + } +} + +@keyframes reveal { + 0% {opacity:0;width:0;} + 20% {opacity:1;width:0;} + 30% {width:450px;} + 80% {opacity:1;} + 100% {opacity:1;width:355px;} +} + + +p { + font-size:12px; + color:#999; + margin-top:200px; +} \ No newline at end of file diff --git a/public/d3Page.html b/public/d3Page.html new file mode 100644 index 000000000..a535efd1e --- /dev/null +++ b/public/d3Page.html @@ -0,0 +1,60 @@ + + + + + D3Page + + + + + + + + + + +

Hover over the diagram to learn more about the data!

+
+ + + + + +
+
+

Number of times every character in Thor has interacted with every other character

+
+
+
+ + +
+
+
+
+ + + + + + + + + + + + + + +
+
+ + + + + + + + + + \ No newline at end of file diff --git a/public/fractalPage.html b/public/fractalPage.html new file mode 100644 index 000000000..a52843737 --- /dev/null +++ b/public/fractalPage.html @@ -0,0 +1,24 @@ + + + + + Fractal Generator + + + + +
+

+ +
+
+ + + + + + + + + + \ No newline at end of file diff --git a/public/js/d3.js b/public/js/d3.js new file mode 100644 index 000000000..fec02072b --- /dev/null +++ b/public/js/d3.js @@ -0,0 +1,64 @@ +import {generateCircleGraph} from "./generateCircleGraphModule.js"; +let innerRadius = 200; + +// //The symmetric matrix about movie collaborations between the Avengers +let matrix = [ + [0,4,3,2,5,2], //Black Widow + [4,0,3,2,4,3], //Captain America + [3,3,0,2,3,3], //Hawkeye + [2,2,2,0,3,3], //The Hulk + [5,4,3,3,0,2], //Iron Man + [2,3,3,3,2,0], //Thor +]; + +let data1 = { + matrix: matrix, + indexByName: { + "Black Widow": 0, + "Captain America": 1, + "Hawkeye": 2, + "The Hulk": 3, + "Iron Man": 4, + "Thor": 5 + }, + nameByIndex: { + 0: "Black Widow", + 1: "Captain America", + 2: "Hawkeye", + 3: "The Hulk", + 4: "Iron Man", + 5: "Thor" + } +}; + +let colors = [ "#440154ff", "#407ea4", "#1ab530", "#fdb021", "#31668dff","#fde735ff"]; + + +window.onload = function(){ + + //generating the first data correlation circle + init1(".visTitle", "#my_dataviz", data1); + // init2() + +}; + + +function init1(className, idName, data){ + //create the div area that contains the tooltip to be shown on hover + let div1 = d3.select(className).append("div") + .attr("class", "tooltip") + .attr("style", "position: absolute;") + .style("opacity", 0); + + // create the svg area + let svg1 = d3.select(idName) + .append("svg") + .attr("width", window.innerWidth/2) + .append("g") + .attr("transform", "translate(400,300)") + + // 6 groups, so create a vector of 6 colors + generateCircleGraph(svg1, div1, data, colors, innerRadius); +} + + diff --git a/public/js/d3ActionOnMouseEvent.js b/public/js/d3ActionOnMouseEvent.js new file mode 100644 index 000000000..24784ff22 --- /dev/null +++ b/public/js/d3ActionOnMouseEvent.js @@ -0,0 +1,40 @@ + +export function actionOnMouseOver(pathOpacity, toolTipOpacity, svg, div, data){ + return function(g){ + div.transition() + .duration(200) + .style("opacity", 
toolTipOpacity); + div.style("bottom", 20 + "px"); + div.style("left", 20 + "px"); + div.style("font-size", "1.15em") + .html(function(){ + return "Character1: "+ data.nameByIndex[g.source.index].toString() + "
" + + "Character2: " + data.nameByIndex[g.target.index].toString() + "
" + + "Number of times: " + data.matrix[g.source.index][g.target.index]}); + + stylePath(svg, data, g, pathOpacity); + } + +} + +export function stylePath(svg, data, g, pathOpacity){ + svg.selectAll("path") + .filter(function(d) { + console.log("source", data.nameByIndex[g.source.index], "target",data.nameByIndex[g.target.index], "data", data.matrix[g.source.index][g.target.index]); + return (d.source.index !== g.source.index || d.target.index !== g.target.index); + }) + .transition() + .style("opacity", pathOpacity); +} + +// Returns an event handler for fading a given chord group. +export function actionOnMouseOut(pathOpacity, toolTipOpacity, svg, div, data) { + return function (g) { + div.transition() + .duration(500) + .style("opacity", toolTipOpacity); + + stylePath(svg, data, g, pathOpacity); + }; +} + diff --git a/public/js/fractalScript.js b/public/js/fractalScript.js new file mode 100644 index 000000000..7a2e56572 --- /dev/null +++ b/public/js/fractalScript.js @@ -0,0 +1,167 @@ +// 'use Strict' +let c, ctx, canvCenterX, canvCenterY, theRadius; +let isPaused = true; +let willReset = false; //to use when controls get updated and we want to kill the remaining drawings waiting to be run + + +let startColor = "rgb(27,213,222)"; + +window.onload = function () { + "use strict"; + document.getElementById("playPause").addEventListener("onclick", playPause); + + c = document.getElementById("fractalHolder"); + ctx = c.getContext("2d"); + c.width = window.innerWidth; + c.height = window.innerHeight; + canvCenterX = c.width/2; + canvCenterY = c.height/2; + theRadius = 700; + init(theRadius, 50, startColor, 10); +}; + + +function init(radius, minRadius, aStartColor, blurRate){ + "use strict"; + + startDrawingCircles(radius, minRadius, aStartColor, blurRate); + + let controls = new function () { + this.radius = theRadius; + this.minRadius = 20; + this.blurRate = 10; + this.aStartColor = startColor; + + //to update the parameters for characteristics + this.redraw = 
function () { + let newStartCol = hexToRgb(controls.aStartColor); + + // remove the old plane if the radius or blurriness attribute was changed, otherwise keep going + if (!isCanvasBlank(c) && newStartCol===startColor) { + ctx.clearRect(0, 0, c.width, c.height); + willReset = true; + ctx.beginPath(); + willReset = false; + } + startColor = hexToRgb(controls.aStartColor); + startDrawingCircles(controls.radius, controls.minRadius, hexToRgb(controls.aStartColor), controls.blurRate); + }; + }(); + + //generate the gui controls box; every changed attribute calls the redraw function + let gui = new dat.GUI(); + + gui.add(controls, 'radius', 100, 800).step(1).onChange(controls.redraw); + gui.add(controls, 'minRadius', 5, 99).step(1).onChange(controls.redraw); + gui.add(controls, 'blurRate', 0, 20).step(1).onChange(controls.redraw); + gui.addColor(controls, 'aStartColor').onChange(controls.redraw); + + gui.close(); + controls.redraw(); + + //initial function that will draw circles on the corners and the middle + function startDrawingCircles(radius, minRadius, aStartColor, blurRate){ + + drawCircleRecursive(0, 0, radius, minRadius, aStartColor, blurRate); + drawCircleRecursive(window.innerWidth, 0, radius, minRadius, aStartColor, blurRate); + + drawCircleRecursive(canvCenterX, canvCenterY, radius, minRadius, aStartColor, blurRate); + + drawCircleRecursive(0, window.innerHeight, radius, minRadius, aStartColor, blurRate); + drawCircleRecursive(window.innerWidth, window.innerHeight, radius, minRadius, aStartColor, blurRate); + } +} + + + + +//drawing circle recursively +function drawCircleRecursive(centerX, centerY, radius, minRadius, color, blurRate){ + let circle; + if (isPaused === false) { + console.log("NOT PAUSED"); + if (color === startColor && willReset === false) { + ctx.beginPath(); + circle = ctx.arc(centerX, centerY, radius, 0, 2 * Math.PI); + ctx.strokeStyle = color; + ctx.shadowBlur = blurRate; + ctx.shadowColor = color; + ctx.stroke(); + +
//checking that radius is still bigger than the min radius and that the color is the same as the global color + //if the color isnt the same that means it's an old process, the color has been updated so the process needs to be terminated + //also checking if the pause parameter is on or off + if (radius > minRadius) { + setTimeout(function () { + drawCircleRecursive(centerX + radius / 2, centerY, radius / 2, minRadius, color, blurRate); + }, 0); + setTimeout(function () { + drawCircleRecursive(centerX - radius / 2, centerY, radius / 2, minRadius, color, blurRate); + }, 0); + setTimeout(function () { + drawCircleRecursive(centerX, centerY + radius / 2, radius / 2, minRadius, color, blurRate); + }, 0); + setTimeout(function () { + drawCircleRecursive(centerX, centerY - radius / 2, radius / 2, minRadius, color, blurRate); + }, 0); + } + } + } else { + console.log("PAUSED"); + setTimeout(function () { + console.log(centerX, centerY, radius, minRadius, color, blurRate); + drawCircleRecursive(centerX, centerY, radius, minRadius, color, blurRate); + }, 50); + } +} + + + +//function to check if canvas is blank +function isCanvasBlank(c) { + const blank = document.createElement('canvas'); + + blank.width = c.width; + blank.height = c.height; + + return c.toDataURL() === blank.toDataURL(); +} + + +//function that converts hex values to rgb, for +function hexToRgb(hex) { + // Expand shorthand form (e.g. "03F") to full form (e.g. "0033FF") + let shorthandRegex = /^#?([a-f\d])([a-f\d])([a-f\d])$/i; + hex = hex.replace(shorthandRegex, function(m, r, g, b) { + return r + r + g + g + b + b; + }); + + let result = /^#?([a-f\d]{2})([a-f\d]{2})([a-f\d]{2})$/i.exec(hex); + return result ? 
"rgb(" + parseInt(result[1], 16) + ", " + parseInt(result[2], 16) + ", " + parseInt(result[3], 16) + ")" : hex; +} + +function playPause() { + let btnContainer = document.getElementById("tempText"); + let btnPlayPause = document.getElementById("playPause"); + // let currentChild = document.children[0]; + // console.log("current child", currentChild); + let curInnerHTML = btnPlayPause.innerHTML; + if (curInnerHTML.includes("fa-play")) { + //want to play + isPaused = false; + //turn btn to pause + curInnerHTML = ""; + btnPlayPause.innerHTML = ''; + btnContainer.innerText = "" + } else { + //want to pause + isPaused = true; + curInnerHTML = ""; + btnPlayPause.innerHTML = ''; + btnContainer.innerText += 'Click here to continue!' + + + } + +} \ No newline at end of file diff --git a/public/js/generateCircleGraphModule.js b/public/js/generateCircleGraphModule.js new file mode 100644 index 000000000..0200ea337 --- /dev/null +++ b/public/js/generateCircleGraphModule.js @@ -0,0 +1,85 @@ +import {actionOnMouseOver, actionOnMouseOut} from "./d3ActionOnMouseEvent.js"; + +//generates the graph +export function generateCircleGraph(svg, div, data, colors, innerRadius){ + // give this matrix to d3.chord() + let res = d3.chord() + .padAngle(0.05) + .sortGroups(d3.descending) + .sortSubgroups(d3.descending)(data.matrix); + + +// Add the links between groups + svg + .datum(res) + .append("g") + .selectAll("path") + .data(function(d) { return d; }) + .enter() + .append("path") + .attr("d", d3.ribbon() + .radius(innerRadius) + ) + .style("fill", function(d){ return(colors[d.source.index]) }) // colors depend on the source group. Change to target otherwise. 
+ .attr("fill-opacity", 0.67) //opacity of ribbons + .style("stroke", "black") + .on("mouseover", actionOnMouseOver(.3, 0.9, svg, div, data)) + .on("mouseout", actionOnMouseOut(1, 0, svg, div, data)); + + + + // this group object uses each group of the data.groups object + let group = svg + .datum(res) + .append("g") + .selectAll("g") + .data(function(d) { return d.groups; }) + .enter(); + + +// Add the ticks + group + .selectAll(".group-tick") + .data(function(d) { return groupTicks(d, 1); }) // Controls the number of ticks: one tick each 25 here. + .enter() + .append("g") + .attr("transform", function(d) { return "rotate(" + (d.angle * 180 / Math.PI - 90) + ") translate(" + innerRadius + ",0)"; }) + .append("line") // By default, x1 = y1 = y2 = 0, so no need to specify it. + .attr("x2", 6) + .attr("stroke", "black"); + + +// Add the labels to ticks: + group + .selectAll(".group-tick-label") + .data(function(d) { return groupTicks(d, 1); }) + .enter() + // .filter(function(d) { return d.value % 25 === 0; }) + .append("g") + .attr("transform", function(d) { return "rotate(" + (d.angle * 180 / Math.PI - 90) + ") translate(" + innerRadius + ",0)"; }) + .append("text") + .each(d => { d.angle = (d.startAngle + d.endAngle) / 2; }) + .attr("x", 15) + .attr("y", 10) + .attr("transform", function(d) { return d.angle > Math.PI ? "rotate(180) translate(-16)" : null; }) + .style("text-anchor", function(d) { return d.angle > Math.PI ? "end" : null; }) + .text(function(d) { return d.value }) + .style("font-size", 12); + + //adding text to show which color represents which attribute + group.append("text") + .each(d => { d.angle = (d.startAngle + d.endAngle) / 2; }) + .attr("x", 15) + .attr("y", 10) + .attr("transform", function(d) { return "rotate(" + (d.angle * 180 / Math.PI - 90) + ") translate(" + (innerRadius+10)+ ",0)"; }) + .text(d => data.nameByIndex[d.index]) +} + + +// Returns an array of tick angles and values for a given group and step. 
+function groupTicks(d, step) { + let k = (d.endAngle - d.startAngle) / d.value; + return d3.range(0, d.value, step).map(function(value) { + return {value: value, angle: value * k + d.startAngle}; + }); +} diff --git a/public/js/homePage.js b/public/js/homePage.js new file mode 100644 index 000000000..c055b1171 --- /dev/null +++ b/public/js/homePage.js @@ -0,0 +1,81 @@ + // console.clear() + // + // let renderer, scene, camera, geometry, material, cube; + // + // //get our
container + // + // // Helper let which we will use as a additional correction coefficient for objects and camera + // let distance = 400; + // window.onload = function(){ + // init() + // } + // + // function init() { + // let container = document.getElementById('titleCanvas'); + // //init render + // renderer = new THREE.WebGLRenderer({antialias: true}); + // //render window size + // renderer.setSize(window.innerWidth, window.innerHeight); + // //background color + // renderer.setClearColor (0xff0000, 0.2); + // //append render to the
container + // container.appendChild(renderer.domElement); + // + // + // //init scene, camera and camera position + // scene = new THREE.Scene(); + // camera = new THREE.PerspectiveCamera(40, window.innerWidth / window.innerHeight, 0.1, 1000); + // //adding camera to the scene + // scene.add(camera); + // + // //creating ojects to add to scene + // geometry = new THREE.BoxGeometry( 1, 1, 1 ); + // material = new THREE.MeshBasicMaterial( { color: 0xff0000 } ); + // //generating a mesh cube with the given material + // cube = new THREE.Mesh( geometry, material ); + // scene.add( cube ); + // camera.position.z = 10; + // animate() + // + // + // + // } + // + // let animate = function () { + // console.log("animating") + // requestAnimationFrame( animate ); + // + // cube.rotation.x += 0.01; + // cube.rotation.y += 0.01; + // + // renderer.render( scene, camera ); + // }; + // + // + // + // + // + // // window.onload = function () { + // // startThreeJs() + // // } + // + // // function startThreeJs(){ + // // //creating basic scene and adding camera to it + // // let scene = new THREE.Scene(); + // // let camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.1, 1000 ); + // // let renderer = new THREE.WebGLRenderer(); + // // renderer.setSize( window.innerWidth, window.innerHeight ); + // // document.body.appendChild( renderer.domElement ); + // // //adding cube + // // let geometry = new THREE.BoxGeometry( 1, 1, 1 ); + // // let material = new THREE.MeshBasicMaterial( { color: 0x00ff00 } ); + // // let cube = new THREE.Mesh( geometry, material ); + // // scene.add( cube ); + // // camera.position.z = 5; + // // animate(scene, camera, renderer) + // // } + // // + // // function animate(scene, camera, renderer) { + // // requestAnimationFrame(animate); + // // renderer.render(scene, camera) + // // } \ No newline at end of file diff --git a/public/js/musicVizScript.js b/public/js/musicVizScript.js new file mode 100644 index 
000000000..3c040fddd --- /dev/null +++ b/public/js/musicVizScript.js @@ -0,0 +1,151 @@ +window.onload = function () { + vizInit() +} + +let scene, camera, renderer, analyser, uniforms; +let file, fileLabel, mediaElement + +//will get the submitted audio file, label it, +//after user submits the file +let vizInit = function (){ + file = document.getElementById("thefile"); + fileLabel = document.querySelector("label.file"); + mediaElement = document.getElementById("audio"); + + + //if the user changes the file + file.onchange = function(){ + fileLabel.classList.add('normal'); + mediaElement.classList.add('active'); + let files = this.files; + + mediaElement.src = URL.createObjectURL(files[0]); + mediaElement.load(); + mediaElement.play(); + //call the function that generates the graphics + play(); + } +} + + +//what happens when the song starts playing +function play() { + + let fftSize = 512; //size of fourier transform + + let container = document.getElementById( 'out' ); + renderer = new THREE.WebGLRenderer( { antialias: true } ); + renderer.setSize( window.innerWidth, window.innerHeight ); + renderer.setClearColor( 0xffffff ); //sets color of canvas /border not the inside where the movement is happening + renderer.setPixelRatio( window.devicePixelRatio ); + container.appendChild( renderer.domElement ); + + scene = new THREE.Scene(); + camera = new THREE.Camera(); + + let listener = new THREE.AudioListener(); + let audio = new THREE.Audio( listener ); + audio.setMediaElementSource( mediaElement ); + mediaElement.loop = false; + analyser = new THREE.AudioAnalyser( audio, fftSize ); + + //threejs frequency line + uniforms = { + tAudioData: { value: new THREE.DataTexture( analyser.data, fftSize / 2, 1, THREE.LuminanceFormat ) } + }; + + let freqLineMaterial = new THREE.ShaderMaterial( { + uniforms: uniforms, + vertexShader: document.getElementById( 'vertexShader' ).textContent, + fragmentShader: document.getElementById( 'fragmentShader' ).textContent + } ); + + 
freqLineMaterial.uniforms.diffuse = { type: "c", value: { r:255, g:124, b:54 } }; + + //creating the plane and the mesh + let freqLineGeometry = new THREE.PlaneBufferGeometry( 2,2 ); + let freqLineMesh = new THREE.Mesh( freqLineGeometry, freqLineMaterial ); + scene.add(freqLineMesh); + + + //starting the animation + window.addEventListener( 'resize', onResize, false ); + animate(); +} + +function animate() { + requestAnimationFrame( animate ); + render(); +} +function render() { + analyser.getFrequencyData(); + uniforms.tAudioData.value.needsUpdate = true; + renderer.render( scene, camera ); +} + +function onResize() { + renderer.setSize( window.innerWidth, window.innerHeight ); +} + +window.addEventListener( 'resize', onResize, false ); + + + + + + + +//CUBE CODE +// //creating cube for the background +// //1 unit for width, height, depth +// let cubeGeometry = new THREE.CubeGeometry(2,2,2); +// +// // each cube side gets another color +// let cubeMaterials = [ +// new THREE.MeshBasicMaterial({color:0x33AA55, transparent:true, opacity:0.8}), +// new THREE.MeshBasicMaterial({color:0x55CC00, transparent:true, opacity:0.8}), +// new THREE.MeshBasicMaterial({color:0x000000, transparent:true, opacity:0.8}), +// new THREE.MeshBasicMaterial({color:0x000000, transparent:true, opacity:0.8}), +// new THREE.MeshBasicMaterial({color:0x0000FF, transparent:true, opacity:0.8}), +// new THREE.MeshBasicMaterial({color:0x5555AA, transparent:true, opacity:0.8}), +// ]; +// // create a MeshFaceMaterial, allows cube to have different materials on each face +// let cubeMaterial = cubeMaterials; +// let cube = new THREE.Mesh(cubeGeometry, cubeMaterial); +// +// cube.position.set(0,0,0); +// group.add( cube ); +// group.add( freqLineMesh ); +// +// cube.scale.x = 2.5; // SCALE +// cube.scale.y = 2.5; // SCALE +// cube.scale.z = 2.5; // SCALE + + + +// +// +// +// function onDocumentMouseMove( event ) { +// mouseX = event.clientX - window.innerWidth/2; +// targetRotation = 
targetRotationOnMouseDown + ( mouseX - mouseXOnMouseDown ) * 0.02; +// } +// function onDocumentMouseUp() { +// document.removeEventListener( 'mousemove', onDocumentMouseMove, false ); +// document.removeEventListener( 'mouseup', onDocumentMouseUp, false ); +// document.removeEventListener( 'mouseout', onDocumentMouseOut, false ); +// } +// function onDocumentMouseOut() { +// document.removeEventListener( 'mousemove', onDocumentMouseMove, false ); +// document.removeEventListener( 'mouseup', onDocumentMouseUp, false ); +// document.removeEventListener( 'mouseout', onDocumentMouseOut, false ); +// } +// +// function onDocumentMouseDown( event ) { +// event.preventDefault(); +// document.addEventListener( 'mousemove', onDocumentMouseMove, false ); +// document.addEventListener( 'mouseup', onDocumentMouseUp, false ); +// document.addEventListener( 'mouseout', onDocumentMouseOut, false ); +// mouseXOnMouseDown = event.clientX - window.innerWidth/2; +// targetRotationOnMouseDown = targetRotation; +// } \ No newline at end of file diff --git a/public/js/newHomePage.js b/public/js/newHomePage.js new file mode 100644 index 000000000..7aa02eddd --- /dev/null +++ b/public/js/newHomePage.js @@ -0,0 +1,242 @@ +// setup the control gui +// once everything is loaded, we run our Three.js stuff. 
+let scene, camera, webGLRenderer; +let canDismiss = false; +let blocked = false; + +//what to run as soon as window loads +window.onload = function () { + init(); + typeIt(); +}; + +//listeners +window.addEventListener( 'resize', onWindowResize, false ); +document.addEventListener('mousemove', onMouseMove, false); +document.addEventListener("click", function() { + console.log("dismiss called with canDismiss and blocked val", canDismiss, blocked); + if (canDismiss === true) { + document.getElementById("typed").innerText = ""; + document.getElementById("typed").innerHTML = ""; + canDismiss = false + } +}); +document.getElementById("helpContainer").addEventListener("click", function(){ + if (canDismiss === false && blocked === false){ + typeIt() + } +}); + + + +//function that initializes everything +function init() { + // create a scene + scene = new THREE.Scene(); + // create a camera and assign vertical field of view, aspect ratio, and near and far planes + camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 0.1, 1000); + + // create a webgl render and set the size + webGLRenderer = new THREE.WebGLRenderer(); + webGLRenderer.setClearColor(new THREE.Color(0x000000, 1.0)); + webGLRenderer.setSize(window.innerWidth, window.innerHeight); + webGLRenderer.shadowMapEnabled = true; + + // position and point the camera to the center of the scene + camera.position.x = -30; + camera.position.y = 40; + camera.position.z = 70; + camera.lookAt(new THREE.Vector3(10, 0, 0)); + + // add the output of the renderer to the html element + document.getElementById("titleCanvas").append(webGLRenderer.domElement); + + let step = 0; //for keeping track of rotation + let knot; //the object being viewed + //initiate the characteristics of the created object + let controls = new function () { + this.radius = 30; + this.tube = 28.2; + this.radialSegments = 400; + this.tubularSegments = 30; + this.p = 6; + this.q = 5; + this.heightScale = 4; + this.asParticles = 
true; + this.rotate = true; + + //to update the parameters for characteristics + this.redraw = function () { + // remove the old plane + if (knot) scene.remove(knot); + // create a new one using torus knot geometry + let geom = new THREE.TorusKnotGeometry(controls.radius, controls.tube, Math.round(controls.radialSegments), Math.round(controls.tubularSegments), Math.round(controls.p), Math.round(controls.q), controls.heightScale); + if (controls.asParticles) { + knot = createParticleSystem(geom); + } else { + knot = createMesh(geom); + } + // add it to the scene. + scene.add(knot); + }; + }; + + + //generate controls box on the gui that for every changed attribute calls the redraw function + let gui = new dat.GUI(); + gui.add(controls, 'radius', 0, 40).onChange(controls.redraw); + gui.add(controls, 'tube', 0, 40).onChange(controls.redraw); + gui.add(controls, 'radialSegments', 0, 400).step(1).onChange(controls.redraw); + gui.add(controls, 'tubularSegments', 1, 20).step(1).onChange(controls.redraw); + gui.add(controls, 'p', 1, 10).step(1).onChange(controls.redraw); + gui.add(controls, 'q', 1, 15).step(1).onChange(controls.redraw); + gui.add(controls, 'heightScale', 0, 5).onChange(controls.redraw); + gui.add(controls, 'asParticles').onChange(controls.redraw); + gui.add(controls, 'rotate').onChange(controls.redraw); + gui.close(); + controls.redraw(); + render(); + + // from THREE.js examples + function generateSprite() { + let canvas = document.createElement('canvas'); + canvas.width = 16; + canvas.height = 16; + + let context = canvas.getContext('2d'); + let gradient = context.createRadialGradient(canvas.width / 2, canvas.height / 2, 0, canvas.width / 2, canvas.height / 2, canvas.width / 2); + gradient.addColorStop(0, 'rgba(255,255,255,1)'); + gradient.addColorStop(0.2, 'rgb(0,5,255)'); + gradient.addColorStop(0.4, 'rgba(0,0,64,1)'); + gradient.addColorStop(1, 'rgba(0,0,0,1)'); + + context.fillStyle = gradient; + context.fillRect(0, 0, canvas.width, canvas.height); + 
+ let texture = new THREE.Texture(canvas); + texture.needsUpdate = true; + return texture; + } + + //controls the interval of rotation of the object + setInterval(function(){ + if (controls.radius < 80 && controls.tubularSegments > 12) { + controls.radius += 0.5; + controls.tubularSegments -= 1 + } + // remove the old plane + if (knot) {scene.remove(knot);} + // create a new one + let geom = new THREE.TorusKnotGeometry(controls.radius, controls.tube, Math.round(controls.radialSegments), Math.round(controls.tubularSegments), Math.round(controls.p), Math.round(controls.q), controls.heightScale); + if (controls.asParticles) { + knot = createParticleSystem(geom); + } else { + knot = createMesh(geom); + } + scene.add(knot); + } + , 10); + + //creating point cloud system given the geometry and material + function createParticleSystem(geom) { + //creating material with attributes + let material = new THREE.PointCloudMaterial({ + color: 0xffffff, + size: 3, + transparent: true, + blending: THREE.AdditiveBlending, //necessary for realistic points + map: generateSprite() + }); + return new THREE.PointCloud(geom, material); + } + + //creating mesh and adding materials to it + function createMesh(geom) { + // assign two materials + let meshMaterial = new THREE.MeshNormalMaterial({}); + // creates object that is composed of a lot of other objects + return THREE.SceneUtils.createMultiMaterialObject(geom, [meshMaterial]); + } + + + function render() { + if (controls.rotate) { + knot.rotation.y = step += 0.003; //rotation speed + } + // render using requestAnimationFrame + requestAnimationFrame(render); + webGLRenderer.render(scene, camera); + } +} + + +function onWindowResize(){ + camera.aspect = window.innerWidth / window.innerHeight; + camera.updateProjectionMatrix(); + webGLRenderer.setSize( window.innerWidth, window.innerHeight ); +} + + +function onMouseMove(event) { + let mouseX = event.clientX - window.innerWidth / 2; + let mouseY = event.clientY - window.innerHeight / 2; 
+ camera.position.x += (mouseX - camera.position.x) * 0.005; + camera.position.y += (mouseY - camera.position.y) * 0.005; + //set up camera position + camera.lookAt(scene.position); +} + + + +//function to type help text +function typeIt(){ + blocked = true; + //remove previous instance of this in case user wants to see help again, so no two instances of it run at the same time + let helpContainer = document.getElementById("helpContainer"); + helpContainer.innerText = ""; + helpContainer.innerHTML = "null"; + helpContainer.innerHTML = ""; + helpContainer.addEventListener("click", function(){ + if(canDismiss===false && blocked === false){ typeIt() } + }); + new Typed('#typed', { + strings: ["Here, let me help you navigate this site.", + "If you scroll down, you will find some cards. ", + "Click on one of them to see a visualization of some sort.", + "Click the back button to go back.", + "Also, check out the controls on this background, they're pretty cool.", + "Have fun!", + "Click anywhere to dismiss me,", + "And click here again to read this again." 
+ ], + typeSpeed: 1, + onComplete: () => { typeCompleted() } + }) +} + +//change vars when typed finishes +function typeCompleted(){ + canDismiss = true; + blocked = false; +} + + + + +//in case I decide to bring back stats +// stats = initStats(); //stats box on the top left corner + +//intialize stats box on top left +// function initStats() { +// let stats = new Stats(); +// stats.setMode(0); +// +// // Align top-left +// stats.domElement.style.position = 'absolute'; +// stats.domElement.style.left = '0px'; +// stats.domElement.style.top = '0px'; +// +// //append to html doc +// document.getElementById("statsOutput").append(stats.domElement); +// return stats; +// } diff --git a/public/js/prepareAudioForAnalysis.js b/public/js/prepareAudioForAnalysis.js new file mode 100644 index 000000000..3374578e9 --- /dev/null +++ b/public/js/prepareAudioForAnalysis.js @@ -0,0 +1,20 @@ +export function prepAudio(mediaElement){ + console.log("it's being called!"); + let audioContext = new (window.AudioContext || window.webkitAudioContext)(); + let analyser = audioContext.createAnalyser(); + let gainNode = audioContext.createGain(); + let song = mediaElement; + let songSource = audioContext.createMediaElementSource(song); + + songSource.connect(audioContext.destination); + songSource.connect(gainNode); + songSource.connect(analyser); + + return { + audioContext: audioContext, + analyser: analyser, + gainNode: gainNode, + song: song, + songSource: songSource + } +} diff --git a/public/js/singleLineVizScript.js b/public/js/singleLineVizScript.js new file mode 100644 index 000000000..828cbe556 --- /dev/null +++ b/public/js/singleLineVizScript.js @@ -0,0 +1,86 @@ +import {prepAudio} from "./prepareAudioForAnalysis.js" + +window.onload = function () { + vizInit() +}; + + +let file, fileLabel, mediaElement; + +//will get the submitted audio file, label it, +// and assign it to a table for future use +//after user submits the file +let vizInit = function () { + + file = 
document.getElementById("thefile");
+  fileLabel = document.querySelector("label.file");
+  mediaElement = document.getElementById("audio");
+
+  // if the user changes the file
+  file.onchange = function () {
+    fileLabel.classList.add('normal');
+    mediaElement.classList.add('active');
+    let files = this.files;
+
+    mediaElement.src = URL.createObjectURL(files[0]);
+    mediaElement.load();
+    mediaElement.play();
+
+    // call the function that generates the graphics
+    spectrogram()
+  }
+};
+
+function spectrogram() {
+  let analyser;
+
+  let audioElements = prepAudio(mediaElement);
+  analyser = audioElements.analyser;
+
+  // generate the waveform from the audio data extracted from the song by the analyser
+  const waveform = new Float32Array(analyser.frequencyBinCount);
+  analyser.getFloatTimeDomainData(waveform);
+
+  // updates the waveform from the current audio data in every frame
+  // to keep up with the song
+  function updateWaveform() {
+    requestAnimationFrame(updateWaveform);
+    analyser.getFloatTimeDomainData(waveform)
+  }
+
+  updateWaveform();
+
+  // draw the waveform on a canvas
+  const canvas = document.getElementById('oscillatingElement');
+  canvas.width = window.innerWidth;
+  canvas.height = window.innerHeight;
+  const context = canvas.getContext('2d');
+
+  // updates the image shown so that it matches the new waveform in every frame
+  function drawOscilloscope() {
+    requestAnimationFrame(drawOscilloscope);
+    context.clearRect(0, 0, canvas.width, canvas.height); // clear the canvas for the updated waveform line
+    context.beginPath();
+    for (let i = 0; i < waveform.length; i++) {
+      const x = (window.innerWidth/2) - (waveform.length/2) + i; // x position - centering the waveform
+      const y = (1 + waveform[i] / 3) * canvas.height/1.8; // y position
+      context.strokeStyle = getGreenToRed(Math.abs(waveform[i])*1500);
+      context.lineWidth = 3;
+      context.arc(x, y + Math.abs(waveform[i]*10), Math.abs(waveform[i]*30), Math.PI*Math.abs(waveform[i]*1000), Math.PI * 2, true);
+    }
+    context.stroke()
+  }
drawOscilloscope()
+}
+
+// function that converts a percentage to an rgb value between green and red
+// taken from https://stackoverflow.com/questions/7128675/from-green-to-red-color-depend-on-percentage
+function getGreenToRed(percent){
+  let r = percent<50 ? 255 : Math.floor(255-(percent*2-100)*255/100);
+  let g = percent>50 ? 255 : Math.floor((percent*2)*255/100);
+  return 'rgb('+r+','+g+',0)';
+}
+
diff --git a/public/js/webGLShaders.js b/public/js/webGLShaders.js
new file mode 100644
index 000000000..c0ad364f7
--- /dev/null
+++ b/public/js/webGLShaders.js
@@ -0,0 +1,148 @@
+import {prepAudio} from "./prepareAudioForAnalysis.js"
+let file, fileLabel, mediaElement, analyser, audioContext;
+
+window.onload = function () {
+  vizInit()
+};
+
+// will get the submitted audio file, label it,
+// and assign it to a variable for future use
+// after the user submits the file
+let vizInit = function (){
+  file = document.getElementById("thefile");
+  fileLabel = document.querySelector("label.file");
+  mediaElement = document.getElementById("audio");
+
+  // if the user changes the file (uploads or reuploads)
+  file.onchange = function(){
+    fileLabel.classList.add('normal');
+    mediaElement.classList.add('active');
+    let files = this.files;
+
+    mediaElement.src = URL.createObjectURL(files[0]);
+    mediaElement.load();
+    mediaElement.play();
+
+    // call the function that generates the graphics
+    init();
+  }
+};
+
+// function to initialize the drawing of the elements
+function init(){
+  // creates the audio context and analyser,
+  // connects the analyser with the gain handler,
+  // and connects the gain handler to the song
+  let audioElements = prepAudio(mediaElement);
+  audioContext = audioElements.audioContext;
+  analyser = audioElements.analyser;
+
+  // get the fft spectrum of the audio at every update (every frame);
+  // a Uint8Array gives us the exact range needed to perform Canvas pixel manipulation
+  const spectrum = new Uint8Array(analyser.frequencyBinCount)
+  ;(function updateSpectrum() {
+    requestAnimationFrame(updateSpectrum);
+    analyser.getByteFrequencyData(spectrum)
+  })();
+
+  // initialize canvas, compile shader
+  const fragCanvas = document.getElementById('oscillatingElement');
+  fragCanvas.width = window.innerWidth;
+  fragCanvas.height = window.innerHeight;
+  const gl = fragCanvas.getContext('webgl') || fragCanvas.getContext('experimental-webgl');
+  const vertexShaderSrc = document.getElementById('vertexShader').textContent;
+  const fragmentShaderSrc = document.getElementById('fragmentShader').textContent;
+  const fragShader = createShader(gl, vertexShaderSrc, fragmentShaderSrc);
+
+  // initialize shader variables
+  const fragPosition = gl.getAttribLocation(fragShader, 'position');
+  gl.enableVertexAttribArray(fragPosition);
+  const fragTime = gl.getUniformLocation(fragShader, 'time');
+  gl.uniform1f(fragTime, audioContext.currentTime);
+  const fragResolution = gl.getUniformLocation(fragShader, 'resolution');
+  gl.uniform2f(fragResolution, fragCanvas.width, fragCanvas.height);
+  const fragSpectrumArray = new Uint8Array(4 * spectrum.length);
+  const fragSpectrum = createTexture(gl);
+
+  function createTexture(gl) {
+    const texture = gl.createTexture();
+    gl.bindTexture(gl.TEXTURE_2D, texture);
+    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
+    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
+    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
+    return texture
+  }
+
+  // initialize the fullscreen rectangle and start the render loop;
+  // update the time and spectrum variables in every frame
+  initQuad(gl);
+
+  function renderFragment() {
+    requestAnimationFrame(renderFragment);
+    gl.uniform1f(fragTime, audioContext.currentTime);
+    copyAudioDataToTexture(gl, spectrum, fragSpectrumArray);
+    renderQuad(gl)
+  }
+  renderFragment()
+}
+
+// will generate a fullscreen rectangle (quad); we will draw the fragment shader on top of this
+// initializing elements of the rectangle
+function initQuad(gl) {
+  const vbo = gl.createBuffer();
+  gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
+  const vertices = new Float32Array([-1, -1, 1, -1, -1, 1, 1, 1]); // corner coordinates of the quad
+  gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
+  gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 0, 0)
+}
+
+// rendering the rectangle as two back-to-back triangles
+function renderQuad(gl) {
+  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4)
+}
+
+// creating shaders:
+// will take a vertex shader and a fragment shader, and return the compiled shader program
+function createShader(gl, vertexShaderSrc, fragmentShaderSrc) {
+  const vertexShader = gl.createShader(gl.VERTEX_SHADER);
+  gl.shaderSource(vertexShader, vertexShaderSrc);
+  gl.compileShader(vertexShader);
+  if (!gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS)) {
+    throw new Error(gl.getShaderInfoLog(vertexShader))
+  }
+
+  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
+  gl.shaderSource(fragmentShader, fragmentShaderSrc);
+  gl.compileShader(fragmentShader);
+  if (!gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS)) {
+    throw new Error(gl.getShaderInfoLog(fragmentShader))
+  }
+
+  const shader = gl.createProgram();
+  gl.attachShader(shader, vertexShader);
+  gl.attachShader(shader, fragmentShader);
+  gl.linkProgram(shader);
+  gl.useProgram(shader);
+
+  return shader
+}
+
+function copyAudioDataToTexture(gl, audioData, textureArray) {
+  for (let i = 0; i < audioData.length; i++) {
+    // four bytes per RGBA texel, so the stride must be 4 (a stride of 2
+    // would make successive samples overwrite each other's B and A bytes)
+    textureArray[4 * i]     = audioData[i]; // R
+    textureArray[4 * i + 1] = audioData[i]; // G
+    textureArray[4 * i + 2] = audioData[i]; // B
+    textureArray[4 * i + 3] = 255 // A
+  }
+  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, audioData.length, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, textureArray);
+}
\ No newline at end of file
diff --git a/public/particle.png b/public/particle.png
new file mode 100644
index 000000000..ee3f7e902
Binary files /dev/null
and b/public/particle.png differ
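A note on the `groupTicks` helper in the D3 chord-diagram code above: its tick math can be sanity-checked outside the browser. The sketch below swaps `d3.range` for a plain loop (equivalent for a positive step), so it runs without D3; the sample group object is ours, for illustration only:

```javascript
// Plain-JS equivalent of groupTicks: for a chord-diagram group d,
// produce one tick per `step` units, each with its value and its angle
// along the group's arc.
function groupTicks(d, step) {
  const k = (d.endAngle - d.startAngle) / d.value; // radians per unit of value
  const ticks = [];
  for (let value = 0; value < d.value; value += step) {
    ticks.push({ value: value, angle: value * k + d.startAngle });
  }
  return ticks;
}

// A group spanning the first quarter circle with value 4 yields ticks
// at values 0, 1, 2, 3, evenly spaced in angle; the tick at value 2
// sits halfway along the arc, at Math.PI / 4.
const ticks = groupTicks({ startAngle: 0, endAngle: Math.PI / 2, value: 4 }, 1);
```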
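The spectrum texture that `copyAudioDataToTexture` in `webGLShaders.js` uploads is `gl.RGBA`, so each audio sample must occupy four consecutive bytes. The packing can be checked standalone, with no WebGL context; the helper name `packSpectrumRGBA` here is ours, for illustration:

```javascript
// Standalone sketch of the RGBA packing used by copyAudioDataToTexture:
// each frequency sample becomes one grayscale texel (R = G = B = sample,
// A = 255), so the texture array needs a stride of 4 bytes per sample.
function packSpectrumRGBA(audioData, textureArray) {
  for (let i = 0; i < audioData.length; i++) {
    textureArray[4 * i]     = audioData[i]; // R
    textureArray[4 * i + 1] = audioData[i]; // G
    textureArray[4 * i + 2] = audioData[i]; // B
    textureArray[4 * i + 3] = 255;          // A
  }
  return textureArray;
}

const spectrum = new Uint8Array([10, 20, 30]);
const texels = packSpectrumRGBA(spectrum, new Uint8Array(4 * spectrum.length));
// texels: [10,10,10,255, 20,20,20,255, 30,30,30,255]
```

With a stride of 2 instead of 4, iteration i+1 would start writing at byte 2i+2 and clobber the B and A bytes of iteration i, which is why the texture array allocation (`4 * spectrum.length`) and the loop stride must agree.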
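The `getGreenToRed` helper copied into `singleLineVizScript.js` is also easy to verify in isolation. Note that, despite its name, it maps 0% to red, 50% to yellow, and 100% to green:

```javascript
// Percentage-to-color helper from singleLineVizScript.js: maps a 0-100
// percentage to an 'rgb(r,g,0)' string by ramping g up over the first
// half of the range and r down over the second half.
function getGreenToRed(percent) {
  let r = percent < 50 ? 255 : Math.floor(255 - (percent * 2 - 100) * 255 / 100);
  let g = percent > 50 ? 255 : Math.floor((percent * 2) * 255 / 100);
  return 'rgb(' + r + ',' + g + ',0)';
}
```

In the oscilloscope, the argument is `Math.abs(waveform[i]) * 1500`, which can exceed 100 for loud samples; values past 100 drive `r` below 0, so the result is only a well-formed color for inputs in the 0-100 range.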