How To Create Audio Spectrum Visualizer Using HTML CSS & JavaScript

In today’s digital age, multimedia experiences play a crucial role in engaging users. Audio spectrum visualizers are a popular way to enhance the audio listening experience by providing a dynamic and visually appealing representation of the audio being played. In this tutorial, we will explore how to create an audio spectrum visualizer using HTML, CSS, and JavaScript.

HTML: The HTML structure for our audio spectrum visualizer is straightforward. We will use a <div> element to contain the visualizer, an <input> element of type “file” to let the user select an audio file, a <canvas> element to draw the visualizer, and an <audio> element to play the selected file.

CSS: For the CSS, we will style the visualizer container, set up the canvas dimensions and appearance, and style the audio player controls. We will also add some basic styling to make the visualizer visually appealing.

JavaScript: The JavaScript code handles the main functionality of the visualizer. We will create an AudioContext to decode the selected file into raw audio samples, downsample that data into a fixed number of points, and use a CanvasRenderingContext2D to draw the resulting waveform on the canvas. We will also listen to the audio element’s timeupdate event so the played portion of the waveform is repainted in real time as the audio plays.
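
The downsampling step mentioned above can be sketched in isolation before we look at the full code. This is a standalone version of the same logic used later in the normalizedData method; the function name downsample is mine, chosen for illustration:

```javascript
// Reduce a long array of raw PCM samples to `samples` evenly spaced
// points by taking the first value of each block. This mirrors the
// tutorial's normalizedData method.
function downsample(rawData, samples) {
  const blockSize = Math.floor(rawData.length / samples);
  const filtered = [];
  for (let i = 0; i < samples; i++) {
    filtered.push(rawData[i * blockSize]);
  }
  return filtered;
}
```

A common refinement is to average the absolute values within each block instead of picking a single sample, which gives a smoother waveform; the tutorial keeps the simpler pick-one approach.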

HTML File:

<body>
<div class="wrapper">
<input id="fileinput" type="file" accept="audio/mp3,video/mp4"/>
<canvas id="canvas"></canvas>
<audio id="audio" src="" controls></audio>
</div>
</body>

Explanation:

  1. <div class="wrapper">: This is a div element with a class name “wrapper” that acts as a container for the input, canvas, and audio elements. It helps in organizing the layout and styling of these elements.
  2. <input id="fileinput" type="file" accept="audio/mp3,video/mp4"/>: This is an input element of type “file” that lets users select a file from their device. The accept attribute restricts the file picker to the listed MIME types — here audio/mp3 and video/mp4. Note that the registered MIME type for MP3 is actually audio/mpeg, so a broader pattern such as accept="audio/*" is often more reliable. The id attribute uniquely identifies the input element.
  3. <canvas id="canvas"></canvas>: This is a canvas element used for rendering graphics, animations, or other visualizations using JavaScript. It can be used to create visual effects synchronized with the audio playback.
  4. <audio id="audio" ...>: This is the audio element used to embed the player in the page. The id attribute uniquely identifies it, and the src attribute holds the URL of the audio to play (it starts empty and is set from JavaScript once a file is chosen). controls is a boolean attribute, so controls and controls="true" behave identically; it adds playback controls (play, pause, volume, etc.) to the player.
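
As a defensive touch (not part of the original markup, which relies on the accept attribute alone), the selected file’s type could also be checked in JavaScript before decoding. A minimal sketch with a hypothetical matchesAccept helper:

```javascript
// Check a file's MIME type against a comma-separated accept list,
// mirroring the <input accept="audio/mp3,video/mp4"> attribute.
// Browsers usually report MP3 files as "audio/mpeg", so that
// spelling is treated as equivalent to "audio/mp3" here.
function matchesAccept(mimeType, acceptList) {
  const accepted = acceptList.split(",").map(t => t.trim());
  if (mimeType === "audio/mpeg" && accepted.includes("audio/mp3")) {
    return true;
  }
  return accepted.includes(mimeType);
}
```

In the change handler later in the tutorial, such a guard would run against this.files[0].type before constructing the visualizer.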

CSS File

html {
height: 100%;
}

body {
height: 100%;
margin: 0;
background-color: #edf2f7;
}

.sh-card {
position: fixed;
bottom: 10px;
right: 10px;
background: #edf2f7;
padding: 7px 10px;
border-radius: 50px;
border: none;
color: #000;
box-shadow: 2px 2px 20px -10px #000;
transition: all 0.3s ease;
z-index: 99999;
font-family: system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";
line-height: 1.5;
overflow: hidden;
width: 18px;
white-space: nowrap;
}
.sh-card:hover {
width: 92px;
}

.sh-card-icon,
.sh-card-info {
display: inline-block;
vertical-align: baseline;
line-height: 1;
}

.sh-card-info {
font-size: 16px;
margin-left: 7px;
}

.sh-card-icon {
width: 18px;
height: 15px;
}

.sh-card-link,
.sh-card-link:hover,
.sh-card-link:active,
.sh-card-link:visited {
color: #000;
text-decoration: none;
line-height: 1;
}

.sh-card-box {
display: block;
}

@-webkit-keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
@keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
.sh-bounceIn {
-webkit-animation-duration: 0.75s;
animation-duration: 0.75s;
-webkit-animation-name: bounceIn;
animation-name: bounceIn;
animation-fill-mode: forwards;
-webkit-animation-fill-mode: forwards;
animation-delay: 1s;
-webkit-animation-delay: 1s;
opacity: 0;
}

canvas {
width: 100%;
height: 130px;
background: #f9f9f9;
margin: 2rem auto;
box-shadow: inset 0px 0px 25px -15px #000;
border-radius: 5px;
overflow: hidden;
}

.wrapper {
padding: 2rem;
}

Explanation:

  1. html { height: 100%; }: Sets the height of the html element to 100% of the viewport height, ensuring that the webpage takes up the full height of the browser window.
  2. body { ... }: Styles the body element: its height is set to 100%, the default margin is removed, and the background color is set to a light gray (#edf2f7) for a clean look.
  3. .sh-card { ... }: Styles the floating card element. It is positioned fixed at the bottom right corner (position: fixed; bottom: 10px; right: 10px;). It has a light gray background (background: #edf2f7;), padding, border-radius for rounded corners, and a box-shadow for a 3D effect. The card width is initially set to 18px and expands to 92px on hover (:hover).
  4. .sh-card-icon, .sh-card-info { ... }: Styles the elements inside the card. .sh-card-icon is used for displaying an icon, while .sh-card-info is used for displaying text information. They are set to display: inline-block; to align them horizontally and have a consistent line height.
  5. .sh-card-info { font-size: 16px; ... }: Styles the text information inside the card, setting the font size to 16px and adding some left margin for spacing (margin-left: 7px;).
  6. .sh-card-link { ... }: Styles the links inside the card. Sets the color to black (#000) and removes the underline (text-decoration: none;).
  7. .sh-card-box { display: block; }: Styles the box inside the card. Sets it to display as a block element.
  8. @keyframes bounceIn { ... }: Defines a keyframe animation named bounceIn using CSS @keyframes. This animation scales and fades in an element to give it a bouncing effect. It is used to animate the card when it appears on the screen (animation-name: bounceIn;).
  9. .sh-bounceIn { ... }: Applies the bounceIn animation to an element. It sets the animation duration, fill mode, and delay to control the animation timing and appearance.
  10. canvas { ... }: Styles the canvas element used for the visualization. It sets the width to 100%, the height to 130px, a light gray background, an inset box shadow for a subtle recessed look, and a border radius for rounded corners.
  11. .wrapper { ... }: Styles the wrapper element that contains the content of the webpage. It adds padding around the content.
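
One detail worth calling out: the CSS above sets the canvas element’s layout size (100% × 130px), while the JavaScript later multiplies by devicePixelRatio to size the drawing buffer so the waveform stays sharp on high-DPI screens. The arithmetic can be sketched as a pure function (the name and parameters are mine, mirroring the draw method shown below):

```javascript
// Compute the canvas backing-store (buffer) size from its CSS layout
// size, the device pixel ratio, and the vertical padding the tutorial
// adds above and below the waveform.
function backingStoreSize(cssWidth, cssHeight, dpr, padding) {
  return {
    width: cssWidth * dpr,
    height: (cssHeight + padding * 2) * dpr,
  };
}
```

For example, on a 2× display a 300 × 130 canvas with 10px of padding gets a 600 × 300 drawing buffer.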

JavaScript File

"use strict";
window.AudioContext = window.AudioContext || window.webkitAudioContext;
class renderWave {
constructor(message) {
this._samples = 10000;
this._strokeStyle = "#3098ff";
this.audioContext = new AudioContext();
this.canvas = document.querySelector("canvas");
this.ctx = this.canvas.getContext("2d");
this.data = [];
message
.then(arrayBuffer => {
return this.audioContext.decodeAudioData(arrayBuffer);
})
.then(audioBuffer => {
this.draw(this.normalizedData(audioBuffer));
this.drawData(this.data);
});
}
normalizedData(audioBuffer) {
const rawData = audioBuffer.getChannelData(0); // We only need to work with one channel of data
const samples = this._samples; // Number of samples we want to have in our final data set
const blockSize = Math.floor(rawData.length / samples); // Number of samples in each subdivision
const filteredData = [];
for (let i = 0; i < samples; i++) {
filteredData.push(rawData[i * blockSize]);
}
return filteredData;
}
draw(normalizedData) {
// set up the canvas
const canvas = this.canvas;
const dpr = window.devicePixelRatio || 1;
const padding = 10;
canvas.width = canvas.offsetWidth * dpr;
canvas.height = (canvas.offsetHeight + padding * 2) * dpr;
this.ctx.scale(dpr, dpr);
this.ctx.translate(0, canvas.offsetHeight / 2 + padding); // set Y = 0 to be in the middle of the canvas
// draw the line segments
const width = canvas.offsetWidth / normalizedData.length;
for (let i = 0; i < normalizedData.length; i++) {
const x = width * i;
let height = normalizedData[i] * canvas.offsetHeight - padding;
if (height < 0) {
height = 0;
}
else if (height > canvas.offsetHeight / 2) {
height = canvas.offsetHeight / 2; // clamp to half the canvas height
}
this.data.push({
x: x,
h: height,
w: width,
isEven: (i + 1) % 2
});
}
return this.data;
}
drawLineSegment(ctx, x, height, width, isEven, colors = this._strokeStyle) {
ctx.lineWidth = 1; // how thick the line is
ctx.strokeStyle = colors; // what color our line is
ctx.beginPath();
height = isEven ? height : -height;
ctx.moveTo(x, 0);
ctx.lineTo(x + width, height);
ctx.stroke();
}
drawData(data, colors = this._strokeStyle) {
data.forEach(item => {
this.drawLineSegment(this.ctx, item.x, item.h, item.w, item.isEven, colors);
});
}
drawTimeline(percent) {
let end = Math.ceil(this._samples * percent);
let t = this.data.slice(0, end);
this.drawData(t, "#1d1e22");
}
}
document.getElementById("fileinput").addEventListener("change", function () {
var wave = new renderWave(this.files[0].arrayBuffer());
var audioPlayer = document.getElementById("audio");
audioPlayer.src = URL.createObjectURL(this.files[0]);
// audioPlayer.play();
audioPlayer.ontimeupdate = function () {
let percent = this.currentTime / this.duration;
wave.drawTimeline(percent);
};
});

Explanation:

  1. "use strict";: This directive enables strict mode, which turns several otherwise-silent JavaScript mistakes into thrown errors and disallows some error-prone syntax.
  2. window.AudioContext = window.AudioContext || window.webkitAudioContext;: This line checks if the AudioContext object is supported in the current browser. If it’s not, it tries to use the webkitAudioContext object (for older versions of WebKit browsers).
  3. class renderWave { ... }: This defines a JavaScript class called renderWave, which is used to render the audio waveform visualization.
  4. constructor(message) { ... }: This is the constructor method of the renderWave class. It initializes various properties and sets up the audio context and canvas for visualization.
  5. this._samples = 10000;: Sets the number of samples for the audio visualization.
  6. this.audioContext = new AudioContext();: Creates a new instance of the AudioContext object, which is used for audio processing.
  7. this.canvas = document.querySelector("canvas");: Selects the canvas element from the DOM for rendering the waveform visualization.
  8. this.ctx = this.canvas.getContext("2d");: Gets the 2D rendering context for the canvas, which is used for drawing the waveform.
  9. message.then(arrayBuffer => { ... });: The constructor receives a promise (the result of file.arrayBuffer()) that resolves to an ArrayBuffer. The buffer is decoded into an AudioBuffer via decodeAudioData, and the waveform is then computed and drawn.
  10. normalizedData(audioBuffer) { ... }: A method that downsamples the raw channel data to a fixed number of points for rendering. (Despite its name, it does not rescale the values — it simply picks one sample from each block.)
  11. draw(normalizedData) { ... }: Draws the waveform on the canvas based on the normalized audio data.
  12. drawLineSegment(ctx, x, height, width, isEven, colors = this._strokeStyle) { ... }: Draws a line segment on the canvas representing a part of the waveform.
  13. drawData(data, colors = this._strokeStyle) { ... }: Draws the entire waveform data on the canvas.
  14. drawTimeline(percent) { ... }: Repaints the already-played portion of the waveform in a darker color, indicating the current playback position.
  15. document.getElementById("fileinput").addEventListener("change", function () { ... });: Adds an event listener to the file input element (#fileinput) to handle changes (i.e., when a file is selected). When a file is selected, it creates a new renderWave instance and sets up the audio player.
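
The progress math behind drawTimeline (item 14) can be isolated and checked on its own. A sketch, using a hypothetical timelineEnd name for the index calculation:

```javascript
// Map audio playback progress to the number of waveform samples that
// should be repainted in the "played" color.
// samples: total number of waveform points (10000 in this tutorial).
function timelineEnd(currentTime, duration, samples) {
  if (!duration) return 0; // guard against division by zero
  const percent = currentTime / duration;
  // Math.ceil matches the rounding used in drawTimeline.
  return Math.min(samples, Math.ceil(samples * percent));
}
```

At 25% playback of a 10,000-point waveform this yields 2,500, so the first 2,500 line segments are redrawn in the darker color on each timeupdate event.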

Full Code (you can copy it and add your own features):

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Audio Spectrum Visualizer</title>
<style>
html {
height: 100%;
}

body {
height: 100%;
margin: 0;
background-color: #edf2f7;
}

.sh-card {
position: fixed;
bottom: 10px;
right: 10px;
background: #edf2f7;
padding: 7px 10px;
border-radius: 50px;
border: none;
color: #000;
box-shadow: 2px 2px 20px -10px #000;
transition: all 0.3s ease;
z-index: 99999;
font-family: system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";
line-height: 1.5;
overflow: hidden;
width: 18px;
white-space: nowrap;
}
.sh-card:hover {
width: 92px;
}

.sh-card-icon,
.sh-card-info {
display: inline-block;
vertical-align: baseline;
line-height: 1;
}

.sh-card-info {
font-size: 16px;
margin-left: 7px;
}

.sh-card-icon {
width: 18px;
height: 15px;
}

.sh-card-link,
.sh-card-link:hover,
.sh-card-link:active,
.sh-card-link:visited {
color: #000;
text-decoration: none;
line-height: 1;
}

.sh-card-box {
display: block;
}

@-webkit-keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
@keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
.sh-bounceIn {
-webkit-animation-duration: 0.75s;
animation-duration: 0.75s;
-webkit-animation-name: bounceIn;
animation-name: bounceIn;
animation-fill-mode: forwards;
-webkit-animation-fill-mode: forwards;
animation-delay: 1s;
-webkit-animation-delay: 1s;
opacity: 0;
}

canvas {
width: 100%;
height: 130px;
background: #f9f9f9;
margin: 2rem auto;
box-shadow: inset 0px 0px 25px -15px #000;
border-radius: 5px;
overflow: hidden;
}

.wrapper {
padding: 2rem;
}
</style>
</head>
<body>
<div class="wrapper">
<input id="fileinput" type="file" accept="audio/mp3,video/mp4"/>
<canvas id="canvas"></canvas>
<audio id="audio" src="" controls></audio>
</div>
<script>
"use strict";
window.AudioContext = window.AudioContext || window.webkitAudioContext;
class renderWave {
constructor(message) {
this._samples = 10000;
this._strokeStyle = "#3098ff";
this.audioContext = new AudioContext();
this.canvas = document.querySelector("canvas");
this.ctx = this.canvas.getContext("2d");
this.data = [];
message
.then(arrayBuffer => {
return this.audioContext.decodeAudioData(arrayBuffer);
})
.then(audioBuffer => {
this.draw(this.normalizedData(audioBuffer));
this.drawData(this.data);
});
}
normalizedData(audioBuffer) {
const rawData = audioBuffer.getChannelData(0); // We only need to work with one channel of data
const samples = this._samples; // Number of samples we want to have in our final data set
const blockSize = Math.floor(rawData.length / samples); // Number of samples in each subdivision
const filteredData = [];
for (let i = 0; i < samples; i++) {
filteredData.push(rawData[i * blockSize]);
}
return filteredData;
}
draw(normalizedData) {
// set up the canvas
const canvas = this.canvas;
const dpr = window.devicePixelRatio || 1;
const padding = 10;
canvas.width = canvas.offsetWidth * dpr;
canvas.height = (canvas.offsetHeight + padding * 2) * dpr;
this.ctx.scale(dpr, dpr);
this.ctx.translate(0, canvas.offsetHeight / 2 + padding); // set Y = 0 to be in the middle of the canvas
// draw the line segments
const width = canvas.offsetWidth / normalizedData.length;
for (let i = 0; i < normalizedData.length; i++) {
const x = width * i;
let height = normalizedData[i] * canvas.offsetHeight - padding;
if (height < 0) {
height = 0;
}
else if (height > canvas.offsetHeight / 2) {
height = canvas.offsetHeight / 2; // clamp to half the canvas height
}
this.data.push({
x: x,
h: height,
w: width,
isEven: (i + 1) % 2
});
}
return this.data;
}
drawLineSegment(ctx, x, height, width, isEven, colors = this._strokeStyle) {
ctx.lineWidth = 1; // how thick the line is
ctx.strokeStyle = colors; // what color our line is
ctx.beginPath();
height = isEven ? height : -height;
ctx.moveTo(x, 0);
ctx.lineTo(x + width, height);
ctx.stroke();
}
drawData(data, colors = this._strokeStyle) {
data.forEach(item => {
this.drawLineSegment(this.ctx, item.x, item.h, item.w, item.isEven, colors);
});
}
drawTimeline(percent) {
let end = Math.ceil(this._samples * percent);
let t = this.data.slice(0, end);
this.drawData(t, "#1d1e22");
}
}
document.getElementById("fileinput").addEventListener("change", function () {
var wave = new renderWave(this.files[0].arrayBuffer());
var audioPlayer = document.getElementById("audio");
audioPlayer.src = URL.createObjectURL(this.files[0]);
// audioPlayer.play();
audioPlayer.ontimeupdate = function () {
let percent = this.currentTime / this.duration;
wave.drawTimeline(percent);
};
});
</script>
</body>
</html>

Output: once a file is selected, the waveform is drawn on the canvas, and the played portion is repainted in a darker color as the audio progresses.


Creating an audio spectrum visualizer using HTML, CSS, and JavaScript can enhance the user experience of audio playback on your website. By following this tutorial, you can create a dynamic and visually engaging visualizer that adds a new dimension to your audio content. Experiment with different visual styles and effects to create a unique and immersive audio experience for your users.

Author

Written by Sona
