In today’s digital age, multimedia experiences play a crucial role in engaging users. Audio spectrum visualizers are a popular way to enhance the audio listening experience by providing a dynamic and visually appealing representation of the audio being played. In this tutorial, we will explore how to create an audio spectrum visualizer using HTML, CSS, and JavaScript.
HTML: The HTML structure for our audio spectrum visualizer is straightforward. We will use a <div> element to contain the visualizer, an <input> element of type "file" so the user can select an audio file, a <canvas> element to draw the visualizer, and an <audio> element to play the selected file.
CSS: For the CSS, we will style the visualizer container, set up the canvas dimensions and appearance, and style the audio player controls. We will also add some basic styling to make the visualizer visually appealing.
JavaScript: The JavaScript code handles the main functionality of the visualizer. We will create an AudioContext to decode the selected audio file, downsample the decoded PCM data to a manageable number of points, and use a CanvasRenderingContext2D to draw the waveform on the canvas. We will also listen for the audio element's timeupdate event so the played portion of the waveform is repainted in a darker colour as playback progresses.
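The numeric heart of the visualizer can be tried on its own before any Web Audio or canvas code exists. The sketch below (function and parameter names are illustrative, not from the final code) reduces raw PCM samples to a fixed number of points and scales each one to a bar height clamped to half the canvas height, mirroring the drawing logic used later:

```javascript
// Reduce raw PCM samples (values roughly in [-1, 1]) to `points` bar
// heights. One raw sample is kept per block, scaled by the canvas height,
// then clamped to the range [0, canvasHeight / 2].
function toBarHeights(rawData, points, canvasHeight, padding) {
  const blockSize = Math.floor(rawData.length / points);
  const heights = [];
  for (let i = 0; i < points; i++) {
    let h = rawData[i * blockSize] * canvasHeight - padding;
    if (h < 0) h = 0;                                     // negative samples are clipped
    else if (h > canvasHeight / 2) h = canvasHeight / 2;  // cap at the midline
    heights.push(h);
  }
  return heights;
}
```

For example, `toBarHeights([0.5, -0.2, 1.0, 0.1], 2, 100, 10)` keeps samples 0.5 and 1.0 and yields heights 40 and 50.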
HTML File:
<body>
<div class="wrapper">
<input id="fileinput" type="file" accept="audio/mp3,video/mp4"/>
<canvas id="canvas"></canvas>
<audio id="audio" src="" controls></audio>
</div>
</body>
Explanation:
<div class="wrapper">: a div element with the class "wrapper" that acts as a container for the input, canvas, and audio elements and helps organize their layout and styling.
<input id="fileinput" type="file" accept="audio/mp3,video/mp4"/>: an input of type "file" that lets the user select a file from their device. The accept attribute restricts the selectable file types; note that the standard MIME type for MP3 is audio/mpeg, although browsers generally accept audio/mp3 as well. The id attribute uniquely identifies the element.
<canvas id="canvas"></canvas>: a canvas element used to render graphics with JavaScript; here it draws the waveform synchronized with audio playback.
<audio id="audio"></audio>: an audio element that embeds the player. Its src attribute is filled in by script once a file is chosen, and the controls attribute adds the browser's playback controls (play, pause, volume, and so on). Since controls is a boolean attribute, its mere presence enables it.
CSS File
html {
height: 100%;
}
body {
height: 100%;
margin: 0;
background-color: #edf2f7;
}
.sh-card {
position: fixed;
bottom: 10px;
right: 10px;
background: #edf2f7;
padding: 7px 10px;
border-radius: 50px;
border: none;
color: #000;
box-shadow: 2px 2px 20px -10px #000;
transition: all 0.3s ease;
z-index: 99999;
font-family: system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";
line-height: 1.5;
overflow: hidden;
width: 18px;
white-space: nowrap;
}
.sh-card:hover {
width: 92px;
}
.sh-card-icon,
.sh-card-info {
display: inline-block;
vertical-align: baseline;
line-height: 1;
}
.sh-card-info {
font-size: 16px;
margin-left: 7px;
}
.sh-card-icon {
width: 18px;
height: 15px;
}
.sh-card-link,
.sh-card-link:hover,
.sh-card-link:active,
.sh-card-link:visited {
color: #000;
text-decoration: none;
line-height: 1;
}
.sh-card-box {
display: block;
}
@-webkit-keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
@keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
.sh-bounceIn {
-webkit-animation-duration: 0.75s;
animation-duration: 0.75s;
-webkit-animation-name: bounceIn;
animation-name: bounceIn;
animation-fill-mode: forwards;
-webkit-animation-fill-mode: forwards;
animation-delay: 1s;
-webkit-animation-delay: 1s;
opacity: 0;
}
canvas {
width: 100%;
height: 130px;
background: #f9f9f9;
margin: 2rem auto;
box-shadow: inset 0px 0px 25px -15px #000;
border-radius: 5px;
overflow: hidden;
}
.wrapper {
padding: 2rem;
}
Explanation:
html { height: 100%; }: sets the html element to the full viewport height so the page fills the browser window.
body { ... }: sets the body to full height, removes the default margin, and applies a light-gray background (#edf2f7) for a clean look.
.sh-card { ... } and the related rules (.sh-card-icon, .sh-card-info, .sh-card-link, .sh-card-box): style a small floating card fixed to the bottom-right corner. It starts 18px wide and expands to 92px on hover, with rounded corners and a soft box shadow. Note that no element in the markup above uses these classes, so these rules are inert unless you add such a card yourself.
@keyframes bounceIn { ... }: defines a keyframe animation that scales and fades an element in with a bouncing effect; the -webkit- prefixed copy covers older WebKit browsers.
.sh-bounceIn { ... }: applies the bounceIn animation with a 0.75s duration, a 1s delay, and animation-fill-mode: forwards so the element stays visible after the animation finishes.
canvas { ... }: sizes the canvas to 100% width and 130px height, and gives it a light background, an inset box shadow, and rounded corners. This sets only the CSS (display) size; the drawing code later sets the canvas's backing-store size in pixels.
.wrapper { ... }: adds 2rem of padding around the page content.
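The canvas rule above sets only the element's CSS size; the JavaScript later resizes the canvas's backing store for high-DPI screens. That calculation can be isolated into a small helper (a sketch with illustrative names, not part of the tutorial's code):

```javascript
// Compute the canvas backing-store size in device pixels from its CSS
// size, the extra vertical padding, and the device pixel ratio.
function backingStoreSize(cssWidth, cssHeight, padding, dpr) {
  return {
    width: cssWidth * dpr,
    height: (cssHeight + padding * 2) * dpr,
  };
}
```

On a 2x display, for instance, a 300×130 CSS canvas with 10px of padding gets a 600×300 backing store, which keeps the stroked lines crisp.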
JavaScript File
"use strict";
window.AudioContext = window.AudioContext || window.webkitAudioContext;
class renderWave {
constructor(message) {
this._samples = 10000;
this._strokeStyle = "#3098ff";
this.audioContext = new AudioContext();
this.canvas = document.querySelector("canvas");
this.ctx = this.canvas.getContext("2d");
this.data = [];
message
.then(arrayBuffer => {
return this.audioContext.decodeAudioData(arrayBuffer);
})
.then(audioBuffer => {
this.draw(this.normalizedData(audioBuffer));
this.drawData(this.data);
});
}
normalizedData(audioBuffer) {
const rawData = audioBuffer.getChannelData(0); // We only need to work with one channel of data
const samples = this._samples; // Number of samples we want to have in our final data set
const blockSize = Math.floor(rawData.length / samples); // Number of samples in each subdivision
const filteredData = [];
for (let i = 0; i < samples; i++) {
filteredData.push(rawData[i * blockSize]);
}
return filteredData;
}
draw(normalizedData) {
// set up the canvas
const canvas = this.canvas;
const dpr = window.devicePixelRatio || 1;
const padding = 10;
canvas.width = canvas.offsetWidth * dpr;
canvas.height = (canvas.offsetHeight + padding * 2) * dpr;
this.ctx.scale(dpr, dpr);
this.ctx.translate(0, canvas.offsetHeight / 2 + padding); // set Y = 0 to be in the middle of the canvas
// draw the line segments
const width = canvas.offsetWidth / normalizedData.length;
for (let i = 0; i < normalizedData.length; i++) {
const x = width * i;
let height = normalizedData[i] * canvas.offsetHeight - padding;
if (height < 0) {
height = 0;
}
else if (height > canvas.offsetHeight / 2) {
height = canvas.offsetHeight / 2; // clamp to half the canvas height
}
// this.drawLineSegment(this.ctx, x, height, width, (i + 1) % 2);
this.data.push({
x: x,
h: height,
w: width,
isEven: (i + 1) % 2
});
}
return this.data;
}
drawLineSegment(ctx, x, height, width, isEven, colors = this._strokeStyle) {
ctx.lineWidth = 1; // how thick the line is
ctx.strokeStyle = colors; // what color our line is
ctx.beginPath();
height = isEven ? height : -height;
ctx.moveTo(x, 0);
ctx.lineTo(x + width, height);
ctx.stroke();
}
drawData(data, colors = this._strokeStyle) {
data.forEach(item => {
this.drawLineSegment(this.ctx, item.x, item.h, item.w, item.isEven, colors);
});
}
drawTimeline(percent) {
const end = Math.ceil(this._samples * percent);
const played = this.data.slice(0, end);
this.drawData(played, "#1d1e22"); // repaint the played portion in a darker colour
}
}
document.getElementById("fileinput").addEventListener("change", function () {
if (!this.files.length) return; // no file selected
const wave = new renderWave(this.files[0].arrayBuffer());
const audioPlayer = document.getElementById("audio");
audioPlayer.src = URL.createObjectURL(this.files[0]);
// audioPlayer.play(); // uncomment to start playback automatically
audioPlayer.ontimeupdate = function () {
wave.drawTimeline(this.currentTime / this.duration);
};
});
Explanation:
"use strict";: enables strict mode, which catches common coding mistakes and "unsafe" actions in JavaScript.
window.AudioContext = window.AudioContext || window.webkitAudioContext;: falls back to the prefixed webkitAudioContext on older WebKit browsers when the standard AudioContext is unavailable.
class renderWave { ... }: defines the class that renders the waveform visualization.
constructor(message) { ... }: receives a promise that resolves to an ArrayBuffer of the file's bytes, creates an AudioContext, grabs the canvas and its 2D rendering context, then decodes the audio and draws the waveform once the promise resolves.
this._samples = 10000;: the number of points the waveform is reduced to before drawing.
normalizedData(audioBuffer) { ... }: takes the first channel of decoded PCM data and downsamples it to _samples points by keeping one raw sample per block.
draw(normalizedData) { ... }: sizes the canvas for the device pixel ratio, moves the origin to the vertical middle, converts each sample to a bar height clamped between 0 and half the canvas height, and stores the segment geometry in this.data.
drawLineSegment(ctx, x, height, width, isEven, colors) { ... }: strokes one line segment, flipping every other segment below the midline so the waveform is mirrored.
drawData(data, colors) { ... }: draws every stored segment, optionally in a given colour.
drawTimeline(percent) { ... }: redraws the segments up to the current playback position in a darker colour, producing a progress effect.
document.getElementById("fileinput").addEventListener("change", ...): when a file is selected, creates a renderWave from the file's bytes, points the audio element at the file via URL.createObjectURL, and hooks the timeupdate event so the waveform progress tracks playback.
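The normalizedData method keeps only the first raw sample of each block, which can make the rendered waveform look noisy for long files. A common alternative, sketched below as my own variant rather than part of the article's code, averages the absolute values within each block to produce a smoother amplitude envelope:

```javascript
// Downsample by averaging the absolute sample values of each block
// instead of picking a single raw sample; a drop-in replacement for the
// loop inside normalizedData.
function downsampleAveraged(rawData, samples) {
  const blockSize = Math.floor(rawData.length / samples);
  const filtered = [];
  for (let i = 0; i < samples; i++) {
    let sum = 0;
    for (let j = 0; j < blockSize; j++) {
      sum += Math.abs(rawData[i * blockSize + j]);
    }
    filtered.push(sum / blockSize);
  }
  return filtered;
}
```

For example, `downsampleAveraged([1, -1, 0.5, 0.5], 2)` returns [1, 0.5]. Because the averages are always non-negative, the mirrored drawing (every other segment flipped below the midline) still works unchanged.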
Full Code (you can copy it and add your own features):
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Audio Spectrum Visualizer</title>
<style>
html {
height: 100%;
}
body {
height: 100%;
margin: 0;
background-color: #edf2f7;
}
.sh-card {
position: fixed;
bottom: 10px;
right: 10px;
background: #edf2f7;
padding: 7px 10px;
border-radius: 50px;
border: none;
color: #000;
box-shadow: 2px 2px 20px -10px #000;
transition: all 0.3s ease;
z-index: 99999;
font-family: system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji";
line-height: 1.5;
overflow: hidden;
width: 18px;
white-space: nowrap;
}
.sh-card:hover {
width: 92px;
}
.sh-card-icon,
.sh-card-info {
display: inline-block;
vertical-align: baseline;
line-height: 1;
}
.sh-card-info {
font-size: 16px;
margin-left: 7px;
}
.sh-card-icon {
width: 18px;
height: 15px;
}
.sh-card-link,
.sh-card-link:hover,
.sh-card-link:active,
.sh-card-link:visited {
color: #000;
text-decoration: none;
line-height: 1;
}
.sh-card-box {
display: block;
}
@-webkit-keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
@keyframes bounceIn {
0%, 20%, 40%, 60%, 80%, to {
-webkit-animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
animation-timing-function: cubic-bezier(0.215, 0.61, 0.355, 1);
}
0% {
opacity: 0;
-webkit-transform: scale3d(0.3, 0.3, 0.3);
transform: scale3d(0.3, 0.3, 0.3);
}
20% {
-webkit-transform: scale3d(1.1, 1.1, 1.1);
transform: scale3d(1.1, 1.1, 1.1);
}
40% {
-webkit-transform: scale3d(0.9, 0.9, 0.9);
transform: scale3d(0.9, 0.9, 0.9);
}
60% {
opacity: 1;
-webkit-transform: scale3d(1.03, 1.03, 1.03);
transform: scale3d(1.03, 1.03, 1.03);
}
80% {
-webkit-transform: scale3d(0.97, 0.97, 0.97);
transform: scale3d(0.97, 0.97, 0.97);
}
to {
opacity: 1;
-webkit-transform: scaleX(1);
transform: scaleX(1);
}
}
.sh-bounceIn {
-webkit-animation-duration: 0.75s;
animation-duration: 0.75s;
-webkit-animation-name: bounceIn;
animation-name: bounceIn;
animation-fill-mode: forwards;
-webkit-animation-fill-mode: forwards;
animation-delay: 1s;
-webkit-animation-delay: 1s;
opacity: 0;
}
canvas {
width: 100%;
height: 130px;
background: #f9f9f9;
margin: 2rem auto;
box-shadow: inset 0px 0px 25px -15px #000;
border-radius: 5px;
overflow: hidden;
}
.wrapper {
padding: 2rem;
}
</style>
</head>
<body>
<div class="wrapper">
<input id="fileinput" type="file" accept="audio/mp3,video/mp4"/>
<canvas id="canvas"></canvas>
<audio id="audio" src="" controls></audio>
</div>
<script>
"use strict";
window.AudioContext = window.AudioContext || window.webkitAudioContext;
class renderWave {
constructor(message) {
this._samples = 10000;
this._strokeStyle = "#3098ff";
this.audioContext = new AudioContext();
this.canvas = document.querySelector("canvas");
this.ctx = this.canvas.getContext("2d");
this.data = [];
message
.then(arrayBuffer => {
return this.audioContext.decodeAudioData(arrayBuffer);
})
.then(audioBuffer => {
this.draw(this.normalizedData(audioBuffer));
this.drawData(this.data);
});
}
normalizedData(audioBuffer) {
const rawData = audioBuffer.getChannelData(0); // We only need to work with one channel of data
const samples = this._samples; // Number of samples we want to have in our final data set
const blockSize = Math.floor(rawData.length / samples); // Number of samples in each subdivision
const filteredData = [];
for (let i = 0; i < samples; i++) {
filteredData.push(rawData[i * blockSize]);
}
return filteredData;
}
draw(normalizedData) {
// set up the canvas
const canvas = this.canvas;
const dpr = window.devicePixelRatio || 1;
const padding = 10;
canvas.width = canvas.offsetWidth * dpr;
canvas.height = (canvas.offsetHeight + padding * 2) * dpr;
this.ctx.scale(dpr, dpr);
this.ctx.translate(0, canvas.offsetHeight / 2 + padding); // set Y = 0 to be in the middle of the canvas
// draw the line segments
const width = canvas.offsetWidth / normalizedData.length;
for (let i = 0; i < normalizedData.length; i++) {
const x = width * i;
let height = normalizedData[i] * canvas.offsetHeight - padding;
if (height < 0) {
height = 0;
}
else if (height > canvas.offsetHeight / 2) {
height = canvas.offsetHeight / 2; // clamp to half the canvas height
}
// this.drawLineSegment(this.ctx, x, height, width, (i + 1) % 2);
this.data.push({
x: x,
h: height,
w: width,
isEven: (i + 1) % 2
});
}
return this.data;
}
drawLineSegment(ctx, x, height, width, isEven, colors = this._strokeStyle) {
ctx.lineWidth = 1; // how thick the line is
ctx.strokeStyle = colors; // what color our line is
ctx.beginPath();
height = isEven ? height : -height;
ctx.moveTo(x, 0);
ctx.lineTo(x + width, height);
ctx.stroke();
}
drawData(data, colors = this._strokeStyle) {
data.forEach(item => {
this.drawLineSegment(this.ctx, item.x, item.h, item.w, item.isEven, colors);
});
}
drawTimeline(percent) {
const end = Math.ceil(this._samples * percent);
const played = this.data.slice(0, end);
this.drawData(played, "#1d1e22"); // repaint the played portion in a darker colour
}
}
document.getElementById("fileinput").addEventListener("change", function () {
if (!this.files.length) return; // no file selected
const wave = new renderWave(this.files[0].arrayBuffer());
const audioPlayer = document.getElementById("audio");
audioPlayer.src = URL.createObjectURL(this.files[0]);
// audioPlayer.play(); // uncomment to start playback automatically
audioPlayer.ontimeupdate = function () {
wave.drawTimeline(this.currentTime / this.duration);
};
});
</script>
</body>
</html>
Creating an audio spectrum visualizer using HTML, CSS, and JavaScript can enhance the user experience of audio playback on your website. By following this tutorial, you can create a dynamic and visually engaging visualizer that adds a new dimension to your audio content. Experiment with different visual styles and effects to create a unique and immersive audio experience for your users.