ryanheise / just_waveform
A Flutter plugin to extract waveform data from an audio file suitable for visual rendering.
License: MIT License
I am trying to use this with your just_audio library.
First, thanks for the awesome package. I see that the package extracts the waveform from the audio into a wave file. I wonder if there is a way to extract the wave in Uint8List format instead.
Kind regards
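Since the plugin writes its output to the `waveOutFile` you pass in, one workaround (a sketch, not an official plugin API) is simply to read that file back into memory once extraction completes:

```dart
import 'dart:io';
import 'dart:typed_data';

/// Reads a finished waveform file into memory as a Uint8List.
/// [waveFile] would be the same File passed to JustWaveform.extract
/// as waveOutFile, after the extraction stream has completed.
Future<Uint8List> waveFileAsBytes(File waveFile) async {
  return await waveFile.readAsBytes();
}
```

This gives you the raw bytes of the wave file, including its header; whether you need to strip the header depends on what you plan to render with.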
Hello @ryanheise, how would I go about detecting pauses and silences using your plugin? Any tips?
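The plugin itself doesn't appear to offer silence detection, but one possible approach (purely illustrative, with made-up thresholds) is to scan the min/max amplitude pairs of the extracted waveform and report runs where the amplitude stays small:

```dart
/// A contiguous run of low-amplitude waveform pixels.
class SilenceRun {
  final int startPixel;
  final int endPixel; // exclusive
  SilenceRun(this.startPixel, this.endPixel);
}

/// Scans parallel lists of per-pixel min/max amplitudes (as would come
/// from Waveform.getPixelMin / Waveform.getPixelMax) and returns runs of
/// at least [minRun] pixels whose peak-to-peak amplitude is below
/// [threshold]. Both defaults are illustrative, not tuned values.
List<SilenceRun> findSilences(List<int> mins, List<int> maxs,
    {int threshold = 300, int minRun = 3}) {
  final runs = <SilenceRun>[];
  int? runStart;
  for (var i = 0; i < mins.length; i++) {
    final quiet = maxs[i] - mins[i] < threshold;
    if (quiet) {
      runStart ??= i;
    } else {
      if (runStart != null && i - runStart >= minRun) {
        runs.add(SilenceRun(runStart, i));
      }
      runStart = null;
    }
  }
  if (runStart != null && mins.length - runStart >= minRun) {
    runs.add(SilenceRun(runStart, mins.length));
  }
  return runs;
}
```

To map pixel indices back to time you would divide by the zoom used during extraction (e.g. at `pixelsPerSecond(100)`, 100 pixels correspond to one second).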
I'm getting songs with the Flutter plugin on_audio_query,
which returns a song.uri like ipod-library://item/item.flac?id=4724756324520404040.
How can I extract a waveform from this?
Thanks.
Hello, great plugin! Is there a way to save the waveform so it doesn't have to be regenerated every time? Thanks :)
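Since the extraction already writes its result to the `waveOutFile` you supply, caching can be as simple as keeping that file around and only extracting when it's missing. A sketch, assuming the API shown elsewhere in this thread (and a `JustWaveform.parse`-style loader for the saved bytes, which you should verify against the current API docs):

```dart
import 'dart:io';

/// True if no cached waveform file exists yet and extraction is needed.
bool needsExtraction(File waveFile) => !waveFile.existsSync();

// Usage sketch (plugin calls left as comments so this compiles standalone):
// final waveFile = File('${dir.path}/waveform.wave');
// if (needsExtraction(waveFile)) {
//   await JustWaveform.extract(
//       audioInFile: audioFile, waveOutFile: waveFile).last;
// }
// // Load the cached bytes back into a Waveform object.
// final waveform = JustWaveform.parse(await waveFile.readAsBytes());
```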
Sorry if it looks like a dumb question, but is there any way to integrate this with the just_audio player? I want to use it as a progress indicator.
I am trying to get a visualization bar for audio in my app. I am checking out your plugin, and on the example page you have the line:
final progressStream = BehaviorSubject<WaveformProgress>();
Where does BehaviorSubject come from? I searched the page and that was the only reference to the name anywhere.
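`BehaviorSubject` is not part of Flutter itself; it comes from the rxdart package (the version below matches the one used in an example later in this thread):

```yaml
# pubspec.yaml — rxdart provides BehaviorSubject
dependencies:
  rxdart: ^0.28.0
```

Then in Dart: `import 'package:rxdart/rxdart.dart';`. A `BehaviorSubject` is a `StreamController` that replays its latest value to each new listener, which is why the example uses it to hold the most recent `WaveformProgress`.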
Hi there,
I was wondering how I can determine a proper zoom value. Sometimes I don't need a wave as accurate as pixelsPerSecond(100); even 2 to 3 pixels per second is enough for me, but at that setting it extracts a null waveform. If the audio is long enough, it works.
Thanks
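At very low pixels-per-second values a short clip can yield too few waveform pixels to be useful. One guard (an illustrative helper, not part of just_waveform) is to derive the zoom from the clip's duration so you always get at least as many pixels as you intend to draw:

```dart
import 'dart:math';

/// Picks a pixels-per-second zoom that yields at least [minPixels]
/// waveform pixels for a clip of [durationSeconds], falling back to
/// [preferred] for longer clips. Values here are illustrative.
int safePixelsPerSecond(double durationSeconds, int minPixels,
    {int preferred = 3}) {
  if (durationSeconds <= 0) return preferred;
  final needed = (minPixels / durationSeconds).ceil();
  return max(preferred, needed);
}
```

Usage sketch: `zoom: WaveformZoom.pixelsPerSecond(safePixelsPerSecond(clipSeconds, 200))`, where 200 stands in for the width in pixels you plan to render.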
I just added this package to my Flutter project. Awesome package.
But waveform generation freezes at 99%.
If anyone has a solution for that, please let me know.
=== My flutter doctor ===
Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 3.0.6-0.0.pre.2, on macOS 12.5.1 21G83 darwin-x64, locale en-LA)
[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
[✓] Xcode - develop for iOS and macOS (Xcode 13.4)
[✓] Chrome - develop for the web
[✓] Android Studio (version 2021.2)
[✓] VS Code (version 1.71.0)
[✓] VS Code (version 1.54.2)
[✓] Connected device (3 available)
! Error: JinYZ’s iPhone has recently restarted. Xcode will continue when JinYZ’s iPhone is unlocked. (code -14)
[✓] HTTP Host Availability
=== My Test Device ===
Samsung A20
Thanks.
Is it possible to extract the audio and generate a waveform from a URL? I want to use it in a chat app.
How can we hide logs?
Example:
wave length = 1712
waveHeader[0] = 1
waveHeader[1] = 0
waveHeader[2] = 16000
waveHeader[3] = 160
waveHeader[4] = 651
Total scaled samples: 1302
waveHeader[0] = 1
waveHeader[1] = 0
waveHeader[2] = 16000
waveHeader[3] = 160
waveHeader[4] = 856
Request
In my app I use AAC-encoded audio files and I would like to get the waveform. I see only MP3 used in the examples, and I am led to believe that's the only codec supported?
The thing is that I was using voice_message_package, but it uses just_audio, which for some reason doesn't play AAC audio on iOS, while the audioplayers package does. So I migrated to audioplayers and just_waveform, but it seems just_waveform doesn't work on iOS with AAC files recorded by the record package on Android. Is it possible to fix that issue? It feels like an encoding is missing or something.
If I want to handle an already-generated file, what should I do?
Doesn't work for a certain MP3 file; see attachment.
6607054370125122307.mp3.zip
Hi 👋🏼. Great plugin. Are you planning to add web support?
I have a Stream filled from the network; is there a way to get the waveform from this?
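`JustWaveform.extract` takes a `File`, so a byte stream needs to be persisted first. A minimal sketch, using only `dart:io`, that drains a `Stream<List<int>>` (such as an HTTP response body) into a temp file you can then pass as `audioInFile`:

```dart
import 'dart:async';
import 'dart:io';

/// Drains a byte stream into a temp file. The resulting File can then
/// be passed to JustWaveform.extract as audioInFile.
Future<File> streamToTempFile(Stream<List<int>> bytes, String name) async {
  final file = File('${Directory.systemTemp.path}/$name');
  final sink = file.openWrite();
  await sink.addStream(bytes); // writes chunks as they arrive
  await sink.close();
  return file;
}
```

The obvious cost is that the whole file must be downloaded before extraction starts; the plugin has no streaming-input mode as far as this thread shows.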
Hi. I get this error when I try to extract the waveform from an audio file:
PlatformException(ExtAudioOpenURL error, ExtAudioOpenURL error, null, null)
This happens only on iOS; everything works fine on Android.
These are my file paths:
file:///Users/user/Library/Developer/CoreSimulator/Devices/DAB4C89D-6849-460A-A4C7-C1E2D7A81125/data/Containers/Data/Application/D7974DA8-ED71-480B-B289-36CE72899032/tmp/BDC72317-B389-4F46-8CE3-1A24A37172A9.m4a
file:///Users/user/Library/Developer/CoreSimulator/Devices/DAB4C89D-6849-460A-A4C7-C1E2D7A81125/data/Containers/Data/Application/D7974DA8-ED71-480B-B289-36CE72899032/tmp/BDC72317-B389-4F46-8CE3-1A24A37172A9.wave
I am working on a simulator; could this be the problem?
Thanks for this awesome plugin.
Also, please add a method to cancel an ongoing extraction. If it is already possible to cancel the extraction without adding a new method, please mention it in the documentation.
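Because `JustWaveform.extract` returns a `Stream`, the standard Dart mechanism applies on the caller's side: keep the `StreamSubscription` from `listen()` and call `cancel()` on it. Whether that also stops the native decoding work depends on the plugin's implementation; the wrapper below (an illustrative sketch, not a plugin API) only tracks the Dart-side state:

```dart
import 'dart:async';

/// Wraps a progress-stream subscription so the UI can cancel it.
/// Cancelling the Dart subscription may or may not stop native decoding;
/// that depends on the plugin's implementation.
class CancellableExtraction<T> {
  StreamSubscription<T>? _sub;
  bool cancelled = false;

  void start(Stream<T> progress, void Function(T) onData) {
    _sub = progress.listen(onData);
  }

  Future<void> cancel() async {
    cancelled = true;
    await _sub?.cancel();
    _sub = null;
  }
}
```

Usage sketch: `handle.start(JustWaveform.extract(...), progressStream.add);` and later `await handle.cancel();` when the user navigates away.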
If you try to extract several files at the same time, the reported progress is that of the last file only.
I think each extraction needs its own progress stream.
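Until the plugin reports progress per call, one Dart-side workaround (illustrative, not a plugin feature) is to keep an independent controller per input path, so parallel extractions can't overwrite each other's progress:

```dart
import 'dart:async';

/// Keeps one independent progress controller per input path, so
/// parallel extractions each get their own stream.
class ProgressRegistry<T> {
  final _controllers = <String, StreamController<T>>{};

  /// Returns the controller for [path], creating it on first use.
  StreamController<T> forPath(String path) =>
      _controllers.putIfAbsent(path, () => StreamController<T>.broadcast());

  Future<void> disposeAll() async {
    for (final c in _controllers.values) {
      await c.close();
    }
    _controllers.clear();
  }
}
```

Each extraction then listens with its own controller, e.g. `JustWaveform.extract(...).listen(registry.forPath(path).add)`.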
I have added this package to my Flutter project.
But when I navigate back and forward between pages, I hit an issue with the progress percentage:
the value returned mixes the previous and current extractions.
Could you help me with that issue?
You took my issue and immediately closed it. And the Internet does not have any information about how to connect these two plugins. There is no such question on Stack Overflow, and ChatGPT doesn't know your plugin. Can you share the code? I don't think it's that much of a problem for you. There is no documentation about it at all. Yes, I'm new to programming. How should I deal with this issue?
As stated in the title: will there be Windows and Linux support?
For starters, thanks for this great plugin.
I'm encountering a problem on Android only. When I extract the data from the audio file, I get this error:
PlatformException(Attempt to invoke virtual method 'int java.lang.Integer.intValue()' on a null object reference, null, null, null)
I'm using the same code from the example with a few modifications:
final audioFile = File(widget.audioFilePath);
try {
  final dir = await getApplicationDocumentsDirectory();
  final waveFile = File('${dir.path}/waveform.wave');
  JustWaveform.extract(
    audioInFile: audioFile,
    waveOutFile: waveFile,
    zoom: WaveformZoom.pixelsPerSecond(pixelPerSecond.ceil()),
  ).listen(_progressStream.add, onError: _progressStream.addError);
} catch (err) {
  _progressStream.addError(err); // was addError(e): 'e' is undefined here
}
I'm having this problem on a device running Android 9.0.
Many thanks.
Hello,
Thank you for this plugin.
We have built a meditation and relaxation app named Evolum (4.9 stars in the stores).
We wanted to add an audio wave to our app.
As you know, we can't generate the audio wave file on the fly in the app when a user wants to relax.
We have a dashboard where internal users can upload audio to make it available in the app.
This dashboard is web-based, so it can't generate the audio wave file. We wanted to work around this by running our dashboard on macOS, but some of our internal users only have Windows...
We tried to do something with a Google Cloud Function, triggered when a file is uploaded to our storage, using
https://github.com/bbc/audiowaveform
but we didn't succeed...
So if adding Windows support is not a big challenge, it could be very useful for us.
Thanks,
When I try to load an MP3 file of around 3 MB, it returns the error
PlatformException(ExtAudioOpenURL error, ExtAudioOpenURL error, null, null)
Have you tested with larger files?
I use audioplayers to play an audio file and retrieve the wave. I manage to display it, but I notice that the trace doesn't correspond to the amplitude levels of the track.
I use:
Flutter 3.22.2 • channel stable
Dart 3.4.3
audioplayers: ^6.0.0
just_waveform: ^0.0.5
rxdart: ^0.28.0
For easy testing I made a full example. You will need to add the audio file to your pubspec.yaml:
assets:
  - assets/msg2.mp3
import 'dart:async';
import 'dart:io';
import 'dart:math';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:path/path.dart' as p;
import 'package:path_provider/path_provider.dart';
import 'package:just_waveform/just_waveform.dart';
import 'package:audioplayers/audioplayers.dart';
void main() {
runApp(MyApp());
}
class MyApp extends StatelessWidget {
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'Audio Waveform Example',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: AudioPlayerScreen(),
);
}
}
class AudioPlayerScreen extends StatefulWidget {
@override
_AudioPlayerScreenState createState() => _AudioPlayerScreenState();
}
class _AudioPlayerScreenState extends State<AudioPlayerScreen> {
final AudioPlayer audioPlayer = AudioPlayer();
final StreamController<WaveformProgress> progressStream = StreamController<WaveformProgress>();
Waveform? _waveform;
Duration playingPosition = Duration.zero;
@override
void initState() {
super.initState();
audioPlayer.onPositionChanged.listen((Duration position) {
setState(() {
playingPosition = position;
});
});
}
@override
void dispose() {
audioPlayer.dispose();
progressStream.close();
super.dispose();
}
Future<void> playAudio(String fileName) async {
try {
// Load audio file from assets
final ByteData data = await rootBundle.load('assets/$fileName');
final Uint8List bytes = data.buffer.asUint8List();
// Save audio file to a temporary directory
final tempDir = await getTemporaryDirectory();
final audioPath = '${tempDir.path}/$fileName';
final File tempAudioFile = File(audioPath);
await tempAudioFile.writeAsBytes(bytes, flush: true);
final waveFile = File(p.join(tempDir.path, '$fileName.waveform'));
// Extract waveform
JustWaveform.extract(audioInFile: tempAudioFile, waveOutFile: waveFile).listen((waveformProgress) async {
progressStream.add(waveformProgress);
print('Progress: %${(100 * waveformProgress.progress).toInt()}');
if (waveformProgress.waveform != null) {
setState(() {
_waveform = waveformProgress.waveform;
});
}
}, onError: (e) {
print("Waveform extraction error: $e");
});
await audioPlayer.play(AssetSource(fileName));
print("playing.....done $audioPath");
} catch (e) {
print("error audioPlayer: $e");
}
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text('Audio Waveform Example'),
),
body: Padding(
padding: const EdgeInsets.all(16.0),
child: Column(
children: [
ElevatedButton(
onPressed: () => playAudio('msg2.mp3'),
child: Text('Play Audio'),
),
SizedBox(height: 20),
Container(
height: 150.0,
decoration: BoxDecoration(
color: Colors.grey.shade200,
borderRadius: const BorderRadius.all(Radius.circular(20.0)),
),
padding: const EdgeInsets.all(16.0),
width: double.maxFinite,
child: StreamBuilder<WaveformProgress>(
stream: progressStream.stream,
builder: (context, snapshot) {
if (snapshot.hasError) {
return Center(
child: Text(
'Error: ${snapshot.error}',
style: Theme.of(context).textTheme.titleLarge,
textAlign: TextAlign.center,
),
);
}
final progress = snapshot.data?.progress ?? 0.0;
final waveform = snapshot.data?.waveform;
if (waveform == null) {
return Center(
child: Text(
'${(100 * progress).toInt()}%',
style: Theme.of(context).textTheme.titleLarge,
),
);
}
return AudioWaveformWidget(
waveform: waveform,
start: Duration.zero,
duration: waveform.duration,
position: playingPosition,
waveColor: Colors.grey,
playedColor: Colors.blue,
);
},
),
)
],
),
),
);
}
}
class AudioWaveformWidget extends StatefulWidget {
final Color waveColor;
final Color playedColor;
final double scale;
final double strokeWidth;
final double pixelsPerStep;
final Waveform waveform;
final Duration start;
final Duration duration;
final Duration position;
const AudioWaveformWidget({
Key? key,
required this.waveform,
required this.start,
required this.duration,
required this.position,
this.waveColor = Colors.grey,
this.playedColor = Colors.blue,
this.scale = 1.0,
this.strokeWidth = 5.0,
this.pixelsPerStep = 8.0,
}) : super(key: key);
@override
_AudioWaveformState createState() => _AudioWaveformState();
}
class _AudioWaveformState extends State<AudioWaveformWidget> {
@override
Widget build(BuildContext context) {
return ClipRect(
child: CustomPaint(
painter: AudioWaveformPainter(
waveColor: widget.waveColor,
playedColor: widget.playedColor,
waveform: widget.waveform,
start: widget.start,
duration: widget.duration,
position: widget.position,
scale: widget.scale,
strokeWidth: widget.strokeWidth,
pixelsPerStep: widget.pixelsPerStep,
),
),
);
}
}
class AudioWaveformPainter extends CustomPainter {
final double scale;
final double strokeWidth;
final double pixelsPerStep;
final Paint wavePaint;
final Paint playedPaint;
final Waveform waveform;
final Duration start;
final Duration duration;
final Duration position;
AudioWaveformPainter({
required this.waveform,
required this.start,
required this.duration,
required this.position,
Color waveColor = Colors.grey,
Color playedColor = Colors.blue,
this.scale = 1.0,
this.strokeWidth = 5.0,
this.pixelsPerStep = 8.0,
}) : wavePaint = Paint()
..style = PaintingStyle.stroke
..strokeWidth = strokeWidth
..strokeCap = StrokeCap.round
..color = waveColor,
playedPaint = Paint()
..style = PaintingStyle.stroke
..strokeWidth = strokeWidth
..strokeCap = StrokeCap.round
..color = playedColor;
@override
void paint(Canvas canvas, Size size) {
if (duration == Duration.zero) return;
double width = size.width;
double height = size.height;
final waveformPixelsPerWindow = waveform.positionToPixel(duration).toInt();
final waveformPixelsPerDevicePixel = waveformPixelsPerWindow / width;
final waveformPixelsPerStep = waveformPixelsPerDevicePixel * pixelsPerStep;
final sampleOffset = waveform.positionToPixel(start);
final sampleStart = -sampleOffset % waveformPixelsPerStep;
final playProgressPixel = waveform.positionToPixel(position);
for (var i = sampleStart.toDouble(); i <= waveformPixelsPerWindow + 1.0; i += waveformPixelsPerStep) {
final sampleIdx = (sampleOffset + i).toInt();
final x = i / waveformPixelsPerDevicePixel;
final minY = normalise(waveform.getPixelMin(sampleIdx), height);
final maxY = normalise(waveform.getPixelMax(sampleIdx), height);
final paint = (i <= playProgressPixel) ? playedPaint : wavePaint;
canvas.drawLine(
Offset(x + strokeWidth / 2, max(strokeWidth * 0.75, minY)),
Offset(x + strokeWidth / 2, min(height - strokeWidth * 0.75, maxY)),
paint,
);
}
}
@override
bool shouldRepaint(covariant AudioWaveformPainter oldDelegate) {
return true;
}
double normalise(int s, double height) {
if (waveform.flags == 0) {
final y = 32768 + (scale * s).clamp(-32768.0, 32767.0).toDouble();
return height - 1 - y * height / 65536;
} else {
final y = 128 + (scale * s).clamp(-128.0, 127.0).toDouble();
return height - 1 - y * height / 256;
}
}
}
Result:
Audio sample:
https://www.mediafire.com/file/ntfe614cak6cj3m/msg2.mp3/file
If you listen to the audio and follow the wave, you will notice they do not match; some clearly audible voices are represented by just a dot.
Did I make some mistake?
When parsing the waveform data, we should do this:
final data = flags == 0
? Int16List.view(bytes, headerLength ~/ 2)
: Int8List.view(bytes, headerLength);
When painting the data, we should do this:
double normalise(int s, double height) {
if (waveform.flags == 0) {
final y = 32768 + (scale * s).clamp(-32768.0, 32767.0).toDouble();
return height - 1 - y * height / 65536;
} else {
final y = 128 + (scale * s).clamp(-128.0, 127.0).toDouble();
return height - 1 - y * height / 256;
}
}