ryanheise / just_waveform

A Flutter plugin to extract waveform data from an audio file suitable for visual rendering.

License: MIT License

Java 35.17% Objective-C 24.73% Ruby 9.40% Dart 29.39% Swift 1.32%
flutter audio waveform

just_waveform's Issues

Is the library working today?

I am trying to use it together with your just_audio library.

  1. Will the combination of the two work?
  2. I am trying to run your example code but it gives this error:
    Target of URI doesn't exist: 'package:just_waveform/just_waveform.dart'.
    Try creating the file referenced by the URI, or try using a URI for a file that does

Extract waveform from audio as Uint8List

First, thanks for the awesome package. I see that the package extracts the audio waveform to a wave file. I wonder if there is a way to extract the wave in Uint8List format.
Kind regards
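For readers with the same question: the plugin writes its output to whatever `waveOutFile` you pass in, so one low-tech option (a sketch, not an official API) is simply to read that file back into memory once extraction completes:

```dart
import 'dart:io';
import 'dart:typed_data';

/// Reads a previously extracted .wave file back into memory as raw bytes.
/// Assumes [waveFile] is the same File that was passed as `waveOutFile`
/// to JustWaveform.extract and that extraction has already finished.
Future<Uint8List> waveFileAsBytes(File waveFile) async {
  return waveFile.readAsBytes();
}
```

Alternatively, `JustWaveform.parse(waveFile)` returns a `Waveform` object whose `data` list holds the decoded min/max sample pairs, which may be closer to what you want than the raw file bytes.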

Extract from ipod-library

I'm getting songs with the Flutter plugin on_audio_query, which returns a song.uri like ipod-library://item/item.flac?id=4724756324520404040. How can I extract a waveform from this?

Thanks.

just_audio integration

Sorry if it looks like a dumb question, but is there any way to integrate this with the just_audio player? I want to use it as a progress indicator.
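A common pattern (a sketch, not code from this repo) is to treat the two plugins as independent: just_audio exposes a `positionStream`, and the `AudioWaveformWidget` from the just_waveform example takes a `position` parameter, so wiring them together is mostly plumbing:

```dart
// Sketch: drive the example's AudioWaveformWidget from just_audio's
// positionStream. `player` is a just_audio AudioPlayer and `waveform`
// is assumed to be a Waveform you already extracted with
// JustWaveform.extract — both come from your own code.
StreamBuilder<Duration>(
  stream: player.positionStream,
  builder: (context, snapshot) => AudioWaveformWidget(
    waveform: waveform,
    start: Duration.zero,
    duration: waveform.duration,
    position: snapshot.data ?? Duration.zero,
  ),
);
```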

Question: Can't follow along with example code

I am trying to get a visualization bar in my app for audio. I am checking out your plugin, and on the example page you have this line:

final progressStream = BehaviorSubject<WaveformProgress>();

Where does BehaviorSubject come from? I searched the page and that was the only reference to the name anywhere.
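For anyone hitting the same wall: `BehaviorSubject` is not part of just_waveform. It comes from the rxdart package, which the example depends on:

```dart
// BehaviorSubject lives in rxdart, a separate dependency
// (add `rxdart` to the dependencies section of pubspec.yaml).
import 'package:rxdart/rxdart.dart';
import 'package:just_waveform/just_waveform.dart';

final progressStream = BehaviorSubject<WaveformProgress>();
```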

How to implement a proper WaveformZoom?

Hi there,

I was wondering how I can evaluate a proper zoom value. Sometimes I don't need a wave as accurate as pixelsPerSecond(100); even 2 to 3 pixels per second would be enough for me, but that extracts a null waveform. If the audio is long enough, it works.

Thanks
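One workaround worth noting (a sketch, under the assumption that your version of the plugin also offers the `WaveformZoom.pixelsPerWindow` constructor) is to size the zoom to the render target rather than to seconds, which avoids degenerate per-second values on short files:

```dart
// Sketch: request exactly as many waveform pixels as the widget is wide,
// regardless of audio duration. `widthInPixels`, `audioFile` and
// `waveFile` are hypothetical names from your own code.
final zoom = WaveformZoom.pixelsPerWindow(widthInPixels);

JustWaveform.extract(
  audioInFile: audioFile,
  waveOutFile: waveFile,
  zoom: zoom,
);
```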

Frozen at 99% while generating

I just added this package to my Flutter project. Awesome package.
But waveform generation freezes at 99%.
If anyone has a solution for this, please let me know.

=== My flutter doctor ===

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 3.0.6-0.0.pre.2, on macOS 12.5.1 21G83 darwin-x64, locale en-LA)
[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
[✓] Xcode - develop for iOS and macOS (Xcode 13.4)
[✓] Chrome - develop for the web
[✓] Android Studio (version 2021.2)
[✓] VS Code (version 1.71.0)
[✓] VS Code (version 1.54.2)
[✓] Connected device (3 available)
    ! Error: JinYZ’s iPhone has recently restarted. Xcode will continue when JinYZ’s iPhone is unlocked. (code -14)
[✓] HTTP Host Availability

=== My Test Device ===
Samsung A20

Thanks.

Example code load asset error

Hi, I am trying Flutter development and just_waveform for the first time.
I tried to run the example but got this error message: Unable to load asset: audio/waveform.mp3.
Can you help me figure out where I went wrong?

Extract audio via URL

Is it possible to extract audio and generate a wave from a URL? I want to use it in a chat app.
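The plugin itself takes a local `File`, so the usual approach (a sketch, not an official feature) is to download the audio to a temporary file first and then run the extraction on that:

```dart
import 'dart:io';

/// Downloads audio from [url] to a temp file so JustWaveform.extract,
/// which only accepts local files, can process it. Assumes the URL
/// serves a format the platform decoder understands.
Future<File> downloadToTemp(Uri url, String fileName) async {
  final client = HttpClient();
  try {
    final request = await client.getUrl(url);
    final response = await request.close();
    final file = File('${Directory.systemTemp.path}/$fileName');
    // Stream the response body straight into the file.
    await response.pipe(file.openWrite());
    return file;
  } finally {
    client.close();
  }
}
```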

Hide logs in console

How can we hide logs?

Example:
wave length = 1712
waveHeader[0] = 1
waveHeader[1] = 0
waveHeader[2] = 16000
waveHeader[3] = 160
waveHeader[4] = 651
Total scaled samples: 1302
waveHeader[0] = 1
waveHeader[1] = 0
waveHeader[2] = 16000
waveHeader[3] = 160
waveHeader[4] = 856

PlatformException(ExtAudioOpenURL error, ExtAudioOpenURL error, null, null)

The thing is that I was using voice_message_package, but it uses just_audio, which for some reason doesn't play AAC audio on iOS, although the audioplayers package does. So I migrated to audioplayers and just_waveform, but it seems just_waveform doesn't work on iOS with AAC files recorded on Android by the record package. Is it possible to fix that issue? It feels like an encoding is missing somewhere.

Web support

Hi 👋🏼. Great plugin. Do you have plans to add web support?

PlatformException(ExtAudioOpenURL error, ExtAudioOpenURL error, null, null)

Hi. I get this error when I try to extract the waveform from an audio file.

PlatformException(ExtAudioOpenURL error, ExtAudioOpenURL error, null, null).

This happens only in the iOS version. Everything works fine on Android.

These are my file paths:

file:///Users/user/Library/Developer/CoreSimulator/Devices/DAB4C89D-6849-460A-A4C7-C1E2D7A81125/data/Containers/Data/Application/D7974DA8-ED71-480B-B289-36CE72899032/tmp/BDC72317-B389-4F46-8CE3-1A24A37172A9.m4a

file:///Users/user/Library/Developer/CoreSimulator/Devices/DAB4C89D-6849-460A-A4C7-C1E2D7A81125/data/Containers/Data/Application/D7974DA8-ED71-480B-B289-36CE72899032/tmp/BDC72317-B389-4F46-8CE3-1A24A37172A9.wave

I am working on a simulator; could this be the problem?

Add cancel method to cancel the ongoing extraction

Thanks for this awesome plugin.

Please also add a method to cancel an ongoing extraction. If it is already possible to cancel the extraction without adding a new method, please mention it in the documentation.
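Until a dedicated API exists, one partial option (an assumption, not confirmed by the plugin docs) is to rely on the fact that `JustWaveform.extract` returns a plain Dart `Stream`, so you can hold onto the subscription and cancel it:

```dart
import 'dart:async';
import 'dart:io';

StreamSubscription<WaveformProgress>? _extraction;

void startExtraction(File audioFile, File waveFile) {
  _extraction = JustWaveform.extract(
    audioInFile: audioFile,
    waveOutFile: waveFile,
  ).listen(progressStream.add, onError: progressStream.addError);
}

// Cancelling stops the Dart-side stream; whether the native decoding
// work is also aborted is not guaranteed, which is exactly why this
// feature request exists.
Future<void> cancelExtraction() async {
  await _extraction?.cancel();
  _extraction = null;
}
```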

How to cancel the waveform processing before it's done?

I have added this package to my Flutter project.
But when moving back and forth between pages, I hit an issue with the progress percentage: the stream returns both the previous and the current values.
Could you help me with this issue?

Just audio waveform

You took my issue and immediately closed it, and the internet does not have any information about how to connect these two plugins. There is no such question on Stack Overflow, and ChatGPT doesn't know your plugin. Can you share the code? I don't think it's that much of a problem for you. There is no documentation about it at all. Yes, I'm new to programming. How should I deal with this?

Attempt to invoke virtual method 'int java.lang.Integer.intValue()'

For starters, thanks for this great plugin.

I'm encountering a problem on Android only: when I extract the data from the audio file I get this error:

PlatformException(Attempt to invoke virtual method 'int java.lang.Integer.intValue()' on a null object reference, null, null, null)

I'm using the same code as the example with a few modifications:

final audioFile = File(widget.audioFilePath);

try {
  final dir = await getApplicationDocumentsDirectory();
  final waveFile = File('${dir.path}/waveform.wave');

  JustWaveform.extract(
    audioInFile: audioFile,
    waveOutFile: waveFile,
    zoom: WaveformZoom.pixelsPerSecond(
      pixelPerSecond.ceil(),
    ),
  ).listen(_progressStream.add, onError: _progressStream.addError);
} catch (err) {
  // Note: the snippet originally called addError(e), but `e` is
  // undefined here; the caught variable is `err`.
  _progressStream.addError(err);
}

I'm having this problem on an Android 9.0 device.

Many thanks.

Add Windows support

Hello,

Thank you for this plugin.

We created a meditation and relaxing-sound app named Evolum (4.9 stars in the stores).
We wanted to add an audio wave to our app.
As you know, we can't generate the waveform file directly in the app while a user wants to relax.

We have a dashboard where internal users can upload audio to be made available in the app.

This dashboard is web-based, so it can't generate the waveform file. We wanted to work around this by deploying our dashboard on macOS, but some of our internal users only have Windows...

We tried to do something with a Google Cloud Function, triggered when a file is uploaded to our storage, using

https://github.com/bbc/audiowaveform

but we didn't succeed...

So if adding Windows support is not a big challenge, it could be useful for us.

Thanks,

Large MP3 file size error

When I try to load an MP3 file of around 3 MB it returns the error
PlatformException(ExtAudioOpenURL error, ExtAudioOpenURL error, null, null)

Have you tested with any maximum file sizes?

Wave inaccuracy

I use audioplayers to play an audio file and just_waveform to retrieve the wave. I manage to display it, but I notice that the traces don't correspond to the frequency levels of the track.
I use:
Flutter 3.22.2 • channel stable
Dart 3.4.3
audioplayers: ^6.0.0
just_waveform: ^0.0.5
rxdart: ^0.28.0

For easy testing I made a full example.
You will need to add the audio file to your pubspec.yaml:

assets:
  - assets/msg2.mp3

import 'dart:async';
import 'dart:io';
import 'dart:math';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:path/path.dart' as p;
import 'package:path_provider/path_provider.dart';
import 'package:just_waveform/just_waveform.dart';
import 'package:audioplayers/audioplayers.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Audio Waveform Example',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: AudioPlayerScreen(),
    );
  }
}

class AudioPlayerScreen extends StatefulWidget {
  @override
  _AudioPlayerScreenState createState() => _AudioPlayerScreenState();
}

class _AudioPlayerScreenState extends State<AudioPlayerScreen> {
  final AudioPlayer audioPlayer = AudioPlayer();
  final StreamController<WaveformProgress> progressStream = StreamController<WaveformProgress>();
  Waveform? _waveform;
  Duration playingPosition = Duration.zero;

  @override
  void initState() {
    super.initState();
    audioPlayer.onPositionChanged.listen((Duration position) {
      setState(() {
        playingPosition = position;
      });
    });
  }

  @override
  void dispose() {
    audioPlayer.dispose();
    progressStream.close();
    super.dispose();
  }

Future<void> playAudio(String fileName) async {
  try {
    // Load audio file from assets
    final ByteData data = await rootBundle.load('assets/$fileName');
    final Uint8List bytes = data.buffer.asUint8List();
    
    // Save audio file to a temporary directory
    final tempDir = await getTemporaryDirectory();
    final audioPath = '${tempDir.path}/$fileName';
    final File tempAudioFile = File(audioPath);
    await tempAudioFile.writeAsBytes(bytes, flush: true);

    final waveFile = File(p.join(tempDir.path, '$fileName.waveform'));

    // Extract waveform
    JustWaveform.extract(audioInFile: tempAudioFile, waveOutFile: waveFile).listen((waveformProgress) async {
      progressStream.add(waveformProgress);
      print('Progress: %${(100 * waveformProgress.progress).toInt()}');
      if (waveformProgress.waveform != null) {
        setState(() {
          _waveform = waveformProgress.waveform;
        });
      }
    }, onError: (e) {
      print("Waveform extraction error: $e");
    });

    await audioPlayer.play(AssetSource(fileName));
    print("playing.....done $audioPath");
  } catch (e) {
    print("error audioPlayer: $e");
  }
}


  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('Audio Waveform Example'),
      ),
      body: Padding(
        padding: const EdgeInsets.all(16.0),
        child: Column(
          children: [
            ElevatedButton(
              onPressed: () => playAudio('msg2.mp3'),
              child: Text('Play Audio'),
            ),
            SizedBox(height: 20),
            Container(
              height: 150.0,
              decoration: BoxDecoration(
                color: Colors.grey.shade200,
                borderRadius: const BorderRadius.all(Radius.circular(20.0)),
              ),
              padding: const EdgeInsets.all(16.0),
              width: double.maxFinite,
              child: StreamBuilder<WaveformProgress>(
                stream: progressStream.stream,
                builder: (context, snapshot) {
                  if (snapshot.hasError) {
                    return Center(
                      child: Text(
                        'Error: ${snapshot.error}',
                        style: Theme.of(context).textTheme.titleLarge,
                        textAlign: TextAlign.center,
                      ),
                    );
                  }
                  final progress = snapshot.data?.progress ?? 0.0;
                  final waveform = snapshot.data?.waveform;
                  if (waveform == null) {
                    return Center(
                      child: Text(
                        '${(100 * progress).toInt()}%',
                        style: Theme.of(context).textTheme.titleLarge,
                      ),
                    );
                  }
                  return AudioWaveformWidget(
                    waveform: waveform,
                    start: Duration.zero,
                    duration: waveform.duration,
                    position: playingPosition,
                    waveColor: Colors.grey,
                    playedColor: Colors.blue,
                  );
                },
              ),
            )
          ],
        ),
      ),
    );
  }
}

class AudioWaveformWidget extends StatefulWidget {
  final Color waveColor;
  final Color playedColor;
  final double scale;
  final double strokeWidth;
  final double pixelsPerStep;
  final Waveform waveform;
  final Duration start;
  final Duration duration;
  final Duration position;

  const AudioWaveformWidget({
    Key? key,
    required this.waveform,
    required this.start,
    required this.duration,
    required this.position,
    this.waveColor = Colors.grey,
    this.playedColor = Colors.blue,
    this.scale = 1.0,
    this.strokeWidth = 5.0,
    this.pixelsPerStep = 8.0,
  }) : super(key: key);

  @override
  _AudioWaveformState createState() => _AudioWaveformState();
}

class _AudioWaveformState extends State<AudioWaveformWidget> {
  @override
  Widget build(BuildContext context) {
    return ClipRect(
      child: CustomPaint(
        painter: AudioWaveformPainter(
          waveColor: widget.waveColor,
          playedColor: widget.playedColor,
          waveform: widget.waveform,
          start: widget.start,
          duration: widget.duration,
          position: widget.position,
          scale: widget.scale,
          strokeWidth: widget.strokeWidth,
          pixelsPerStep: widget.pixelsPerStep,
        ),
      ),
    );
  }
}

class AudioWaveformPainter extends CustomPainter {
  final double scale;
  final double strokeWidth;
  final double pixelsPerStep;
  final Paint wavePaint;
  final Paint playedPaint;
  final Waveform waveform;
  final Duration start;
  final Duration duration;
  final Duration position;

  AudioWaveformPainter({
    required this.waveform,
    required this.start,
    required this.duration,
    required this.position,
    Color waveColor = Colors.grey,
    Color playedColor = Colors.blue,
    this.scale = 1.0,
    this.strokeWidth = 5.0,
    this.pixelsPerStep = 8.0,
  })  : wavePaint = Paint()
          ..style = PaintingStyle.stroke
          ..strokeWidth = strokeWidth
          ..strokeCap = StrokeCap.round
          ..color = waveColor,
        playedPaint = Paint()
          ..style = PaintingStyle.stroke
          ..strokeWidth = strokeWidth
          ..strokeCap = StrokeCap.round
          ..color = playedColor;

  @override
  void paint(Canvas canvas, Size size) {
    if (duration == Duration.zero) return;

    double width = size.width;
    double height = size.height;

    final waveformPixelsPerWindow = waveform.positionToPixel(duration).toInt();
    final waveformPixelsPerDevicePixel = waveformPixelsPerWindow / width;
    final waveformPixelsPerStep = waveformPixelsPerDevicePixel * pixelsPerStep;
    final sampleOffset = waveform.positionToPixel(start);
    final sampleStart = -sampleOffset % waveformPixelsPerStep;
    final playProgressPixel = waveform.positionToPixel(position);

    for (var i = sampleStart.toDouble(); i <= waveformPixelsPerWindow + 1.0; i += waveformPixelsPerStep) {
      final sampleIdx = (sampleOffset + i).toInt();
      final x = i / waveformPixelsPerDevicePixel;
      final minY = normalise(waveform.getPixelMin(sampleIdx), height);
      final maxY = normalise(waveform.getPixelMax(sampleIdx), height);

      final paint = (i <= playProgressPixel) ? playedPaint : wavePaint;

      canvas.drawLine(
        Offset(x + strokeWidth / 2, max(strokeWidth * 0.75, minY)),
        Offset(x + strokeWidth / 2, min(height - strokeWidth * 0.75, maxY)),
        paint,
      );
    }
  }

  @override
  bool shouldRepaint(covariant AudioWaveformPainter oldDelegate) {
    return true;
  }

  double normalise(int s, double height) {
    if (waveform.flags == 0) {
      final y = 32768 + (scale * s).clamp(-32768.0, 32767.0).toDouble();
      return height - 1 - y * height / 65536;
    } else {
      final y = 128 + (scale * s).clamp(-128.0, 127.0).toDouble();
      return height - 1 - y * height / 256;
    }
  }
}

Result: (screenshot attached in the original issue)
Audio sample:
https://www.mediafire.com/file/ntfe614cak6cj3m/msg2.mp3/file

If you listen to the audio and follow the wave, you will notice they do not match; some clearly audible voices are represented by just a dot.

Did I make some mistakes?

Support 8 bit resolution

When parsing the waveform data, we should do this:

    final data = flags == 0
        ? Int16List.view(bytes, headerLength ~/ 2)
        : Int8List.view(bytes, headerLength);

When painting the data, we should do this:

  double normalise(int s, double height) {
    if (waveform.flags == 0) {
      final y = 32768 + (scale * s).clamp(-32768.0, 32767.0).toDouble();
      return height - 1 - y * height / 65536;
    } else {
      final y = 128 + (scale * s).clamp(-128.0, 127.0).toDouble();
      return height - 1 - y * height / 256;
    }
  }
