
[BUG]: problems with recording and playing audio in iOS #1137

Open
ElvinSadikhov opened this issue Dec 21, 2024 · 3 comments

ElvinSadikhov commented Dec 21, 2024

Hi! I am trying to build a voice call feature. The controller I have created is linked to web socket logic.

First, the user speaks and I stream their voice to the web socket. After some time the web socket answers with chunks of audio, which I put into a queue. I play these audio chunks from the queue and then start listening to the user again.

On iOS only, I had an issue where audio was not being recorded at all, but I saw that audio_session needs to be applied here. I did that, but now this code works only the first time: it records audio and plays audio from the queue once, and after that both recording and playback show severe problems, especially playback. I tried to solve it by configuring audio_session everywhere, but with no result. My code is attached below; if you need any other info, please let me know.

Thank you in advance!


// ignore_for_file: avoid_function_literals_in_foreach_calls

import 'dart:async';
import 'dart:collection';
import 'dart:io';
import 'dart:typed_data';
import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:flutter_sound/flutter_sound.dart';
import 'package:logger/logger.dart';
import 'package:permission_handler/permission_handler.dart';

class LiveAudioController {
  final Duration chunkInterval;
  void Function(Uint8List chunk)? onRecoredChunkReceived;
  VoidCallback? onRecordingStarted;

  LiveAudioController({
    this.chunkInterval = const Duration(milliseconds: 250),
    this.onRecoredChunkReceived,
    this.onRecordingStarted,
  });

  late final FlutterSoundRecorder _recorder;
  late final FlutterSoundPlayer _player;
  late final StreamController<Uint8List> _audioStreamCntr;
  StreamSubscription<Uint8List>? _audioStreamSub;
  late final Queue<Uint8List> _playerChunksQueue;
  late final List<Uint8List> _accumulatedChunks;
  Timer? _intervalTimer;
  bool _isPlayerStarted = false;
  bool _isPlayerActivelyPlaying = false; //!

  Future<bool> init() async {
    final isOkay = await _checkPermissions();
    if (!isOkay) return false;

    _recorder = FlutterSoundRecorder(logLevel: Level.off);
    _player = FlutterSoundPlayer(logLevel: Level.off);
    _audioStreamCntr = StreamController<Uint8List>.broadcast();
    _playerChunksQueue = Queue<Uint8List>();
    _accumulatedChunks = [];

    final session = await AudioSession.instance;
    await session.configure(AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
          AVAudioSessionCategoryOptions.allowBluetooth |
              AVAudioSessionCategoryOptions.defaultToSpeaker,
      avAudioSessionMode: AVAudioSessionMode.spokenAudio,
      avAudioSessionRouteSharingPolicy:
          AVAudioSessionRouteSharingPolicy.defaultPolicy,
      avAudioSessionSetActiveOptions: AVAudioSessionSetActiveOptions.none,
      androidAudioAttributes: const AndroidAudioAttributes(
        contentType: AndroidAudioContentType.speech,
        flags: AndroidAudioFlags.none,
        usage: AndroidAudioUsage.voiceCommunication,
      ),
      androidAudioFocusGainType: AndroidAudioFocusGainType.gain,
      androidWillPauseWhenDucked: true,
    ));
    await session.setActive(true);

    await _recorder.openRecorder();
    await _player.openPlayer();

    return true;
  }

  Future<void> dispose() async {
    [
      _recorder.closeRecorder,
      _player.closePlayer,
      _audioStreamCntr.close,
      _intervalTimer?.cancel,
      _audioStreamSub?.cancel,
      _playerChunksQueue.clear,
      // () async => await audioSession.setActive(false),
    ].forEach((func) {
      try {
        func?.call();
      } catch (_) {}
    });
  }

  Future<void> startRecording() async {
    if (_recorder.isRecording) return;

    if (Platform.isIOS) {
      final session = await AudioSession.instance;
      await session.configure(AudioSessionConfiguration(
        avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
        avAudioSessionCategoryOptions:
            AVAudioSessionCategoryOptions.allowBluetooth |
                AVAudioSessionCategoryOptions.defaultToSpeaker,
        avAudioSessionMode: AVAudioSessionMode.spokenAudio,
        avAudioSessionRouteSharingPolicy:
            AVAudioSessionRouteSharingPolicy.defaultPolicy,
        avAudioSessionSetActiveOptions: AVAudioSessionSetActiveOptions.none,
        androidAudioAttributes: const AndroidAudioAttributes(
          contentType: AndroidAudioContentType.speech,
          flags: AndroidAudioFlags.none,
          usage: AndroidAudioUsage.voiceCommunication,
        ),
        androidAudioFocusGainType: AndroidAudioFocusGainType.gain,
        androidWillPauseWhenDucked: true,
      ));
      await session.setActive(true);
    }

    await pauseRecording(); // just in case

    await _recorder.startRecorder(
      toStream: _audioStreamCntr.sink,
      codec: Codec.pcm16,
      sampleRate: 16000,
      numChannels: 1,
    );

    onRecordingStarted?.call();

    _audioStreamSub = _audioStreamCntr.stream.listen((chunk) {
      _accumulatedChunks.add(chunk);
    });

    _intervalTimer = Timer.periodic(chunkInterval, (_) {
      if (_accumulatedChunks.isNotEmpty) {
        final tempList = List.of(_accumulatedChunks);
        _accumulatedChunks.clear();

        final chunk =
            tempList.reduce((acc, el) => Uint8List.fromList([...acc, ...el]));
        onRecoredChunkReceived?.call(chunk);
      }
    });
  }

  Future<void> pauseRecording() async {
    _audioStreamSub?.cancel();
    _audioStreamSub = null;

    _intervalTimer?.cancel();
    _intervalTimer = null;

    _accumulatedChunks.clear();

    if (_recorder.isRecording) {
      try {
        await _recorder.pauseRecorder();
      } catch (_) {}
    }
  }

  Future<void> interruptPlayerAndStartRecording() async {
    await _player.stopPlayer();
    _playerChunksQueue.clear();
    _isPlayerStarted = false;
    _isPlayerActivelyPlaying = false;

    await startRecording();
  }

  Future<void> enqueueAudioChunk(Uint8List chunk) async {
    final wasEmpty = _playerChunksQueue.isEmpty;
    _playerChunksQueue.add(chunk);

    if (wasEmpty && !_isPlayerActivelyPlaying) {
      debugPrint('start processing player queue FIRST time');
      await pauseRecording();
      _processPlayerQueue();
    }
  }

  Future<void> stopAllProcesses() async {
    await pauseRecording();
    _playerChunksQueue.clear();
    _isPlayerStarted = false;
    _isPlayerActivelyPlaying = false;
  }

  Future<void> _processPlayerQueue() async {
    debugPrint('queue is empty? -> ${_playerChunksQueue.isEmpty}, '
        'is player playing? -> ${_player.isPlaying}');
    if (_playerChunksQueue.isEmpty || _isPlayerActivelyPlaying) return;

    _isPlayerActivelyPlaying = true;

    await _maybeStartPlayer();

    debugPrint('processing next chunk');
    final Uint8List nextAudio = _playerChunksQueue.removeFirst();
    await _player.feedFromStream(nextAudio);

    _isPlayerActivelyPlaying = false;

    debugPrint('has more chunks? -> ${_playerChunksQueue.isNotEmpty}');
    if (_playerChunksQueue.isNotEmpty) {
      _processPlayerQueue();
    } else {
      startRecording();
    }
  }

  FutureOr<void> _maybeStartPlayer() async {
    if (_isPlayerStarted) return;

    debugPrint('starting player');

    if (Platform.isIOS) {
      final session = await AudioSession.instance;
      await session.configure(AudioSessionConfiguration(
        avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
        avAudioSessionCategoryOptions:
            AVAudioSessionCategoryOptions.allowBluetooth |
                AVAudioSessionCategoryOptions.defaultToSpeaker,
        avAudioSessionMode: AVAudioSessionMode.spokenAudio,
        avAudioSessionRouteSharingPolicy:
            AVAudioSessionRouteSharingPolicy.defaultPolicy,
        avAudioSessionSetActiveOptions: AVAudioSessionSetActiveOptions.none,
        androidAudioAttributes: const AndroidAudioAttributes(
          contentType: AndroidAudioContentType.speech,
          flags: AndroidAudioFlags.none,
          usage: AndroidAudioUsage.voiceCommunication,
        ),
        androidAudioFocusGainType: AndroidAudioFocusGainType.gain,
        androidWillPauseWhenDucked: true,
      ));
    }

    // todo: read sampleRate from ws response
    await _player.startPlayerFromStream(
      codec: Codec.pcm16,
      // sampleRate: 16000,
      sampleRate: 44100,
      numChannels: 1,
    );

    _isPlayerStarted = true;
  }

  Future<bool> _checkPermissions() async {
    final permission = await Permission.microphone.request();
    return permission.isGranted;
  }
}

ElvinSadikhov (Author) commented:

logs: logs.docx

ElvinSadikhov (Author) commented:

@Larpoux hello. Can you please assist me with this one? I would be very grateful.

Larpoux (Collaborator) commented Dec 23, 2024

I just took a quick look over your code.
Some remarks:

  • You only need to open an audio session once. During initState() is a good place to do that.
  • If you really want to close your audio session, do it when you have finished working with audio. A good place to do that is during dispose().
  • Open the audio session on Android too, even if it is less important there than on iOS.
  • Do not start/stop your player or recorder too quickly: it is probably better to keep your player started and just pause it while you are recording (see the sketch after this list).
  • Do not create a new player during your processing. Keep working with the same player instance.
  • Check the way you sequence everything. In the logs it seems that you close your player during startPlayer(), before the start has completed.