The Web Audio API is implemented in most modern browsers. Notably, Safari does NOT fully support it yet. My understanding is that Electron can be used to wrap a web app so it becomes a native app on various platforms.
Does it also bundle a specific web browser (one you can specify) to ensure the app works the same on all platforms?
What I'm wondering is whether a web app that doesn't work in Safari, say, will run on Apple devices that use Safari by default. Of course, you can install another browser. But if you don't, will the Electron-wrapped app still run?
In other words, will the Electron app depend on anything outside of itself to run? Or is it essentially self-contained?
It uses & deploys the Chromium rendering engine, so its feature set should be the same everywhere.
Some information I found while researching this topic:
- iOS and Android are not covered by Electron, but you can use Apache Cordova or Ionic's Capacitor instead.
- On iOS, Apple only allows the WebKit browser engine, so Chrome etc. are just a different GUI; under the hood it's always Safari.
- I make some (very limited) use of Web Audio in my current project, and it works under iOS. I only use TJSAudioContext, createOscillator(), createGain(), etc., to play some simple (sine-wave) sounds. There is a limitation: Safari only allows sounds to be played in reaction to a user interaction, i.e. in a mouse/keyboard event procedure.
BUT if you package your project with Cordova, this limitation apparently no longer exists! As a workaround, and to avoid relying on Cordova, I use the following trick: in the first GUI event I create the audio context and play an empty WAV file. After that, audio output also works outside of GUI event procedures.
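The "unlock on first user gesture" trick described above might look roughly like this in plain JavaScript. This is only a sketch: the `unlockAudio` name is my own, and I use a one-sample silent buffer instead of an empty WAV file, which should have the same effect.

```javascript
// Sketch of the unlock-on-first-gesture workaround (names are illustrative).
let audioCtx = null;

function unlockAudio() {
  if (audioCtx) return audioCtx;                 // already unlocked
  const Ctx = window.AudioContext || window.webkitAudioContext;
  audioCtx = new Ctx();
  // Play a short silent buffer inside the gesture handler so Safari
  // marks the context as user-activated; later playback then also
  // works outside event handlers.
  const buffer = audioCtx.createBuffer(1, 1, 22050); // 1 silent sample
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start(0);
  return audioCtx;
}

// Call once from any user gesture, e.g.:
// document.addEventListener('pointerdown', unlockAudio, { once: true });
```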
Thanks for the info. I need this to play back audio files / streams, not oscillator tones. I was thinking that going with a web app meant I don't have to create separate apps for each platform since I read that Electron lets you retarget for pretty much everything.
I had a prototype working in Windows, but nobody wants that any more. They want to use their mobile devices. So it's either a web app or native for Android and iOS.
But if the Web Audio API doesn't run on WebKit ... why is there a need to do this:
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
Actually, I've found Web Audio API examples and the ones that generate tones usually work in Safari, but the ones that play sound files don't. Maybe it's how they're configured? Still searching around...
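One possible explanation (an assumption on my part, not something confirmed in this thread) is that Safari's prefixed webkitAudioContext historically only accepted the callback form of decodeAudioData rather than the promise form, so file-playing examples written promise-style fail while oscillator examples work. A hedged sketch that uses the callback form wrapped in a promise (playUrl is an illustrative name, not from any post here):

```javascript
// Fetch an audio file, decode it with the callback form of decodeAudioData
// (works in both modern browsers and prefixed Safari), then play it.
function playUrl(ctx, url) {
  return fetch(url)
    .then((resp) => resp.arrayBuffer())
    .then((data) => new Promise((resolve, reject) => {
      // Callback form instead of the promise form for older Safari.
      ctx.decodeAudioData(data, resolve, reject);
    }))
    .then((decoded) => {
      const source = ctx.createBufferSource();
      source.buffer = decoded;
      source.connect(ctx.destination);
      source.start(0);
      return source;
    });
}
```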
CrossOver now supports targeting Chrome OS on top of a Linux option that can be enabled (which they say covers everything made in 2019 and forward). That seems to mainly include Chromebooks, not tablets and phones.
Not sure if iOS is in their future or not. It's based on WINE.
UPDATE: it only runs on devices that have x86 support.
I have some simple .wav file output in a web app, and that also works, both in iOS and macOS (in Safari).
Here I use the TWebMultimediaPlayer component from TMS Web Core.
So I would suggest debugging in the browser console; this often shows helpful information.
a) as mentioned above, sound output only works in GUI event procedures
b) not every platform supports all file formats
I imagine that all browsers will play an HTML5 AUDIO tag properly. I could embed the media in an HTML page that's perhaps hidden. Would it then be possible to control the playback and volume of each one via Web Core code?
I need three or four channels that play independently. Each channel could be a single AUDIO tag, assuming they can play simultaneously and be controlled individually. Two channels will be longer (3-5 min) tracks that would loop if needed. A third track could be composed of several AUDIO tags, each of which is 2-3 mins long and they'd play sequentially.
The user would see a level control for adjusting the volume of each channel.
I'd also like a Master level control that adjusts everything else proportionally.
This approach would ensure everything would sound the same regardless of browser, assuming they support the AUDIO tags.
Do you think this approach would work?
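For what it's worth, the per-channel plus master mixing described above could be sketched like this. The element ids (ch1, ch2, ch3) and the helper names are assumptions for illustration only; the clamped multiply in effectiveVolume is one way to make the master control scale everything proportionally.

```javascript
// Pure helper: per-channel level scaled proportionally by the master level.
// Both levels are 0..1; clamp so HTMLMediaElement.volume never goes out of range.
function effectiveVolume(channelLevel, masterLevel) {
  return Math.min(1, Math.max(0, channelLevel * masterLevel));
}

// Browser-side wiring, guarded so the pure helper above is testable elsewhere.
if (typeof document !== 'undefined') {
  // Hidden tags such as <audio id="ch1" src="track1.mp3" loop></audio> assumed.
  const channels = ['ch1', 'ch2', 'ch3'].map((id) => ({
    el: document.getElementById(id),
    level: 1.0,
  }));
  let master = 1.0;

  function applyVolumes() {
    for (const ch of channels) {
      ch.el.volume = effectiveVolume(ch.level, master);
    }
  }

  function setChannelLevel(i, level) { channels[i].level = level; applyVolumes(); }
  function setMasterLevel(level)     { master = level; applyVolumes(); }
}
```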
BTW, I have a simple app with Web Core that uses the Web Audio API to create a single Oscillator. It has a grid that acts like a piano keyboard and lets me play notes. It works fine in Chrome on my Mac Mini, but in Safari it always plays the same note, which seems to be the first note you assign after starting the Oscillator.
Ditto when testing it on my iPad Mini 5.
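One hedged guess, not verified on those devices: older Safari is known to be picky about direct writes to oscillator.frequency.value after start(), and scheduling the change with setValueAtTime is the portable way to change pitch. A minimal sketch (playNote is an illustrative name):

```javascript
// Change an already-running oscillator's pitch by scheduling the new
// frequency at the current time, instead of assigning frequency.value.
function playNote(ctx, osc, freqHz) {
  osc.frequency.setValueAtTime(freqHz, ctx.currentTime);
}
```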
This is just for testing; I'll need to play audio files (MP3 and FLAC) in my app, not tones.
Unfortunately I didn't dig that deep into Web Audio usage, so currently I cannot predict whether this will work...