Updated issue: @zibelas helped me fix the errors in my networking code, but the original issue still remains.

Before I added Mirror or any networking component to the project this worked fine in WebGL, but now the audio does not behave the same. I tried it in both Google Chrome and Microsoft Edge. The clip plays/unpauses when I press the space/enter keys and it pauses, but after the first two lines (which play correctly) it starts to skip ahead a few seconds every time I unpause it. I used a WebGL development build to check that `AudioSource.time` is correct before unpausing, but it still plays the wrong part of the audio clip. It almost seems like the clip keeps running even after it has been paused, because the longer I wait, the further into the clip it plays. It is not a direct second-for-second relationship, though: even when I play lines in quick succession, each one skips further ahead than the last.

In this networking case, are there two different AudioSources (one for the server and one for the client)? If so, how could the debug log show the correct `AudioSource.time` (in both the Unity editor and the browser) and yet the clip unpause at a different position? Is the client's AudioSource running in the background, so its time keeps increasing even though the audio has been paused? It doesn't make sense to me, but I am quite new to Unity, and especially to Mirror and network programming in general. Could this be a WebGL issue? I don't think it is solely a WebGL issue, since this part worked in WebGL before I added any networking component to the project.
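To check whether the host's AudioSource and the browser client's AudioSource drift apart, I was considering logging the local playback position on both sides every time an RPC arrives. This is an untested debugging sketch (the `logPosition` method and its label parameter are mine, not part of the project code):

```csharp
// Untested debugging sketch: each game instance (host and browser client) runs its
// own copy of this RPC, so the log shows both playback positions side by side.
[ClientRpc]
void logPosition(string label)
{
    // isServer/isClient identify which instance this copy of the script is running on.
    Debug.Log($"{label} | isServer={isServer} isClient={isClient} " +
              $"time={soundFX.time:F2} isPlaying={soundFX.isPlaying}");
}
```

If the two instances report different `time` values right before an unpause, that would confirm the client's AudioSource is advancing while "paused".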

When I build it as a Windows project instead of WebGL, it works perfectly, so I am unsure whether the problem is in my networking code. Could it be an issue specifically caused by combining Mirror (or networking in general) with a WebGL build?

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Mirror;

public class SoundTracker : NetworkBehaviour
{
    public AudioSource soundFX;
    public int count = 0;

    // Update is called once per frame
    void Update()
    {
        if (isServer != true)
            return;

        if (Input.GetKeyDown(KeyCode.Space))
        {
            count = 0;
            startSound();
            StartCoroutine(SoundWaiter(33.5f));
        }
        if (Input.GetKeyDown(KeyCode.Return))
        {
            float[] lineDurations = { 3.2f, 4.75f, 4.5f, 5f, 4f, 5.5f, 5.5f, 3.8f, 8.6f, 2.9f };
            playSound();
            StartCoroutine(SoundWaiter(lineDurations[count]));
            count = count + 1;
        }
    }

    [Server]
    IEnumerator SoundWaiter(float duration)
    {
        yield return new WaitForSeconds(duration);
        pauseSound();
    }

    [ClientRpc]
    void startSound()
    {
        soundFX.Play();
    }

    [ClientRpc]
    void playSound()
    {
        soundFX.UnPause();
    }

    [ClientRpc]
    void pauseSound()
    {
        soundFX.Pause();
    }
}
```
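One workaround I am considering, in case the WebGL AudioSource really does keep advancing while paused: instead of trusting Pause/UnPause to preserve the position, the server could send the absolute position each line should start at, and every client would seek there before resuming. This is an untested sketch; `playSoundAt`, `PauseAfter`, and the `startTime` parameter are my own names, and the start times would be hypothetical values I would compute as cumulative sums of the line durations:

```csharp
// Untested sketch of a possible workaround: hard-seek before resuming, so any
// drift on the WebGL client is overwritten instead of accumulating.
[ClientRpc]
void playSoundAt(float startTime, float duration)
{
    soundFX.time = startTime;       // hard-resync: overrides whatever position the
    soundFX.UnPause();              // local AudioSource drifted to while "paused"
    StartCoroutine(PauseAfter(duration));
}

IEnumerator PauseAfter(float duration)
{
    yield return new WaitForSeconds(duration);
    soundFX.Pause();                // each instance pauses its own AudioSource locally
}
```

I don't know whether setting `AudioSource.time` behaves reliably on WebGL either, so this may just move the problem around.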

For background: I am creating a system in Unity using Mirror Networking which will only ever have one host/server and one client. It is a WebGL build, so the client connects in a browser. I want to send commands from the server to play audio clips on the client, without the client being able to control anything. The audio consists of questions an avatar asks the user. Due to another constraint (the Lip Sync setup) I have one AudioSource and one long audio clip containing all the questions, which starts playing when I press the space bar. So rather than playing each audio clip separately, the single clip is paused and unpaused: it pauses itself after a set time, when the current question line is finished, and then I press the enter key to unpause the AudioSource for the next question (again, each line pauses itself after a set time equal to the length of that individual question).
