I am attempting to make a consistent firing rate.
I have done a lot of research on getting firing rates that are independent of framerate, and most of the articles about it use deltaTime. However, most timer examples do not. I have a timer that works as long as the interval is not less than 0.2. Any interval smaller than that gives inconsistent results depending on the framerate, specifically when I play the project in a maximized window versus a small window.
Here is the code I am using. I know this has been beaten to death, so forgive me, please. I assume I must use some form of deltaTime adjustment somewhere; I have put a rough sketch of what I mean right after the code.
using UnityEngine;
using System.Collections;

public class Shooter : MonoBehaviour {

    // public GameObject bullet;
    // public Transform cam;
    public float interval;
    public float lastFire = -9999;
    public int shotsFired;

    void Update ()
    {
        if (Time.time < 5)
        {
            if (CanFire())
            {
                lastFire = Time.time;
                // Instantiate(bullet, cam.position, cam.rotation);
                shotsFired++;
            }
        }
    }

    bool CanFire()
    {
        return Time.time >= lastFire + interval;
    }
}
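From my reading, I imagine a deltaTime-based version would look something like the sketch below, where leftover time is carried between frames in an accumulator. This is only my guess at the approach, not something taken from the articles, the class name is just a placeholder, and I have not verified that it actually behaves correctly:

using UnityEngine;

// Rough sketch only: accumulate deltaTime and fire one shot for every full
// interval that has elapsed, so intervals shorter than a frame still add up.
public class AccumulatorShooter : MonoBehaviour {

    public float interval = 0.1f;
    public int shotsFired;

    float accumulator; // time carried over from previous frames

    void Update ()
    {
        if (Time.time < 5)
        {
            accumulator += Time.deltaTime;

            // A slow frame can contain several intervals, so fire once per
            // elapsed interval rather than once per frame.
            while (accumulator >= interval)
            {
                accumulator -= interval;
                // Instantiate(bullet, cam.position, cam.rotation);
                shotsFired++;
            }
        }
    }
}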
I have commented out the bullet references so the script can be tested easily. The reason I know it is inconsistent is that I ran a test lasting a fixed amount of time (the Time.time < 5 check above) and counted shotsFired, and the count differs between the maximized window and a small one. I realize my test may be faulty and could be the actual problem; if so, please let me know. A rough sketch of the comparison I am doing follows below.
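To make the test concrete, the comparison I have in mind is roughly the following (a simplified sketch of a method added to the Shooter class above, not my exact code; the five-second window matches the check in Update):

// Sketch of the check: after the five-second window, compare the shots
// actually fired against how many intervals fit into five seconds.
void OnGUI ()
{
    if (Time.time >= 5)
    {
        int expected = Mathf.FloorToInt(5f / interval);
        GUILayout.Label("Fired " + shotsFired + " shots, expected about " + expected);
    }
}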
I really appreciate any and all help.
Thanks, and God bless!
Howey