I am trying to use the FDTD solver for a low-frequency simulation, as I am interested in testing different waveforms. However, when I use the solver with low frequencies, the estimated runtime is far too long (50k+ hours) compared to a few minutes with high frequencies. I was wondering if there is a way to increase the timestep size, and thus reduce the number of timesteps needed.
As FDTD is a time-domain method, implementing LF sources with it is quite challenging. It would require a very (very) long simulation time, with no guarantee of reaching the desired convergence level at the end, especially if your simulation contains a large number of MCells.
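A quick back-of-the-envelope calculation shows why: the FDTD timestep is pinned by the CFL stability limit (set by the cell size, not the source frequency), while the number of timesteps scales with the source period. A minimal sketch, assuming a uniform 3D grid with a hypothetical 1 mm cell size:

```python
import numpy as np

c0 = 299_792_458.0  # speed of light in vacuum [m/s]
dx = 1e-3           # assumed cell size [m] (hypothetical, for illustration)

# CFL stability limit for a uniform 3D grid: dt <= dx / (c0 * sqrt(3))
dt = dx / (c0 * np.sqrt(3))

def steps_per_period(f_hz: float) -> float:
    """Timesteps needed to simulate one period of a sinusoidal source."""
    return (1.0 / f_hz) / dt

print(f"dt = {dt:.3e} s")
print(f"1 GHz source: {steps_per_period(1e9):.0f} steps per period")
print(f"50 Hz source: {steps_per_period(50):.2e} steps per period")
```

With these numbers a 1 GHz source needs on the order of 500 timesteps per period, while a 50 Hz source needs on the order of 10^10 — which is where runtimes like 50k+ hours come from.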
For input signals with frequency components below the MHz range, an LF (quasi-static) approach is the only solution that will give you quick and accurate results. I would therefore advise you to (1) record the current profile over time, and (2) perform an FFT to obtain the current magnitude at each frequency component. Then, (3) run a separate LF simulation for each frequency/magnitude pair obtained in step (2).
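Step (2) can be sketched as follows — a minimal example, assuming a hypothetical recorded current made of a 50 Hz fundamental plus a 3rd harmonic, sampled at 10 kHz for 1 s:

```python
import numpy as np

# Assumed test waveform (stand-in for your recorded current profile):
# 2 A at 50 Hz plus 0.5 A at 150 Hz (3rd harmonic)
fs = 10_000.0                       # sampling rate [Hz]
t = np.arange(0, 1.0, 1.0 / fs)
i_t = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 150 * t)

# FFT, normalised so each bin magnitude equals the sinusoid amplitude
spectrum = np.fft.rfft(i_t)
freqs = np.fft.rfftfreq(len(i_t), d=1.0 / fs)
amplitudes = 2.0 * np.abs(spectrum) / len(i_t)

# Keep only the significant components (threshold is arbitrary here);
# each (frequency, magnitude) pair becomes one LF simulation in step (3)
significant = amplitudes > 0.01
for f, a in zip(freqs[significant], amplitudes[significant]):
    print(f"run LF simulation at {f:.0f} Hz with source magnitude {a:.3f} A")
```

The 2/N normalisation converts raw FFT bin magnitudes into physical sinusoid amplitudes, so the values you feed to each LF run are directly the source magnitudes. Since the LF problem is linear at each frequency, the individual results can be superposed afterwards to reconstruct the response to the full waveform.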