I tried to implement a generator using the tf.keras.utils.Sequence approach, following this GitHub page:
https://mahmoudyusof.github.io/facial-keypoint-detection/data-generator/
So my Generator has the form:
class Generator(tf.keras.utils.Sequence):
    def __init__(self, *args, **kwargs):
        self.on_epoch_end()

    def on_epoch_end(self):
        # shuffle indices for the batches
        ...

    def __len__(self):
        ...

    def __getitem__(self, idx):
        # return the idx-th batch of the shuffled dataset
        return X, y
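To make that more concrete, my class follows the same structure as in the tutorial; a simplified version looks roughly like this (the actual file loading is reduced to a placeholder helper _load_batch here, and the field names are illustrative):

import numpy as np
import tensorflow as tf

class Generator(tf.keras.utils.Sequence):
    def __init__(self, files, batch_size=64, shuffle=True):
        self.files = files
        self.batch_size = batch_size
        self.shuffle = shuffle
        self.indices = np.arange(len(files))
        self.on_epoch_end()

    def on_epoch_end(self):
        # reshuffle the sample indices after every epoch
        if self.shuffle:
            np.random.shuffle(self.indices)

    def __len__(self):
        # number of batches per epoch
        return int(np.ceil(len(self.files) / self.batch_size))

    def __getitem__(self, idx):
        # return the idx-th batch of the shuffled dataset
        batch_ids = self.indices[idx * self.batch_size:(idx + 1) * self.batch_size]
        X, y = self._load_batch(batch_ids)  # placeholder for the real file-loading code
        return X, y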
Unfortunately, training my model became very slow with this generator, so I wanted to add prefetching.
I tried
Train_Generator = tf.data.Dataset.from_generator(
    Generator(Training_Files, batch_size=64, shuffle=True),
    output_types=(np.array, np.array)
)
to convert the generator into a dataset type that supports prefetching, but I got the error message:
`generator` must be callable.
I know that for this to work the generator must support the iterator protocol, but how can I implement that?
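My best guess so far is to wrap the Sequence in a callable and pass concrete TensorFlow dtypes instead of np.array, roughly like this (the float32 dtypes are just a guess for my data, and I have not verified that this is the intended way):

gen = Generator(Training_Files, batch_size=64, shuffle=True)

Train_Dataset = tf.data.Dataset.from_generator(
    lambda: (gen[i] for i in range(len(gen))),  # callable that yields one (X, y) batch at a time
    output_types=(tf.float32, tf.float32)       # actual dtypes of X and y, not np.array
)
Train_Dataset = Train_Dataset.prefetch(tf.data.AUTOTUNE)  # tf.data.experimental.AUTOTUNE on older TF versions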
Or do you know of other methods to improve the performance of these kinds of generators?
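One alternative I came across but have not tried yet is the multiprocessing options of model.fit, which are supposed to let several workers prepare batches in a background queue (the worker and queue numbers here are arbitrary, and model stands for my compiled Keras model):

model.fit(
    Generator(Training_Files, batch_size=64, shuffle=True),
    epochs=10,
    workers=4,                  # number of workers preparing batches in parallel
    use_multiprocessing=True,   # run the Sequence in separate processes
    max_queue_size=16           # how many batches to keep prefetched in the queue
)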
Thanks in advance!