TFLSTM-52
Model summary
TransformerLSTM trained on a shorter target length (540 rather than 1020), where 540 is the length of one SFTPRO1 cycle. Trained with causal_attention=True.
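A minimal sketch of the configuration change this entry describes, assuming hypothetical config field names (`target_length`, `causal_attention`); the actual training code's parameter names may differ:

```python
from dataclasses import dataclass


@dataclass
class TrainConfig:
    # Hypothetical field names for illustration only.
    # Target sequence length shortened from 1020 to 540,
    # so one target spans exactly one SFTPRO1 cycle.
    target_length: int = 540
    # Restrict attention to past timesteps only.
    causal_attention: bool = True


cfg = TrainConfig()
print(cfg.target_length, cfg.causal_attention)
```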