79777664

Date: 2025-09-29 00:24:26
Score: 2
Natty:
Report link

Data leakage is never good when evaluating models: samples that appear in validation/testing should not also be present in training. I examined the dataset on Kaggle, and it is reasonable to assume that different individuals produce distinct signal frequencies even when performing the same gesture. Could z-score normalization be applied per channel, per subject, to remove the subject variance (see the sketch below)? This could reduce subject bias and prevent your models from learning subject-specific patterns instead of gesture-specific patterns. Additionally, verify that you have a balanced class distribution within your training data.
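A minimal sketch of the per-subject, per-channel z-score normalization being suggested, assuming the recordings live in a pandas DataFrame with a "subject" column, a "gesture" label column, and one column per signal channel (the column names here are hypothetical, not taken from the Kaggle dataset):

```python
import pandas as pd

def zscore_per_subject(df: pd.DataFrame, channel_cols: list[str]) -> pd.DataFrame:
    """Standardize each channel within each subject so models cannot rely on
    subject-specific offsets or amplitudes instead of gesture patterns."""
    out = df.copy()
    grouped = out.groupby("subject")[channel_cols]
    # transform() keeps the original row order while computing per-subject stats
    out[channel_cols] = (out[channel_cols] - grouped.transform("mean")) / (
        grouped.transform("std") + 1e-8  # avoid division by zero on flat channels
    )
    return out

# Example usage (assumed column names):
# train_df = zscore_per_subject(train_df, ["ch_0", "ch_1", "ch_2", "ch_3"])
# Quick check of class balance in the training split:
# print(train_df["gesture"].value_counts(normalize=True))
```

Note that the statistics are computed within each subject, so the split into train/validation should also be done by subject (e.g. scikit-learn's GroupKFold) to avoid the leakage described above.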

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Single line (0.5):
  • Low reputation (1):
Posted by: JordanDotPy