On December 10, 2025, a thread on r/indiaworks revealed that a Pune-based ITES company had started requiring its remote employees to complete daily biometric face scans via a third-party app for attendance purposes. The employees reported frequent failed verifications, scanner glitches, and warnings that "non-compliant attendance" would affect their payroll. The policy was implemented abruptly without consultation, and several employees raised privacy concerns, citing the vendor's terms which allowed extensive data storage rights. As the post gained traction, numerous workers from other companies shared similar experiences, igniting a debate around surveillance and digital dignity.
The emotional response from employees has been a mixture of resentment, anxiety, and fear. Remote workers said the biometric requirement made them feel distrusted, treated like potential absentees rather than responsible adults. Some reported waking up stressed about whether the app would accurately capture their face. HR teams are caught between productivity demands and privacy concerns, aware that the policy's abrupt rollout without explanation has severely damaged morale. Managers, on the other hand, feel defensive, claiming that attendance abuse was increasing and they had no other options. For many employees, this situation has become symbolic of a broader erosion of autonomy and respect.
From a legal perspective, the use of biometric data intersects with the Digital Personal Data Protection Act, 2023, which mandates informed consent, purpose limitation, and secure processing of sensitive personal data. Forcing biometric scans without clear consent or transparency could expose employers to penalties. HR must immediately revisit the attendance policy, scrutinize vendor contracts for data protection clauses, and ensure employees have clear opt-in mechanisms. Companies also need to assess whether less intrusive methods—like self-attestation or time-stamped logs—would suffice. Leadership should treat this as a serious compliance and culture moment, recognizing that surveillance without guardrails can erode trust and create significant legal exposure.
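To make the less intrusive alternative concrete, here is a minimal sketch of self-attestation with time-stamped logs. Everything in it is illustrative: the function name, field names, and in-memory log are assumptions, not any vendor's actual schema. The point is that a trust-based record needs only an employee identifier and a UTC timestamp, with no biometric data collected at all.

```python
from datetime import datetime, timezone

# In-memory log for illustration only; a real system would persist
# entries to a database with access controls and retention limits.
attendance_log: list[dict] = []

def attest_presence(employee_id: str, note: str = "") -> dict:
    """Record a self-attested presence entry with a UTC timestamp.

    Hypothetical example: captures who attested and when, nothing more,
    in line with the purpose-limitation principle discussed above.
    """
    entry = {
        "employee_id": employee_id,
        "attested_at": datetime.now(timezone.utc).isoformat(),
        "note": note,
    }
    attendance_log.append(entry)
    return entry
```

A team could pair such a log with ordinary manager review of work output, reserving any stronger verification for documented, individual cases rather than blanket surveillance.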
Two questions follow: Should remote teams rely on biometrics for attendance at all? And how can HR maintain accountability in remote settings without destroying trust?