About Me

Hey, I’m Seyfal! I’m a Research Engineer at Nesa and a Research Assistant at the Electronic Visualization Laboratory (EVL).

At Nesa, I work on scalable, privacy-preserving inference, developing architectures that integrate MPC, homomorphic encryption, and function secret sharing for decentralized compute networks. At EVL, supervised by Professor Mike Papka, I research energy efficiency in deep learning and optimize workloads for Intel GPU clusters, collaborating across the lab to adapt experiments to Intel’s architecture.

My research interests include: (i) cryptographically secure inference with minimal communication overhead and no trusted parties, (ii) decentralized training architectures where autonomous model components can evolve independently across heterogeneous hardware without requiring high-bandwidth communication, and (iii) data-centric optimizations through unsupervised preprocessing and representation learning that improve scaling efficiency.

Previously, I applied autoencoders to physics microscopy data and developed knowledge distillation methods for knowledge graphs.



News