A crucial part of designing an upper-limb prosthesis is providing a way for the amputee to control its movement. We propose a novel approach to interpreting biometric data and converting it into hand movements. Our project is a finely tuned system consisting of a myographic armlet coupled with digital signal processing methods based on signal filtering, the fast Fourier transform, and pattern-recognizing neural networks. Calibrating the system before use requires substantial computing power; our tool of choice for selecting the neural network architecture and training it is the Microsoft Cognitive Toolkit running on the Microsoft Azure platform. On the hardware side, the armlet employs an STM32 microcontroller that receives and processes signals from an electromyographic sensor and an accelerometer module. The personalized patient data required for neural network operation is stored on a plug-in memory card, and the data used to calibrate the system is collected with a cross-platform GUI application. The main measure of capability for devices of this type is the number of gestures they support. Our prototype recognizes gestures with a 94.8% success rate, which allows it to reliably distinguish up to 9 different hand movements. This number surpasses what the world leaders in this field have to offer.
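The processing pipeline described above (filter the raw myographic signal, extract fast-Fourier-transform features, classify the feature vector into a gesture) can be sketched in NumPy as follows. This is only an illustrative stand-in, not the system's actual implementation: the sampling rate, the 20–450 Hz surface-EMG band, the band-energy feature layout, and the tiny softmax classifier (used here in place of the CNTK-trained network) are all assumptions, and the "EMG" windows are synthetic sinusoids rather than real sensor data.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 1000      # assumed sampling rate in Hz (not stated in the text)
WINDOW = 256   # assumed samples per analysis window


def bandpass(x, low=20.0, high=450.0, fs=FS):
    """Crude frequency-domain band-pass: zero FFT bins outside the
    typical surface-EMG band (20-450 Hz is a common convention)."""
    spec = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1.0 / fs)
    spec[(f < low) | (f > high)] = 0.0
    return np.fft.irfft(spec, n=len(x))


def features(x, n_bands=8):
    """Compact FFT feature vector: mean spectral magnitude of the
    filtered window in n_bands equal-width frequency bands."""
    mags = np.abs(np.fft.rfft(bandpass(x)))
    return np.array([b.mean() for b in np.array_split(mags, n_bands)])


def make_window(gesture):
    """Synthetic stand-in for an EMG window: each hypothetical 'gesture'
    has a distinct dominant frequency plus noise."""
    t = np.arange(WINDOW) / FS
    freq = 60.0 + 40.0 * gesture  # gesture 0 -> 60 Hz, gesture 1 -> 100 Hz
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(WINDOW)


# Build a small labelled feature set from the synthetic windows.
X = np.array([features(make_window(g)) for g in (0, 1) for _ in range(50)])
y = np.array([g for g in (0, 1) for _ in range(50)])

# Minimal softmax classifier trained by gradient descent -- a toy
# substitute for the pattern-recognizing neural network in the text.
Xn = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)  # normalize features
W = np.zeros((Xn.shape[1], 2))
onehot = np.eye(2)[y]
for _ in range(300):
    logits = Xn @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.5 * (Xn.T @ (p - onehot)) / len(y)

acc = (np.argmax(Xn @ W, axis=1) == y).mean()
```

On this separable synthetic data the toy classifier reaches near-perfect accuracy; the real system would instead train its network offline on Azure from per-patient calibration recordings and store the resulting parameters on the memory card for use by the microcontroller.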