Invention Grant
- Patent Title: Coordinated heterogeneous processing of training data for deep neural networks
- Application No.: US15945647
- Application Date: 2018-04-04
- Publication No.: US11275991B2
- Publication Date: 2022-03-15
- Inventors: Fangzhe Chang, Dong Liu, Thomas Woo
- Applicant: Nokia Technologies Oy
- Applicant Address: FI Espoo
- Assignee: Nokia Technologies Oy
- Current Assignee: Nokia Technologies Oy
- Current Assignee Address: FI Espoo
- Agency: Duft & Bornsen, PC
- Main IPC: G06N3/04
- IPC: G06N3/04 ; G06N3/08 ; G06N3/063 ; G06F9/50

Abstract:
Systems and methods for training neural networks. One embodiment is a system that includes a memory configured to store samples of training data for a Deep Neural Network (DNN), and a distributor. The distributor identifies a plurality of work servers provisioned for training the DNN by processing the samples via a model of the DNN, receives information indicating Graphics Processing Unit (GPU) processing powers at the work servers, determines differences in the GPU processing powers between the work servers based on the information, and allocates the samples among the work servers based on the differences.
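The allocation step described in the abstract can be sketched in code: split a batch of training samples among work servers in proportion to their reported GPU processing powers. The function and variable names below are illustrative assumptions, not taken from the patent; the largest-remainder rounding is one simple way to make the per-server counts sum exactly to the batch size.

```python
def allocate_samples(num_samples, gpu_powers):
    """Return per-server sample counts proportional to GPU power.

    Hypothetical sketch of the distributor's allocation described in the
    abstract. Uses the largest-remainder method so the integer counts
    sum exactly to num_samples.
    """
    total_power = sum(gpu_powers)
    # Ideal (fractional) share of the batch for each server.
    shares = [num_samples * p / total_power for p in gpu_powers]
    counts = [int(s) for s in shares]
    # Hand the leftover samples to the largest fractional remainders.
    leftover = num_samples - sum(counts)
    by_remainder = sorted(range(len(shares)),
                          key=lambda i: shares[i] - counts[i],
                          reverse=True)
    for i in by_remainder[:leftover]:
        counts[i] += 1
    return counts

# Example: 1000 samples across three servers with unequal GPUs.
print(allocate_samples(1000, [4.0, 2.0, 1.0]))  # → [571, 286, 143]
```

A server with twice the GPU processing power thus receives roughly twice the samples per step, which keeps the heterogeneous workers finishing their portions at about the same time.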
Public/Granted literature
- US20190311257A1 COORDINATED HETEROGENEOUS PROCESSING OF TRAINING DATA FOR DEEP NEURAL NETWORKS Public/Granted day:2019-10-10