Author
Vladimir Khotsyanovsky
Abstract
The object of research is the ability to combine a pre-trained deep feedforward neural network model with user data in problems of determining the class of a single object in an image; that is, transfer learning in convolutional neural networks applied to classification problems. The research is based on comparing theoretical and practical results obtained when training convolutional neural networks. The main objective of this research is to conduct two different learning processes. The first is traditional training, in which the values of all weights of every layer are adjusted at each training epoch, after which the network is trained on a sample of image data. The second is training with transfer learning methods: when a pre-trained network is initialized, the weights of all its layers are «frozen» except for the last fully connected layer. This layer is replaced by a new one whose number of outputs equals the number of classes in the sample, its parameters are initialized with random values distributed according to the normal law, and the convolutional neural network is then trained on the given sample. After both trainings were carried out, the results were compared. In conclusion, training convolutional neural networks with transfer learning techniques can be applied to a variety of classification tasks, ranging from digits to space objects (stars and quasars). The amount of computing resources spent on the research is also quite important, since not every convolutional neural network model can be fully trained without powerful computer systems and a large number of images in the training sample.
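The transfer-learning procedure outlined in the abstract, freezing a pre-trained network and retraining only a replaced classification layer, can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not the author's code: the ResNet-18 backbone, the normal-initialization standard deviation, and the optimizer settings are assumptions; only the overall scheme (freeze all weights, replace the last fully connected layer with one whose output count equals the number of classes, initialize it from a normal distribution, then train) follows the abstract.

import torch
import torch.nn as nn
from torchvision import models

num_classes = 10  # assumption: set to the number of classes in the training sample

# Load a network pre-trained on ImageNet and "freeze" all of its weights
# so they are not adjusted during training.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the last fully connected layer with a new one whose number of
# outputs equals the number of classes, and initialize its parameters with
# random values drawn from a normal distribution.
in_features = model.fc.in_features
model.fc = nn.Linear(in_features, num_classes)
nn.init.normal_(model.fc.weight, mean=0.0, std=0.01)  # std is an assumed value
nn.init.zeros_(model.fc.bias)

# Only the parameters of the new layer are passed to the optimizer; the
# frozen layers act as a fixed feature extractor.
optimizer = torch.optim.SGD(model.fc.parameters(), lr=0.001, momentum=0.9)
criterion = nn.CrossEntropyLoss()

For the traditional training run described first in the abstract, the same model would be used without the freezing loop, and model.parameters() would be passed to the optimizer so that the weights of every layer are adjusted at each epoch; comparing the two runs on the same image sample gives the results discussed in the paper.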
Suggested Citation
Vladimir Khotsyanovsky, 2022. "Comparative characteristics of the ability of convolutional neural networks to the concept of transfer learning," Technology audit and production reserves, PC TECHNOLOGY CENTER, vol. 1(2(63)), pages 10-13, January.
Handle: RePEc:baq:taprar:v:1:y:2022:i:2:p:10-13
DOI: 10.15587/2706-5448.2022.252695