
One advantage of using sparse categorical cross entropy is that it saves memory as well as computation, because it uses a single integer for each class label rather than a whole one-hot vector. For the 3-class classification problem above, the labels would simply be 0, 1, 2. Which loss to use depends entirely on how you load your dataset.
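To see that the two encodings describe the same loss, here is a minimal pure-Python sketch (not the Keras implementation; the probabilities are made-up values): the sparse form just indexes the predicted probability of the true class, while the one-hot form takes a dot product with the log-probabilities, and both give the same result.

```python
import math

# Hypothetical predicted class probabilities for 4 samples, 3 classes each.
probs = [
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.3, 0.5],
    [0.9, 0.05, 0.05],
]

# Sparse labels: one integer (class index) per sample.
sparse_labels = [0, 1, 2, 0]

# Equivalent one-hot labels: a whole vector per sample.
one_hot_labels = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]]

def sparse_cce(y_true, y_pred):
    # With integer labels, just pick out the true class's probability.
    return -sum(math.log(p[t]) for t, p in zip(y_true, y_pred)) / len(y_true)

def cce(y_true, y_pred):
    # With one-hot labels, multiply each log-probability by 0 or 1 and sum.
    return -sum(
        sum(t * math.log(p) for t, p in zip(row_t, row_p))
        for row_t, row_p in zip(y_true, y_pred)
    ) / len(y_true)

print(sparse_cce(sparse_labels, probs))  # same value as the line below
print(cce(one_hot_labels, probs))
```

The sparse version never materializes the one-hot vectors, which is where the memory and compute savings come from.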


The optimizer above can also be specified either as a string, e.g. optimizer='xxxx', or as an instance from tf.optimizers. If your labels are one-hot vectors, use categorical_crossentropy. Examples (for a 3-class classification): [1,0,0], [0,1,0], [0,0,1]. But if your y_i's are integers, use sparse_categorical_crossentropy. For categorical_crossentropy there should be `# classes` floating point values per feature. An output label in integer form can be converted into categorical encoding using the keras.utils.to_categorical method, which assigns each label a one-hot encoding made up of 0s and a single 1. If you want to provide labels as integers, use the SparseCategoricalCrossentropy loss instead. categorical_crossentropy: used as a loss function for multi-class classification models where there are two or more output labels; labels are expected in a one-hot representation.
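The conversion that keras.utils.to_categorical performs can be sketched in plain Python (a minimal re-implementation for illustration, not the Keras code itself):

```python
def to_categorical_like(labels, num_classes):
    # Mimics keras.utils.to_categorical: each integer label becomes
    # a one-hot vector of floats with a single 1.0 at the class index.
    out = []
    for y in labels:
        row = [0.0] * num_classes
        row[int(y)] = 1.0
        out.append(row)
    return out

print(to_categorical_like([1, 2, 0], 3))
# → [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0]]
```

With labels encoded this way you would train with categorical_crossentropy; leaving them as integers and using sparse_categorical_crossentropy skips this step entirely.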

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
