I am working on a neural network problem, classifying data as 1 or 0, and I am using binary cross-entropy loss to do this. The loss is fine; however, the accuracy is very low and isn't improving - in fact it is getting worse. After every epoch, I calculate the correct predictions by thresholding the output, and divide that number by the total size of the dataset. I am assuming I made a mistake in the accuracy calculation. Is there anything wrong with how I compute it, and why is it getting worse instead of better?

The relevant code:

```python
criterion = torch.nn.BCELoss(size_average=True)
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)  # optimizer class was missing in the original; SGD assumed

for i, (inputs, labels) in enumerate(train_loader):
    ...

correct = (output == labels).float().sum()  # original read `output = labels`, a syntax error
```

---

You're doing something bizarre here - `localtime(time)` takes the epoch time (`time`) and converts it to a string. But if you really want to take the time and convert it back again, you'll need to look at how `localtime(time)` returns the result. Because `localtime` is being evaluated in a list context, it returns an array of values. If you call it in a scalar context, it returns a string denoting the time:

```perl
print "" . localtime(time);
```

Note, though, that the result may vary depending on your current locale. That's probably why `str2time` is doing odd things - it makes certain assumptions about formats that don't always apply. When both the month and the day are given in the date as numbers, they are always parsed assuming the month number comes before the day. This is the usual format used in American dates.

You would probably be better off instead using Time::Piece and `strftime` to get a fixed format:

```perl
print localtime(time)->strftime("%Y-%m-%d %H:%M:%S");
```

Note - Time::Piece overloads `localtime`, so you can actually use it (fairly) transparently. Of course, then you can also do:

```perl
print localtime(time)->epoch;
```

and do without all the fuss of converting back and forth. Or perhaps better yet, use `-M`, which tells you how long ago a file was modified (in days, so you'll have to multiply up) - and which you can use without needing to parse anything.
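Going back to the accuracy question at the top: the bookkeeping it describes - threshold the sigmoid outputs, count matches against the labels, divide by the dataset size - can be sketched in plain Python. The `outputs` and `labels` values below are illustrative stand-ins, not data from the question.

```python
def binary_accuracy(outputs, labels, threshold=0.5):
    """Threshold probabilities at `threshold`, compare the resulting 0/1
    predictions to the labels, and divide correct count by dataset size."""
    preds = [1 if p > threshold else 0 for p in outputs]
    correct = sum(1 for p, y in zip(preds, labels) if p == y)
    return correct / len(labels)

outputs = [0.9, 0.2, 0.7, 0.4]  # sigmoid outputs from the network
labels = [1, 0, 0, 0]           # ground-truth classes
print(binary_accuracy(outputs, labels))  # 3 of 4 correct -> 0.75
```

In tensor form the equivalent line would be something like `correct = ((output > 0.5).float() == labels).float().sum()` - note both the thresholding step and the `==` comparison, which are the two easiest parts to get wrong.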