
Sorry for a stupid question, but can you please explain (the general idea of) how mixup should work on tabular data? As I understand it, the main principle behind mixup is this: we take, for example, a picture that is interpolated with another picture, and the model's target becomes a blend of the two labels, so it should rank those two classes (the correct ones for each of the two pictures) higher than the others (and we may also get much more data, since combinatorics is now on our side). However, on some quick tests (bucketed Rossmann and Adults) I noticed a loss in accuracy and little improvement, if any.
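To make that intuition concrete for tabular rows, here is a minimal numpy sketch. The helper name `mixup_tabular` and the choice to mix only continuous features are my own assumptions, not fastai code: continuous columns can be interpolated directly and the targets become a lam-weighted blend of the two one-hot labels, while raw categorical columns have no meaningful interpolation (mixing their embeddings is the usual workaround).

```python
import numpy as np

def mixup_tabular(x_cont, y_onehot, alpha=0.4, rng=None):
    # Hypothetical helper: mixup for the continuous part of a tabular batch.
    # Categorical columns are deliberately NOT handled here -- in raw form
    # they cannot be interpolated; mixing their embeddings is a common fix.
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(x_cont)
    lam = rng.beta(alpha, alpha, size=n)   # per-row mixing weight in [0, 1]
    perm = rng.permutation(n)              # partner row for each row
    lam_col = lam[:, None]
    x_mix = lam_col * x_cont + (1 - lam_col) * x_cont[perm]
    y_mix = lam_col * y_onehot + (1 - lam_col) * y_onehot[perm]
    return x_mix, y_mix

# Tiny demo: 4 rows, 2 continuous features, 2 classes.
x = np.array([[0., 0.], [1., 1.], [2., 2.], [3., 3.]])
y = np.eye(2)[[0, 0, 1, 1]]
x_mix, y_mix = mixup_tabular(x, y)
```

Each mixed target row still sums to 1, so the model is pushed to put its probability mass on (at most) the two source classes — the "rank these two classes higher" intuition above.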
learn = tabular_learner(data, layers=, metrics=accuracy, ps=, callback_fns=, emb_drop=0.04)

But the error I am getting is as follows:

AttributeError                            Traceback (most recent call last)
/opt/conda/lib/python3.6/site-packages/fastai/train.py in lr_find(learn, start_lr, end_lr, num_it, stop_div, wd)
     30     cb = LRFinder(learn, start_lr, end_lr, num_it, stop_div)
     31     epochs = int(np.ceil(num_it/len(learn.data.train_dl)))
---> 32     learn.fit(epochs, start_lr, callbacks=[cb], wd=wd)
     34 def to_fp16(learn:Learner, loss_scale:float=None, max_noskip:int=1000, dynamic:bool=True, clip:float=None,

/opt/conda/lib/python3.6/site-packages/fastai/basic_train.py in fit(self, epochs, lr, wd, callbacks)
    197     callbacks = [cb(self) for cb in self.callback_fns] + listify(callbacks)
    198     if defaults.extra_callbacks is not None: callbacks += defaults.extra_callbacks
--> 199     fit(epochs, self, metrics=self.metrics, callbacks=self.callbacks+callbacks)
    201 def create_opt(self, lr:Floats, wd:Floats=0.)->None:

/opt/conda/lib/python3.6/site-packages/fastai/basic_train.py in fit(epochs, learn, callbacks, metrics)
     99     for xb,yb in progress_bar(learn.data.train_dl, parent=pbar):
--> 100         xb, yb = cb_handler.on_batch_begin(xb, yb)
    101         loss = loss_batch(learn.model, xb, yb, learn.loss_func, learn.opt, cb_handler)
    102         if cb_handler.on_batch_end(loss): break

/opt/conda/lib/python3.6/site-packages/fastai/callback.py in on_batch_begin(self, xb, yb, train)
    277     self.state_dict.update(dict(last_input=xb, last_target=yb, train=train,
    278         stop_epoch=False, skip_step=False, skip_zero=False, skip_bwd=False))
--> 279     self('batch_begin', mets = not self.state_dict['train'])
    280     return self.state_dict['last_input'], self.state_dict['last_target']

/opt/conda/lib/python3.6/site-packages/fastai/callback.py in __call__(self, cb_name, call_mets, **kwargs)
    250         for met in self.metrics: self._call_and_update(met, cb_name, **kwargs)
--> 251     for cb in self.callbacks: self._call_and_update(cb, cb_name, **kwargs)

/opt/conda/lib/python3.6/site-packages/fastai/callback.py in _call_and_update(self, cb, cb_name, **kwargs)
    239     def _call_and_update(self, cb, cb_name, **kwargs)->None:
    240         "Call `cb_name` on `cb` and update the inner state."
I am trying to implement Mixup in a TabularLearner. Here is the code I wrote:

from fastai.callbacks import *
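One likely source of trouble: fastai v1's MixUpCallback mixes `last_input` as a single image tensor, while a tabular batch's input is a pair of (categorical, continuous) tensors, so the stock callback does not apply cleanly. Independent of fastai's internals, the loss half of mixup is just a lam-weighted blend of two cross-entropies; here is a hedged numpy sketch (the names `softmax` and `mixup_ce_loss` are mine, not library API):

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def mixup_ce_loss(logits, y1, y2, lam):
    # Mixup cross-entropy: lam * CE(pred, y1) + (1 - lam) * CE(pred, y2),
    # i.e. the loss against a lam-weighted blend of the two integer labels.
    p = softmax(logits)
    rows = np.arange(len(logits))
    ce1 = -np.log(p[rows, y1])
    ce2 = -np.log(p[rows, y2])
    return float((lam * ce1 + (1 - lam) * ce2).mean())

# Demo: confident predictions for classes [0, 1].
logits = np.array([[4.0, 0.0], [0.0, 4.0]])
plain = mixup_ce_loss(logits, np.array([0, 1]), np.array([0, 1]), lam=1.0)
mixed = mixup_ce_loss(logits, np.array([0, 1]), np.array([1, 0]), lam=0.5)
```

With lam=1 this reduces to ordinary cross-entropy against the first label set, so any mixup implementation can be sanity-checked against the plain loss at that endpoint.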
