yax #157
base: main
Changes from all commits
@@ -37,7 +37,7 @@ grads = backtrace(l)[1]
  # TODO: test DimArray inputs
  using DimensionalData, ChainRulesCore
  # mat = Matrix(df)'
- mat = Array(Matrix(df)')
+ mat = Float32.(Array(Matrix(df)'))
  da = DimArray(mat, (Dim{:col}(Symbol.(names(df))), Dim{:row}(1:size(df,1))))

  ##! new dispatch
@@ -91,7 +91,7 @@ ar = rand(3,3)
  A = DimArray(ar, (Y([:a,:b,:c]), X(1:3)));
  grad = Zygote.gradient(x -> sum(x[Y=At(:b)]), A)

- xy = EasyHybrid.split_data((ds_p_f, ds_t), 0.8, shuffle=true, rng=Random.default_rng())
+ # xy = EasyHybrid.split_data((ds_p_f, ds_t), 0.8, shuffle=true, rng=Random.default_rng())

  EasyHybrid.get_prediction_target_names(RbQ10)
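The `split_data` calls above hide the shuffled 80/20 split behind the keyword arguments. As a rough sketch of the intended behaviour (a hypothetical standalone re-implementation; the actual `EasyHybrid.split_data` internals may differ):

```julia
using Random

# Hypothetical helper illustrating an 80/20 shuffled split as in
# EasyHybrid.split_data(..., 0.8, shuffle=true, rng=...).
function split_indices(n::Int, frac::Float64; rng=Random.default_rng())
    idx = shuffle(rng, collect(1:n))   # random permutation of all row indices
    ntrain = floor(Int, frac * n)      # size of the training partition
    return idx[1:ntrain], idx[(ntrain + 1):end]
end

train_idx, val_idx = split_indices(10, 0.8)
length(train_idx)  # 8
length(val_idx)    # 2
```

Passing an explicit `rng` keeps the split reproducible across runs.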
@@ -100,3 +100,23 @@ xy1 = EasyHybrid.prepare_data(RbQ10, da)
  (x_train, y_train), (x_val, y_val) = EasyHybrid.split_data(da, RbQ10) # ; shuffleobs=false, split_data_at=0.8

+ out = train(RbQ10, da, (:Q10, ); nepochs=200, batchsize=512, opt=Adam(0.01));
+
+ using YAXArrays
+ axDims = dims(da)
+ ds_yax = YAXArray(axDims, da.data)
+
+ ds_p_f = ds_yax[col=At(forcing_names ∪ predictor_names)]
+ ds_t = ds_yax[col=At(target_names)]
+ ds_t_nan = .!isnan.(ds_t)            # produces 1×35064 YAXArray{Float32, 2}, not a Bool
+ ds_t_nan = map(x -> !isnan(x), ds_t) # 1×35064 YAXArray{Bool, 2}
+ length(ds_t_nan)
+ # is_no_nan = .!isnan.(y)
+
+ ls = EasyHybrid.lossfn(RbQ10, ds_p_f, (ds_t, ds_t_nan), ps, st, LoggingLoss())
Collaborator
This runs now.

Member, Author
Nice, yes, indeed it seems to be the NaN mask. Let's use the most generic version so that all input cases work.
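The mask issue discussed here can be illustrated without YAXArrays. A minimal sketch (the helper name `not_nan_mask` is hypothetical, not part of EasyHybrid): `map` takes its element type from the function's Bool return value, which is what the `map(x -> !isnan(x), ds_t)` line in the diff relies on, whereas the YAXArray broadcast `.!isnan.(ds_t)` reportedly kept a Float32 eltype and so could not be used for logical indexing:

```julia
# Hypothetical helper (not part of EasyHybrid): element-wise "not NaN" mask.
# `map` infers a Bool eltype from `!isnan`, so the result works as a
# logical index into the data it was built from.
not_nan_mask(y) = map(x -> !isnan(x), y)

y = [1.0f0, NaN32, 3.0f0]
mask = not_nan_mask(y)   # 3-element Vector{Bool}
y[mask]                  # Float32[1.0, 3.0]
```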
+ ls_logs = EasyHybrid.lossfn(RbQ10, ds_p_f, (ds_t, ds_t_nan), ps, st, LoggingLoss(train_mode=false))
+ acc_ = EasyHybrid.evaluate_acc(RbQ10, ds_p_f, ds_t, ds_t_nan, ps, st, [:mse, :r2], :mse, sum)
+
+ # TODO how to proceed - would it work already for multiple targets?
+ out_yax = train(RbQ10, ds_yax, (:Q10, ); nepochs=200, batchsize=512, opt=Adam(0.01));
@@ -57,7 +57,17 @@ function loss_fn(ŷ, y, y_nan, ::Val{:rmse})
      return sqrt(mean(abs2, (ŷ[y_nan] .- y[y_nan])))
  end
  function loss_fn(ŷ, y, y_nan, ::Val{:mse})
-     return mean(abs2, (ŷ[y_nan] .- y[y_nan]))
+     # Option 1: Convert to Array and compute MSE
Collaborator
Option 1 would be converting to an array, but what I am not sure about is when it would use a view, when it copies, when it comes into memory, and so on.
+     #yh = Array(ŷ[y_nan])
+     #yt = Array(y[y_nan])
+     #return mean(abs2, yh .- yt)
+
+     # Option 2: Use YAXArray directly, but map has to be used
+     return mean(x -> x, map((a,b)->(a-b)^2, ŷ[y_nan], y[y_nan]))
Collaborator
Also here broadcast does not work; map works, or we convert it to an array. Not quite sure how we proceed from here. Should our model give a DimArray back? Not at all a YAXArrays / DimArrays expert ;-)

Member, Author
In principle all types should work, so YAXArrays should be fine.
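For reference, the map-based expression from Option 2 can be checked on plain arrays (a sketch; `mse_map` is a hypothetical standalone name, the PR keeps this expression inside `loss_fn`):

```julia
using Statistics  # mean

# Standalone version of the Option 2 expression: square the element-wise
# differences with `map`, then average. `map` is used instead of
# broadcasting, which reportedly fails on the indexed YAXArray.
mse_map(ŷ, y) = mean(map((a, b) -> (a - b)^2, ŷ, y))

ŷ = Float32[1, 2, 3]
y = Float32[1, 2, 5]
mse_map(ŷ, y)   # (0 + 0 + 4) / 3
```

Note that once the squared differences exist as a container, the outer `mean(x -> x, ...)` from the diff is equivalent to a plain `mean(...)`.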
+     # Option 3 gives an error
+     #return mean(abs2, (ŷ[y_nan] .- y[y_nan])) # errors with ERROR: MethodError: no method matching to_yax(::Vector{Float32}). The function `to_yax` exists, but no method is defined for this combination of argument types.
+     # I guess our model output would need to be a yax and not a Vector{Float32}
  end
  function loss_fn(ŷ, y, y_nan, ::Val{:mae})
      return mean(abs, (ŷ[y_nan] .- y[y_nan]))
This has to be done with map; otherwise it's not a Bool.