
Conversation

franchuterivera
Contributor

Address #1072

Also, keep in mind:

  • I am creating an auto-sklearn directory even when XDG_CACHE_HOME is provided, which I think makes sense so the files live in a tool-specific directory
  • In case XDG_CACHE_HOME is not passed or is not writable, we default to HOME.
  • Add checks for different combinations.
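The fallback described in the bullets above can be sketched as follows. This is a minimal illustration of the stated behavior, not the code merged in the PR; the function name and the 'auto-sklearn' subdirectory name are assumptions.

```python
import os


def get_cache_directory() -> str:
    """Resolve the base directory used to persist the selector.

    Prefers XDG_CACHE_HOME when it is set and writable, otherwise
    falls back to the user's HOME, as described in the PR summary.
    A tool-specific 'auto-sklearn' subdirectory is appended so the
    files live inside a directory owned by the tool.
    """
    base = os.environ.get('XDG_CACHE_HOME')
    if base is None or not os.access(base, os.W_OK):
        # Fall back to HOME when the variable is unset or unusable.
        base = os.path.expanduser('~')
    return os.path.join(base, 'auto-sklearn')
```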

@codecov

codecov bot commented Feb 3, 2021

Codecov Report

Merging #1073 (bdbe86a) into development (9343373) will increase coverage by 0.02%.
The diff coverage is 81.81%.

Impacted file tree graph

@@               Coverage Diff               @@
##           development    #1073      +/-   ##
===============================================
+ Coverage        85.41%   85.44%   +0.02%     
===============================================
  Files              130      130              
  Lines            10340    10338       -2     
===============================================
+ Hits              8832     8833       +1     
+ Misses            1508     1505       -3     
Impacted Files Coverage Δ
autosklearn/ensembles/ensemble_selection.py 67.36% <54.54%> (+0.23%) ⬆️
autosklearn/evaluation/test_evaluator.py 92.59% <66.66%> (+1.68%) ⬆️
autosklearn/experimental/askl2.py 80.19% <75.00%> (-1.72%) ⬇️
autosklearn/evaluation/abstract_evaluator.py 88.57% <80.00%> (-0.23%) ⬇️
autosklearn/ensemble_builder.py 76.65% <90.00%> (-0.05%) ⬇️
autosklearn/automl.py 84.76% <100.00%> (+0.31%) ⬆️
autosklearn/smbo.py 83.83% <100.00%> (+1.92%) ⬆️
...mponents/feature_preprocessing/nystroem_sampler.py 85.29% <0.00%> (-5.89%) ⬇️
...ipeline/components/regression/gradient_boosting.py 89.42% <0.00%> (-2.89%) ⬇️
... and 5 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9343373...1a6ec38. Read the comment docs.

@franchuterivera
Contributor Author

I think there was a change in the python version which is causing crashes. #1063 aims to fix this.

            training_data_hash
        )
        selector_directory = os.environ.get('XDG_CACHE_HOME')
        if selector_directory is None or not os.access(selector_directory, os.W_OK):
Contributor

Hey, what's the reason to check if that directory is writable? Also, wouldn't you also have to check the selector directory below?

I think if 'XDG_CACHE_HOME' is set but not writable or the fallback is not writable either, we should not save the selector. I'm also wondering whether we should assume the existence of a directory called .cache or whether we should just write into .auto-sklearn?
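The behavior suggested in this comment (skip saving entirely when neither candidate is writable) could look roughly like the sketch below. The function name and the '.auto-sklearn' directory name are illustrative assumptions, not the names merged in the PR.

```python
import os
from typing import Optional


def resolve_selector_directory() -> Optional[str]:
    """Return a writable directory for the selector file, or None.

    Tries XDG_CACHE_HOME first, then falls back to HOME. If neither
    candidate is writable, returns None so the caller can skip
    saving the selector rather than failing.
    """
    for candidate in (os.environ.get('XDG_CACHE_HOME'),
                      os.path.expanduser('~')):
        if candidate is not None and os.access(candidate, os.W_OK):
            return os.path.join(candidate, '.auto-sklearn')
    # Neither location is usable: signal "do not save".
    return None
```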

Contributor Author

Yeah, writing to .auto-sklearn sounds better. Then I wonder why we use XDG_CACHE_HOME at all. Do you think it would be more straightforward to:

  1. Default to <HOME>/.auto-sklearn
  2. Prioritize AUTO_SKLEARN_SELECTOR_PATH if the user specifies it?
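The two-step alternative proposed above might look like this. Note that AUTO_SKLEARN_SELECTOR_PATH is the hypothetical variable name floated in the comment, not an environment variable auto-sklearn actually reads.

```python
import os


def selector_path() -> str:
    """Sketch of the proposed alternative: default to
    <HOME>/.auto-sklearn unless the user sets an override.

    AUTO_SKLEARN_SELECTOR_PATH is a hypothetical name from the
    review discussion, used here only for illustration.
    """
    override = os.environ.get('AUTO_SKLEARN_SELECTOR_PATH')
    if override is not None:
        # User-specified location takes priority.
        return override
    return os.path.join(os.path.expanduser('~'), '.auto-sklearn')
```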

Contributor

The idea of using XDG_CACHE_HOME is to follow the XDG Base Directory Specification, which several packages on Linux follow. To summarize, I think it would be best to either write there or not at all, to keep the behavior consistent.
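For reference, the XDG Base Directory Specification resolves the cache base as follows: use $XDG_CACHE_HOME when it is set to a non-empty value, otherwise default to $HOME/.cache. A minimal sketch of that resolution:

```python
import os


def xdg_cache_home() -> str:
    """Resolve the cache base per the XDG Base Directory
    Specification: $XDG_CACHE_HOME if set and non-empty,
    otherwise the spec's default of $HOME/.cache.
    """
    base = os.environ.get('XDG_CACHE_HOME')
    if not base:
        # Spec default when the variable is unset or empty.
        base = os.path.join(os.path.expanduser('~'), '.cache')
    return base
```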

@mfeurer
Contributor

mfeurer commented Feb 16, 2021

Ping, not sure if you saw my comments?

@franchuterivera
Contributor Author

Thanks for the ping, I clearly missed the comments. I just have a question before I update the PR.

@mfeurer mfeurer merged commit 598d33d into automl:development Feb 18, 2021
@mfeurer mfeurer mentioned this pull request Feb 18, 2021
franchuterivera added a commit to franchuterivera/auto-sklearn that referenced this pull request Mar 11, 2021
* MAINT cleanup readme and remove old service yaml file (.landscape.yaml)

* MAINT bump to dev version

* move from fork to spawn

* FIX_1061 (automl#1063)

* FIX_1061

* Fix type of target

* Moving to classes_

* classes_ should be np.ndarray

* Force float before nan

* Pynisher context is passed to metafeatures (automl#1076)

* Pynisher context to metafeatures

* Update test_smbo.py

Co-authored-by: Matthias Feurer <[email protected]>

* Calculate loss support (automl#1075)

* Calculate loss support

* Relaxed log loss test for individual models

* Feedback from automl#1075

* Missing loss in comment

* Revert back test as well

* Fix rank for metrics for which greater value is not good (automl#1079)

* Enable Mypy in evaluation (except Train Evaluator) (automl#1077)

* Almost all files for evaluation

* Feedback from PR

* Feedback from comments

* Solving rebase artifacts

* Revert bytes

* Automatically update the Copyright when building the html (automl#1074)

* update the year automatically

* Fixes for new numpy

* Revert test

* Prepare new release (automl#1081)

* prepare new release

* fix unit test

* bump version number

* Fix 1072 (automl#1073)

* Improve selector checking

* Remove copy error

* Rebase changes to development

* No .cache and check selector path

* Missing params in signature (automl#1084)

* Add size check before trying to split for GMeans (automl#732)

* Add size check before trying to split

* Rebase to new code

Co-authored-by: chico <[email protected]>

* Fix broken links in docs and update parallel docs (automl#1088)

* Fix broken links

* Feedback from comments

* Update manual.rst

Co-authored-by: Matthias Feurer <[email protected]>

* automl#660 Enable Power Transformations Update (automl#1086)

* Power Transformer

* Correct typo

* ADD_630

* PEP8 compliance

* Fix target type

Co-authored-by: MaxGreil <[email protected]>

* Stale Support (automl#1090)

* Stale Support

* Enhanced criteria for stale

* Enable weekly cron job

* test

Co-authored-by: Matthias Feurer <[email protected]>
Co-authored-by: Matthias Feurer <[email protected]>
Co-authored-by: Rohit Agarwal <[email protected]>
Co-authored-by: Pepe Berba <[email protected]>
Co-authored-by: MaxGreil <[email protected]>