Release Notes
0.8.0
- refactoring template formatting for `input_spec`
- fixing issues with input fields with extension (and using them in templates)
- adding simple validators to input spec (using `attr.validator`)
- adding `create_dotfile` for workflows, which creates graphs as dotfiles (can convert to other formats if dot is available)
- adding a simple user guide with an `input_spec` description
- expanding docstrings for `State`, `audit` and `messenger`
- updating syntax to newer python
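The `attr.validator`-based input validation mentioned above can be sketched with plain attrs; the field name and the constraint here are hypothetical illustrations, not taken from pydra's actual `input_spec`:

```python
import attr


@attr.s
class Inputs:
    # hypothetical input field: type-checked by attrs at assignment time
    n_procs = attr.ib(
        default=1,
        validator=attr.validators.instance_of(int),
    )

    # a custom validator attached with the @field.validator decorator
    @n_procs.validator
    def _check_positive(self, attribute, value):
        if value < 1:
            raise ValueError(f"{attribute.name} must be >= 1, got {value}")


Inputs(n_procs=2)       # passes both validators
try:
    Inputs(n_procs=0)   # rejected by the custom validator
except ValueError as err:
    print(err)
```

Both validators run on construction, so a bad value is rejected before the task ever executes.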
0.7.0
- refactoring the error handling by pydra: improving raised errors, removing nodes from the workflow graph that can't be run
- refactoring of the `input_spec`: adapting better to the nipype interfaces
- switching from `pkg_resources.declare_namespace` to the stdlib `pkgutil.extend_path`
- moving README to rst format
0.6.2
- Use `pkgutil` to declare `pydra.tasks` as a namespace package, ensuring better support for editable mode.
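The pkgutil-style namespace declaration referred to here is the standard one-line idiom placed in the package's `__init__.py`; this is the generic idiom (only meaningful inside a package `__init__.py`), not necessarily a verbatim copy of pydra's file:

```python
# pydra/tasks/__init__.py -- pkgutil-style namespace package.
# extend_path() scans sys.path for other distributions that also ship a
# pydra/tasks directory and merges their paths into this package, so
# separately installed task packages all import as pydra.tasks.<name>.
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
```

Unlike `pkg_resources.declare_namespace`, this needs no setuptools at runtime and works with editable installs.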
0.6.1
Add
pydra.tasks
namespace package to enable separate packages ofTask
s to be installed intopydra.tasks
.Raise error when task or workflow name conflicts with names of attributes, methods, or other tasks already added to workflow
Mention
requirements.txt
in README
0.6
- removing the tutorial to a separate repo
- adding windows tests to codecov
- accepting `None` as a valid output from a `FunctionTask`, also for functions that return multiple values
- fixing slurm error files
- adding `wf._connection` to `checksum`
- allowing for updates of `wf._connections`
- editing output, so it works with `numpy.arrays`
- removing `to_job` and pickling task instead (workers read the tasks and set the proper input, so multiple copies of the input are not kept in memory)
- adding standalone function `load_and_run` that can load and run a task from a pickle file
- removing `create_pyscript` and simplifying the slurm worker
- improving error reports in error files
- fixing `make_class` so the `Output` is properly formatted
0.5
- fixing `hash_dir` function
- adding `get_available_cpus` to get the number of CPUs available to the current process or available on the system
- adding simple implementation for `BoshTask` that uses boutiques descriptor
- adding azure to CI
- fixing code for windows
- etelementry updates
- adding more verbose output for task `result` - returns values or indices for input fields
- adding an experimental implementation of Dask Worker (limited testing with ci)
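A helper like `get_available_cpus` can be sketched with the standard library alone; this is an illustration of the idea described above, not pydra's implementation:

```python
import os


def get_available_cpus():
    # CPUs the current process is actually allowed to use (Linux and
    # some other Unixes expose this via the scheduler affinity mask).
    if hasattr(os, "sched_getaffinity"):
        return len(os.sched_getaffinity(0))
    # Fall back to the total CPU count on platforms without affinity
    # support; os.cpu_count() can return None, so default to 1.
    return os.cpu_count() or 1


print(get_available_cpus())
```

The affinity-aware path matters in batch systems such as SLURM, where a job may be restricted to fewer CPUs than the machine has.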
0.4
- reorganization of the `State` class, fixing small issues with the class
- fixing some paths issues on windows os
- adding osx and windows to the travis runs (right now allowing for failures for windows)
- adding `PydraStateError` for exceptions in the `State` class
- small fixes to the hashing functions, adding more tests
- adding `hash_dir` to calculate hash for `Directory` type
0.3.1
- passing `wf.cache_locations` to the task
- using `rerun` from submitter to all tasks
- adding `test_rerun` and `propagate_rerun` for workflows
- fixing task with a full combiner
- adding `cont_dim` to specify dimensionality of the input variables (how much the input is nested)
0.3
- adding sphinx documentation
- moving from `dataclasses` to `attrs`
- adding `container` flag to the `ShellCommandTask`
- fixing `cmdline`, `command_args` and `container_args` for tasks with states
- adding `CONTRIBUTING.md`
- fixing hash calculations for inputs with a list of files
- using `attr.NOTHING` for input that is not set
0.2.2
- supporting tuple as a single element of an input
0.2.1
- fixing: nodes with states and input fields (from splitter) that are empty were failing
0.2
- big changes in `ShellTask`, `DockerTask` and `SingularityTask`
  - customized input specification and output specification for `Task`s
  - adding singularity checks to Travis CI
  - binding all input files to the container
- changes in `Workflow`
  - passing all outputs to the next node: `lzout.all_`
  - fixing inner splitter
- allowing for `splitter` and `combiner` updates
- adding `etelementry` support
0.1
- Core dataflow creation and management API
- Distributed workers:
  - concurrent futures
  - SLURM
- Notebooks for Pydra concepts
0.0.1
- Initial Pydra Dataflow Engine release.