feat!: Update snakemake to v8 #493

Draft: wants to merge 115 commits into base: main

Commits (115)
889556c
update environment.yaml for snakemake8 requirements
tedil Mar 19, 2024
b3486b1
update base and test requirements.txt to match environment.yaml / new…
tedil Mar 19, 2024
dba4186
RERUN_TRIGGERS are now accessible via the RerunTrigger enum
tedil Mar 19, 2024
b266684
ngs_mapping: for _get_input_files_run, accept **kwargs, as sm8 will s…
tedil Mar 19, 2024
7d502e0
use new SnakemakeApi for wrapper_parallel
tedil Mar 19, 2024
da259c7
fix OrderedDict.__str__ comparison
tedil Mar 19, 2024
d88b8d9
replace --use-conda with --software-deployment-method conda
tedil Mar 19, 2024
90bed58
constrain coverage in test reqs
tedil Mar 20, 2024
46480bf
use ruff instead of black+flake8+isort, enable commit hooks
tedil Apr 4, 2024
4b4aad1
modify ruff lint target categories slightly
tedil Apr 4, 2024
5672793
ruff format
tedil Apr 4, 2024
73838a1
use exclude rules from setup.cfg
tedil Apr 4, 2024
ba04cdb
move ruff requirement from dev to test
tedil Apr 4, 2024
becdbba
no python version req for now
tedil Apr 4, 2024
9e457f2
merge branch pre-commit-hooks-and-ruff into snake8
tedil Apr 8, 2024
cae1a79
slightly more precise version requirements; include test requirements…
tedil Apr 8, 2024
c1c863d
lower coveralls version req
tedil Apr 8, 2024
c49b61e
more info on biomedsheets in env yaml
tedil Apr 9, 2024
9c2f707
update snakemake.api calls, use submodules, wip
tedil Apr 10, 2024
07b64d6
replace subworkflows with module import statements for ngs_mapping + …
tedil Apr 15, 2024
df7c0d7
snakefmt
tedil Apr 15, 2024
8c4a768
include snakefiles in ignore files
tedil Apr 15, 2024
5fe1837
use absolute paths for input files (to appease parallel wrappers)
tedil Apr 15, 2024
d388ac2
type hints for snappy-pipeline/utils.py (@dictify, @listify)
tedil Apr 16, 2024
a1e5591
use shutils.which and actually use arg of binary_available(name)
tedil Apr 23, 2024
5e5dcd4
fmt
tedil Apr 23, 2024
797fe0b
for parallel wrapper, make sure input paths are realpaths, otherwise …
tedil Apr 23, 2024
838d700
replace 'output/' with 'work/' via sed in bwa wrapper when creating o…
tedil Apr 23, 2024
f304a55
re-introduce black formatting for now
tedil Apr 24, 2024
0ee8131
replace register_sub_workflow with register_module
tedil Apr 24, 2024
7173c10
fmt
tedil Apr 24, 2024
a6a95a1
replace all sub_workflows with modules explicitly
tedil Apr 24, 2024
49107ba
fix test_tumor_mutational_burden expected input file paths to match m…
tedil Apr 25, 2024
3dc6430
fix module usage in somatic_variant_annotation
tedil Apr 25, 2024
12495be
remove mock globals in tests (for subworkflows)
tedil Apr 25, 2024
d299bb3
fix ngs_mapping module in abstract basesteppart
tedil Apr 25, 2024
9e9c404
fix modules in SomaticVariantFiltration
tedil Apr 25, 2024
b413dd9
consistently use path_[subworkflow] setting from minimal config in te…
tedil Apr 25, 2024
0d668ca
adapt checkpoints mocks to snakemake 8 api
tedil Apr 25, 2024
a8b9007
reference add_ped_header.sh relative to project root
tedil Apr 25, 2024
f0b3a99
for mutect2 parallel wrapper workaround: fix paths in expected file; …
tedil Apr 25, 2024
327eba3
sync env yaml and requirements
tedil Apr 26, 2024
4bbee84
make sure all 'rule all's are named 'rule workflow_all' and marked as…
tedil May 21, 2024
b035d27
merge main into snake8
tedil Jul 1, 2024
c0f7d23
update to snakemake 8.14
tedil Jul 1, 2024
3696893
harmonize usage of NGS_MAPPING to ../ngs_mapping in tests
tedil Jul 1, 2024
28fe827
reinstate type hint for workflow
tedil Jul 1, 2024
f29814e
fix duplicate yaml key part1
tedil Jul 1, 2024
6048994
harmonize usage of SOMATIC_*_CALLING to ../somatic_*_calling in tests
tedil Jul 1, 2024
895b00f
harmonize paths some more, fix duplicate config keys
tedil Jul 1, 2024
f437671
fix more relative vs absolute paths in tests
tedil Jul 1, 2024
5d263d5
fix more relative vs absolute paths in tests
tedil Jul 1, 2024
4bedce1
update snappy base env in e2e test workflow
tedil Jul 1, 2024
f0673d0
fix: for submodules, the config may already be a pydantic model; in t…
tedil Jul 1, 2024
9148240
allow for additional keyword arguments in input function
tedil Jul 1, 2024
9999b5d
allow for additional keyword arguments in input function everywhere
tedil Jul 1, 2024
ab03ec8
update environment.yaml for snakemake8 requirements
tedil Mar 19, 2024
b8a5c6d
RERUN_TRIGGERS are now accessible via the RerunTrigger enum
tedil Mar 19, 2024
cadef6a
ngs_mapping: for _get_input_files_run, accept **kwargs, as sm8 will s…
tedil Mar 19, 2024
3c4cd3f
use new SnakemakeApi for wrapper_parallel
tedil Mar 19, 2024
ae90c90
fix OrderedDict.__str__ comparison
tedil Mar 19, 2024
03f15ec
replace --use-conda with --software-deployment-method conda
tedil Mar 19, 2024
ebb7eac
use ruff instead of black+flake8+isort, enable commit hooks
tedil Apr 4, 2024
6b7780a
modify ruff lint target categories slightly
tedil Apr 4, 2024
96423fe
ruff format
tedil Apr 4, 2024
3d3adb5
use exclude rules from setup.cfg
tedil Apr 4, 2024
a9bd1fd
update snakemake.api calls, use submodules, wip
tedil Apr 10, 2024
89345aa
replace subworkflows with module import statements for ngs_mapping + …
tedil Apr 15, 2024
0d998ab
snakefmt
tedil Apr 15, 2024
18aeba1
include snakefiles in ignore files
tedil Apr 15, 2024
85f63e3
use absolute paths for input files (to appease parallel wrappers)
tedil Apr 15, 2024
3c03e2b
type hints for snappy-pipeline/utils.py (@dictify, @listify)
tedil Apr 16, 2024
9e58fd2
use shutils.which and actually use arg of binary_available(name)
tedil Apr 23, 2024
74797c3
for parallel wrapper, make sure input paths are realpaths, otherwise …
tedil Apr 23, 2024
1050c18
replace 'output/' with 'work/' via sed in bwa wrapper when creating o…
tedil Apr 23, 2024
0e771f7
replace register_sub_workflow with register_module
tedil Apr 24, 2024
cd23778
replace all sub_workflows with modules explicitly
tedil Apr 24, 2024
fa87697
fix test_tumor_mutational_burden expected input file paths to match m…
tedil Apr 25, 2024
56ff4a4
fix module usage in somatic_variant_annotation
tedil Apr 25, 2024
454cb9f
remove mock globals in tests (for subworkflows)
tedil Apr 25, 2024
638941e
fix ngs_mapping module in abstract basesteppart
tedil Apr 25, 2024
f1342c7
fix modules in SomaticVariantFiltration
tedil Apr 25, 2024
fcf5c81
consistently use path_[subworkflow] setting from minimal config in te…
tedil Apr 25, 2024
90d281e
adapt checkpoints mocks to snakemake 8 api
tedil Apr 25, 2024
4c473a5
reference add_ped_header.sh relative to project root
tedil Apr 25, 2024
66e546e
for mutect2 parallel wrapper workaround: fix paths in expected file; …
tedil Apr 25, 2024
68df619
make sure all 'rule all's are named 'rule workflow_all' and marked as…
tedil May 21, 2024
22f06ab
update to snakemake 8.14
tedil Jul 1, 2024
cb9d635
harmonize usage of NGS_MAPPING to ../ngs_mapping in tests
tedil Jul 1, 2024
bf20ac4
reinstate type hint for workflow
tedil Jul 1, 2024
40e12b1
fix duplicate yaml key part1
tedil Jul 1, 2024
1cf880b
harmonize usage of SOMATIC_*_CALLING to ../somatic_*_calling in tests
tedil Jul 1, 2024
cd07601
harmonize paths some more, fix duplicate config keys
tedil Jul 1, 2024
2d29114
fix more relative vs absolute paths in tests
tedil Jul 1, 2024
6d03c79
fix more relative vs absolute paths in tests
tedil Jul 1, 2024
87c137b
update snappy base env in e2e test workflow
tedil Jul 1, 2024
cab6910
fix: for submodules, the config may already be a pydantic model; in t…
tedil Jul 1, 2024
dffe4c5
allow for additional keyword arguments in input function everywhere
tedil Jul 1, 2024
da51712
rebase
tedil Dec 6, 2024
06fc0ac
...and merge
tedil Dec 6, 2024
49bdada
fix path_ngs_mapping in tests
tedil Dec 6, 2024
8139f84
fix extraneous previous_steps param
tedil Dec 6, 2024
1b9999c
fix more test paths
tedil Dec 6, 2024
6091b83
fix available mutect2 actions
tedil Dec 6, 2024
0bf4831
fix incorrectly formatted yaml
tedil Dec 6, 2024
7828b22
fmt
tedil Dec 6, 2024
5725566
fix incorrectly formatted yaml #2
tedil Dec 6, 2024
b9834cb
fix more test paths
tedil Dec 6, 2024
d47884a
merge origin/main
tedil Dec 11, 2024
992fe1b
fix snakemake dependency upper constraints
tedil Dec 11, 2024
9876fff
fix formatting
tedil Dec 11, 2024
7e052a6
fix formatting #2
tedil Dec 11, 2024
916b67f
bump snakemake version
tedil Dec 12, 2024
a037ece
bump snakemake version #2
tedil Dec 12, 2024
ee4cbe1
bump more snakemake versions in wrapper envs
tedil Dec 12, 2024
2 changes: 1 addition & 1 deletion .github/workflows/ci-e2e.yml
@@ -97,5 +97,5 @@ jobs:
snakefile: .tests/test-workflow/workflow/Snakefile
args: "--configfile .tests/test-workflow/config/config.yaml --use-conda --show-failed-logs -j 2 --conda-cleanup-pkgs cache"
show-disk-usage-on-error: true
snakemake-version: 7.32.4
snakemake-version: 8.25.5

2 changes: 1 addition & 1 deletion .tests/test-workflow/workflow/envs/snappy.yaml
@@ -24,7 +24,7 @@ dependencies:
- pydantic =2.7

# Snakemake is used for providing the actual wrapper calling functionality
- snakemake =7.32
- snakemake >=8.25.5,<9

# Additional libraries used by snappy
- ruamel.yaml ==0.18.6 # Nice, round-trip enabled YAML parsing
2 changes: 1 addition & 1 deletion environment.yml
@@ -26,7 +26,7 @@ dependencies:
- pydantic =2.7

# Snakemake is used for providing the actual wrapper calling functionality
- snakemake =7.32
- snakemake >=8.25.5

# Additional libraries used by snappy
- ruamel.yaml ==0.18.6 # Nice, round-trip enabled YAML parsing
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -33,7 +33,7 @@ dependencies = [
# Helpful for CLIs
"termcolor >=1.1.0,<3",
# Snakemake is used for providing the actual wrapper calling functionality
"snakemake >=7.32.0,<8",
"snakemake >=8.25.5,<9",
# Required for plotting
"matplotlib >=3.8.4",
# Library for working with VCF files.
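The three manifests above pin Snakemake differently: environment.yml uses `>=8.25.5` with no upper bound, while the test-workflow env and pyproject.toml use `>=8.25.5,<9`. A minimal sketch of a runtime guard against the pyproject constraint; it is not part of this PR and assumes the third-party `packaging` library is available:

```python
# Hypothetical helper, not part of this PR: verify the installed Snakemake
# matches the new constraint from pyproject.toml (>=8.25.5,<9).
from importlib.metadata import version

from packaging.version import Version  # assumed available; not a snappy dependency

sm_version = Version(version("snakemake"))
if not (Version("8.25.5") <= sm_version < Version("9")):
    raise RuntimeError(f"snappy-pipeline expects snakemake >=8.25.5,<9, found {sm_version}")
```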
Empty file added requirements/base.txt
Empty file added requirements/test.txt
21 changes: 21 additions & 0 deletions ruff.toml
@@ -0,0 +1,21 @@
line-length = 100
exclude = [
"**/Snakefile",
]

[lint]
select = ["E", "F", "I"] # "B": flake8-bugbear, "COM": trailing commas, "W": whitespace, "C"
ignore = ["D", "F821", "E501"] # E501: line too long, F821: name used but unknown (e.g. "snakemake" in wrapper files)
exclude = [
"docs/conf.py",
"docs",
"tests",
"venv",
".*.py",
".snakemake.*.wrapper.py",
"splitMNPsAndComplex.py",
"wrapper.py",
"snappy_pipeline/__init__.py",
"versioneer.py",
"**/Snakefile"
]
14 changes: 8 additions & 6 deletions snappy_pipeline/apps/snappy_snake.py
@@ -9,12 +9,12 @@
import datetime
import logging
import os
import subprocess
import sys
from shutil import which

import ruamel.yaml as ruamel_yaml
from snakemake import RERUN_TRIGGERS
from snakemake import main as snakemake_main
from snakemake.cli import main as snakemake_main
from snakemake.settings.enums import RerunTrigger

from .. import __version__
from ..workflows import (
@@ -60,6 +60,8 @@

__author__ = "Manuel Holtgrewe <[email protected]>"

# snakemake v8 now has an explicit enum for rerun triggers
RERUN_TRIGGERS = RerunTrigger.all()

#: Configuration file names
CONFIG_FILES = ("config.yaml", "config.json")
@@ -123,8 +125,7 @@ def setup_logging(args):


def binary_available(name):
retcode = subprocess.call(["which", "mamba"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
return retcode == 0
return which(name) is not None


def run(wrapper_args): # noqa: C901
@@ -179,8 +180,9 @@ def run(wrapper_args): # noqa: C901
if wrapper_args.conda_create_envs_only:
snakemake_argv.append("--conda-create-envs-only")
wrapper_args.use_conda = True
# TODO replace with --software-deployment-method conda
if wrapper_args.use_conda:
snakemake_argv.append("--use-conda")
snakemake_argv += ["--software-deployment-method", "conda"]
if mamba_available and wrapper_args.use_mamba:
snakemake_argv += ["--conda-frontend", "mamba"]
else:
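Taken together, the snappy_snake.py hunks above switch the wrapper to the Snakemake 8 entry points. A minimal, self-contained sketch of the same pattern; variable names such as `use_conda` are illustrative stand-ins for the wrapper's arguments, and the final call is left commented out so the snippet runs without a Snakefile:

```python
# Sketch of the Snakemake 8 idioms used above; not a drop-in copy of snappy_snake.py.
from shutil import which

from snakemake.cli import main as snakemake_main  # v8: CLI entry point moved to snakemake.cli
from snakemake.settings.enums import RerunTrigger  # v8: enum replaces RERUN_TRIGGERS

RERUN_TRIGGERS = RerunTrigger.all()


def binary_available(name: str) -> bool:
    """Check PATH via shutil.which instead of shelling out to `which`."""
    return which(name) is not None


use_conda = True  # illustrative stand-in for wrapper_args.use_conda
snakemake_argv = ["--cores", "2"]
if use_conda:
    # v8 replaces --use-conda with --software-deployment-method conda
    snakemake_argv += ["--software-deployment-method", "conda"]
    if binary_available("mamba"):
        snakemake_argv += ["--conda-frontend", "mamba"]

# snakemake_main(snakemake_argv)  # would hand the assembled argv to Snakemake
```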
9 changes: 8 additions & 1 deletion snappy_pipeline/base.py
@@ -7,8 +7,10 @@
from collections import OrderedDict
from collections.abc import MutableMapping
from copy import deepcopy
from pathlib import Path
from typing import TYPE_CHECKING, Any, AnyStr, Dict

import pydantic
import ruamel.yaml as ruamel_yaml

from .models import SnappyModel, SnappyStepModel
@@ -56,8 +58,13 @@ def expand_ref(
- paths containing included config files
- config files included
"""
lookup_paths = lookup_paths or [os.getcwd()]
lookup_paths = lookup_paths or [os.getcwd(), str(Path(os.getcwd()).parent / ".snappy_pipeline")]
resolver = RefResolver(lookup_paths=lookup_paths, dict_class=dict_class)
# In case of submodules, the dict_data can be a pydantic model
# To work with the ref_resolver, we convert it to a dict first, excluding None values
# which ref_resolver does not support
if isinstance(dict_data, pydantic.BaseModel):
dict_data = dict_data.model_dump(by_alias=True, exclude_none=True)
# Perform resolution
resolved = resolver.resolve("file://" + config_path, dict_data)
# Collect paths of all included configuration files, important for
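The `expand_ref` change above is the key compatibility shim for module-based sub workflows: a step may now receive an already-validated pydantic model instead of a plain mapping. A small sketch of that conversion, using a made-up model rather than a snappy config class:

```python
# Illustrative only: mirrors the isinstance/model_dump guard added to expand_ref().
import pydantic


class ExampleStepConfig(pydantic.BaseModel):
    path_ngs_mapping: str = "../ngs_mapping"
    threads: int | None = None


def to_resolvable_dict(dict_data):
    """Ref resolvers work on plain dicts, so dump models and drop None values."""
    if isinstance(dict_data, pydantic.BaseModel):
        dict_data = dict_data.model_dump(by_alias=True, exclude_none=True)
    return dict_data


print(to_resolvable_dict(ExampleStepConfig()))  # {'path_ngs_mapping': '../ngs_mapping'}
```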
23 changes: 13 additions & 10 deletions snappy_pipeline/utils.py
@@ -3,10 +3,13 @@

__author__ = "Manuel Holtgrewe <[email protected]>"

import typing
from typing import Any, Callable, Generator, Iterable, TypeVar

F = TypeVar("F")
T = TypeVar("T")

def listify(gen):

def listify(gen: Callable[..., Generator[T, None, None]]) -> Callable[..., list[T]]:
"""Decorator that converts a generator into a function which returns a list

Use it in the case where a generator is easier to write but you want
@@ -19,14 +22,14 @@ def counter(max_no):
yield i
"""

def patched(*args, **kwargs):
def patched(*args, **kwargs) -> list[T]:
"""Wrapper function"""
return list(gen(*args, **kwargs))

return patched


def dictify[**P](gen) -> typing.Callable[P, dict]:
def dictify[**P](gen) -> Callable[P, dict]:
"""Decorator that converts a generator into a function which returns a dict

Use it in the case where a generator is easier to write but you want
@@ -39,20 +42,20 @@ def counter(max_no):
yield 'key{}'.format(i), i
"""

def patched(*args, **kwargs):
def patched(*args, **kwargs) -> dict[F, T]:
"""Wrapper function"""
return dict(gen(*args, **kwargs))

return patched


def flatten(coll: typing.List[typing.Union[str, typing.List[str]]]) -> typing.List[str]:
def flatten(coll: list[str | list[str] | Iterable[str]]) -> list[str]:
"""Flatten collection of strings or list of strings.

Source: https://stackoverflow.com/a/17865033
"""
for i in coll:
if isinstance(i, typing.Iterable) and not isinstance(i, str):
if isinstance(i, Iterable) and not isinstance(i, str):
for subc in flatten(i):
yield subc
else:
@@ -70,12 +73,12 @@ def try_or_none(func, exceptions):
return None


def is_none(value):
def is_none(value: Any) -> bool:
"""Helper function returning whether ``value is None``"""
return value is None


def is_not_none(value):
def is_not_none(value: Any) -> bool:
"""Helper function returning whether ``value is not None``"""
return value is not None

@@ -88,7 +91,7 @@ class DictQuery(dict):
- https://www.haykranen.nl/2016/02/13/handling-complex-nested-dicts-in-python/
"""

def get(self, path, default=None):
def get(self, path: str, default=None):
keys = path.split("/")
val = None

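The utils.py hunks only add type hints and drop the `typing.`-prefixed aliases, so behaviour is unchanged. A short usage sketch of the touched helpers, assuming an installed snappy_pipeline package and (for `DictQuery`, whose body is truncated above) the usual slash-separated path semantics:

```python
from snappy_pipeline.utils import DictQuery, dictify, flatten, listify


@listify
def counter(max_no: int):
    """@listify turns this generator into a list-returning function."""
    yield from range(max_no)


@dictify
def keyed_counter(max_no: int):
    """@dictify collects the yielded (key, value) pairs into a dict."""
    for i in range(max_no):
        yield "key{}".format(i), i


assert counter(3) == [0, 1, 2]
assert keyed_counter(2) == {"key0": 0, "key1": 1}
assert list(flatten(["a", ["b", "c"]])) == ["a", "b", "c"]

config = DictQuery({"ngs_mapping": {"tools": {"dna": ["bwa"]}}})
assert config.get("ngs_mapping/tools/dna") == ["bwa"]
```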
63 changes: 33 additions & 30 deletions snappy_pipeline/workflows/abstract/__init__.py
@@ -19,7 +19,6 @@
import attr
import pydantic
import ruamel.yaml as ruamel_yaml
import snakemake
from biomedsheets import io_tsv
from biomedsheets.io import SheetBuilder, json_loads_ordered
from biomedsheets.models import SecondaryIDNotFoundException
@@ -31,6 +30,7 @@
write_pedigree_to_ped,
write_pedigrees_to_ped,
)
from snakemake.api import Workflow
from snakemake.io import InputFiles, OutputFiles, Wildcards, touch

from snappy_pipeline.base import (
@@ -39,7 +39,6 @@
merge_kwargs,
print_config,
print_sample_sheets,
snakefile_path,
)
from snappy_pipeline.find_file import FileSystemCrawler, PatternSet
from snappy_pipeline.models import SnappyStepModel
@@ -280,10 +279,10 @@ def get_input_files(self, action):

@listify
def get_input_files(wildcards):
if "ngs_mapping" not in self.parent.sub_workflows:
if "ngs_mapping" not in self.parent.modules:
return # early exit
# Get shortcut to NGS mapping sub workflow
ngs_mapping = self.parent.sub_workflows["ngs_mapping"]
ngs_mapping = self.parent.modules["ngs_mapping"]
# Get names of primary libraries of the selected pedigree. The pedigree is selected
# by the primary DNA NGS library of the index.
pedigree = self.index_ngs_library_to_pedigree[wildcards.index_ngs_library]
@@ -650,7 +649,7 @@ def default_config_yaml(self):

def __init__[C: SnappyStepModel](
self,
workflow: snakemake.Workflow,
workflow: Workflow,
config: MutableMapping[str, Any],
config_lookup_paths: tuple[str, ...],
config_paths: tuple[str, ...],
@@ -671,6 +670,7 @@ def __init__[C: SnappyStepModel](
self.previous_steps = tuple(previous_steps or [])
#: Snakefile "workflow" object
self.workflow = workflow
self.modules = {}
#: Setup logger for the step
self.logger = logging.getLogger(self.name)
#: Merge default configuration with true configuration
@@ -726,8 +726,6 @@ def __init__[C: SnappyStepModel](
)
# Setup onstart/onerror/onsuccess hooks
self._setup_hooks()
#: Functions from sub workflows, can be used to generate output paths into these workflows
self.sub_workflows: dict[str, snakemake.Workflow] = {}

# Even though we already validated via pydantic, we still call check_config here, as
# some of the checks done in substep check_config are not covered by the pydantic models yet
@@ -850,30 +848,35 @@ def register_sub_step_classes(
# obj.check_config()
self.sub_steps[klass.name] = obj

def register_sub_workflow(
self, step_name: str, workdir: str, sub_workflow_name: str | None = None
):
"""Register workflow with given pipeline ``step_name`` and in the given ``workdir``.
def register_module(self, step_name: str, prefix: os.PathLike, module_name: str | None = None):
"""
Register workflow with given pipeline ``step_name``, using the given ``prefix``.
This requires importing the respective workflow in the Snakefile
(since the module API is not intended to be used programmatically).
For example:

```
module ngs_mapping:
snakefile:
"../ngs_mapping/Snakefile"
config:
wf.w_config
prefix:
wf.w_config["step_config"]["your_workflow"].get("path_ngs_mapping", "../ngs_mapping")


Optionally, the sub workflow name can be given separate from ``step_name`` (the default)
use rule * from ngs_mapping
```

Optionally, the module name can be given separate from ``step_name`` (the default)
value for it.
"""
sub_workflow_name = sub_workflow_name or step_name
if sub_workflow_name in self.sub_workflows:
raise ValueError("Sub workflow {} already registered!".format(sub_workflow_name))
if os.path.isabs(workdir):
abs_workdir = workdir
else:
abs_workdir = os.path.realpath(os.path.join(os.getcwd(), workdir))
self.workflow.subworkflow(
sub_workflow_name,
workdir=abs_workdir,
snakefile=snakefile_path(step_name),
configfile=abs_workdir + "/" + "config.yaml",
)
self.sub_workflows[sub_workflow_name] = self.workflow.globals[sub_workflow_name]
module_name = module_name or step_name
if module_name in self.modules:
raise ValueError("Sub workflow {} already registered!".format(module_name))
self.modules[module_name] = lambda path: os.path.join(prefix, path)

def get_args(self, sub_step: str, action: str) -> Inputs | Callable[[Wildcards], Inputs]:
def get_args(self, sub_step, action):
"""Return arguments for action of substep with given wildcards

Delegates to the sub step object's get_args function
@@ -1137,13 +1140,13 @@ def _merge_cache_invalidate_paths(cls, data_set_infos):
# Iterate over DataSetInfo objects
for info in data_set_infos:
# Search paths - expects a list already
out_list.extend(getattr(info, "search_paths"))
out_list.extend(info.search_paths)

# Sheet path
# Only name of file is stored in config file (relative path used),
# hence we need to find it in the base paths
sheet_file_name = getattr(info, "sheet_path") # expects a string
base_paths = getattr(info, "base_paths") # expects a list
sheet_file_name = info.sheet_path # expects a string
base_paths = info.base_paths # expects a list
sheet_path = cls._find_sheet_file(sheet_file_name, base_paths)
# Append if not None
if sheet_path:
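The `register_module` rewrite above replaces Snakemake subworkflow objects with plain path-prefixing callables; the actual `module`/`use rule` import now lives in each Snakefile, as the docstring example shows. A stripped-down sketch of the bookkeeping side; the class and the BAM path are illustrative, not snappy API:

```python
# Minimal stand-in for BaseStep.register_module(); illustrative only.
import os


class StepSketch:
    def __init__(self):
        self.modules = {}

    def register_module(self, step_name, prefix, module_name=None):
        module_name = module_name or step_name
        if module_name in self.modules:
            raise ValueError("Sub workflow {} already registered!".format(module_name))
        # Sub steps resolve output paths of the module by prefixing them.
        self.modules[module_name] = lambda path: os.path.join(prefix, path)


step = StepSketch()
step.register_module("ngs_mapping", "../ngs_mapping")
ngs_mapping = step.modules["ngs_mapping"]
print(ngs_mapping("output/bwa.P001-N1-DNA1-WGS1/out/bwa.P001-N1-DNA1-WGS1.bam"))
# ../ngs_mapping/output/bwa.P001-N1-DNA1-WGS1/out/bwa.P001-N1-DNA1-WGS1.bam
```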
3 changes: 2 additions & 1 deletion snappy_pipeline/workflows/adapter_trimming/Snakefile
@@ -32,9 +32,10 @@ localrules:
adapter_trimming_link_out_fastq_run,


rule all:
rule adapter_trimming_all:
input:
wf.get_result_files(),
default_target: True


# House-Keeping ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~