
Populate more #4

Merged: 47 commits, May 19, 2024

Conversation

RhinosF1
Contributor

@RhinosF1 RhinosF1 commented May 18, 2024

Summary by CodeRabbit

  • New Features

    • Introduced new scripts for managing Puppet configuration and MediaWiki deployments.
    • Added functionality for uploading files to archive.org from the command line.
  • Bug Fixes

    • Updated package versions to ensure compatibility with Python 3.12.
  • Tests

    • Added unit tests for new features and deployment scripts.
  • Documentation

    • Updated setup.cfg with additional classifiers for better project categorisation.
    • Updated maintainer information in setup.py.

Contributor

coderabbitai bot commented May 18, 2024

Walkthrough

This update introduces several new scripts and functionalities to the miraheze module, including tools for MediaWiki deployment, Puppet management, and file uploading to archive.org. It also updates package versions for compatibility with Python 3.12 and enhances the testing framework with new unit tests. Additionally, the setup configuration has been refined with more detailed metadata.

Changes

File Path Change Summary
.github/pythonchecks.txt Updated package versions for requests, filelock, and internetarchive for Python 3.12.
.github/workflows/python.yml Replaced flake8 with coverage run and modified mypy command.
miraheze/mediawiki/__init__.py No visible changes.
miraheze/mediawiki/check_read_only.py New script to check if a site is in read-only mode.
miraheze/mediawiki/cycle_puppet.py New script to disable and re-enable Puppet configuration management.
miraheze/mediawiki/disable_puppet.py New script to disable Puppet for a specified target.
miraheze/mediawiki/enable_puppet.py New script to enable Puppet agent functionality.
miraheze/mediawiki/iaupload.py New script for uploading files to archive.org with metadata.
miraheze/mediawiki/mwdeploy.py New script for MediaWiki deployment processes, including environment management and syncing.
tests/__init__.py No changes.
tests/test_iaupload.py New unit tests for the ArchiveUploader class.
tests/test_mwdeploy.py New test cases for MediaWiki deployment functions.
tests/test_mwscript.py New test cases for generating commands for MediaWiki scripts.
setup.cfg Added additional classifiers related to environment, audience, license, language, OS, and topics.
setup.py Updated maintainer information.

In the code's garden, scripts did bloom,
For MediaWiki, they make room.
Puppet dances, up and down,
Archive uploads, without a frown.
Tests are written, strong and bright,
Ensuring everything works just right.
With Python's 3.12 embrace,
Miraheze's future, we now trace.



@RhinosF1 RhinosF1 changed the title Update setup.py Update metadata May 18, 2024
@RhinosF1 RhinosF1 changed the title Update metadata Populate more May 18, 2024
@RhinosF1
Contributor Author

@coderabbitai review

Contributor

coderabbitai bot commented May 18, 2024

Actions Performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@miraheze miraheze deleted a comment from coderabbitai bot May 18, 2024
@miraheze miraheze deleted a comment from coderabbitai bot May 18, 2024
Contributor

coderabbitai bot left a comment


Actionable comments posted: 19

Out of diff range and nitpick comments (2)
miraheze/puppet/cycle_puppet.py (1)

5-5: Clarify user interaction in the prompt.

The prompt 'press enter to re-enable puppet' could be more descriptive. It might be helpful to inform the user that Puppet is currently disabled and that pressing enter will re-enable it.

miraheze/mediawiki/iaupload.py (1)

45-56: Manage environment variables carefully.

The script modifies environment variables (HTTP_PROXY and HTTPS_PROXY) which could affect other parts of the application or subsequent requests. Consider using a more localized approach to setting proxies, such as configuring them directly in the request functions.

Review Details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits: files that changed from the base of the PR, between bd16552 and 62c286a.
Files selected for processing (17)
  • .github/pythonchecks.txt (1 hunks)
  • .github/workflows/python.yml (1 hunks)
  • miraheze/mediawiki/__init__.py (1 hunks)
  • miraheze/mediawiki/check_read_only.py (1 hunks)
  • miraheze/mediawiki/iaupload.py (1 hunks)
  • miraheze/mediawiki/mwdeploy.py (1 hunks)
  • miraheze/mediawiki/mwscript.py (1 hunks)
  • miraheze/puppet/__init__.py (1 hunks)
  • miraheze/puppet/cycle_puppet.py (1 hunks)
  • miraheze/puppet/disable_puppet.py (1 hunks)
  • miraheze/puppet/enable_puppet.py (1 hunks)
  • setup.cfg (1 hunks)
  • setup.py (1 hunks)
  • tests/__init__.py (1 hunks)
  • tests/test_iaupload.py (1 hunks)
  • tests/test_mwdeploy.py (1 hunks)
  • tests/test_mwscript.py (1 hunks)
Files skipped from review due to trivial changes (6)
  • .github/pythonchecks.txt
  • miraheze/mediawiki/__init__.py
  • miraheze/puppet/__init__.py
  • setup.cfg
  • setup.py
  • tests/__init__.py
Additional comments not posted (30)
.github/workflows/python.yml (1)

29-30: Ensure comprehensive testing in CI workflows.

The update from flake8 to coverage run --branch -m pytest tests in the CI workflow is a positive change, enhancing the comprehensiveness of the testing process. This change should help catch more issues before deployment.

tests/test_mwscript.py (22)

6-11: The test function test_get_command_simple correctly asserts the command generation for a simple script. Well done.


14-19: The test function test_get_command_extension accurately asserts the command generation for an extension script. Nicely implemented.


22-36: The test function test_get_command_extension_list effectively uses environment variables and mocks to test command generation for a list of extensions. Good use of patching.


39-44: The test function test_get_command_all correctly asserts the command generation for all wikis. This is crucial for comprehensive testing.


47-52: The test function test_get_command_beta accurately asserts the command generation for beta wikis. Excellent specificity in testing.


55-60: The test function test_get_command_args effectively tests command generation with additional arguments. This ensures flexibility in command configurations.


63-68: The test function test_get_command_subdir correctly asserts the command generation for scripts in subdirectories. This is important for maintaining modular script organisation.


71-76: The test function test_get_command_simple_runner effectively asserts the command generation using the run.php wrapper for a simple script. This is essential for testing wrapper integrations.


79-84: The test function test_get_command_extension_runner accurately asserts the command generation using the run.php wrapper for an extension script. Good coverage of wrapper functionality.


87-101: The test function test_get_command_extension_list_runner effectively uses environment variables and mocks to test command generation with the run.php wrapper for a list of extensions. Excellent use of patching and environment manipulation.


104-109: The test function test_get_command_all_runner correctly asserts the command generation using the run.php wrapper for all wikis. This ensures comprehensive testing of the wrapper functionality.


112-117: The test function test_get_command_beta_runner accurately asserts the command generation using the run.php wrapper for beta wikis. Excellent specificity in testing wrapper integrations.


120-125: The test function test_get_command_args_runner effectively tests command generation with the run.php wrapper and additional arguments. This ensures flexibility in command configurations with the wrapper.


128-133: The test function test_get_command_subdir_runner correctly asserts the command generation using the run.php wrapper for scripts in subdirectories. This is important for maintaining modular script organisation with the wrapper.


136-142: The test function test_get_command_class effectively asserts the command generation using the run.php wrapper for a class script. Good coverage of class-based script configurations.


145-150: The test function test_get_command_long_runner correctly asserts the command generation using the run.php wrapper for a long-running script. This is crucial for testing long operation scripts.


153-158: The test function test_get_command_long_runner_class_lower accurately asserts the command generation using the run.php wrapper for a long-running script with lowercase class name. Good attention to case sensitivity in script names.


161-166: The test function test_get_command_long_runner_class_mixed effectively asserts the command generation using the run.php wrapper for a long-running script with mixed case class name. Excellent handling of case variations in script names.


169-174: The test function test_get_command_class_norunner correctly asserts the command generation without using the run.php wrapper for a class script. This is important for testing direct script executions.


177-182: The test function test_get_command_wiki_typo effectively tests command generation with a typo in the wiki name. Good coverage of error handling in wiki name inputs.


185-190: The test function test_get_command_nowiki accurately tests command generation without specifying a wiki. This is crucial for testing default behaviours.


193-198: The test function test_get_command_longextension correctly asserts the command generation for a long extension name. Excellent coverage of long name handling.

tests/test_mwdeploy.py (5)

27-31: The test function test_get_change_tag_map correctly asserts the retrieval and structure of the change tag map. Well implemented.


33-39: The test function test_get_changed_files effectively uses mocking to simulate the retrieval of changed files. Good use of os.popen mocking.


41-55: The test function test_get_changed_files_type accurately categorises changed files by type using mocked data. Excellent coverage of file type differentiation.


57-63: The test function test_get_change_tags correctly asserts the retrieval of change tags from changed files. This is crucial for ensuring accurate tag generation.


70-90: The test function test_get_valid_extensions effectively uses mocking to test the retrieval of valid extensions for given versions. Good use of os.path.isdir and os.scandir mocking.

miraheze/mediawiki/mwdeploy.py (2)

227-238: Validate the logic in remote_sync_file.

The function remote_sync_file seems to handle file syncing across servers, but the logic for handling hostname and server names could be prone to errors. Ensure that the hostname comparison and path handling are robust and secure.

Verification successful

The logic in the remote_sync_file function appears to be correctly implemented. The function is used in two places within the same file (miraheze/mediawiki/mwdeploy.py), and both usages handle the server list and path syncing appropriately.

  • The function is called with the correct parameters, including time, serverlist, path, force, envinfo, and nolog.
  • The hostname comparison and path handling within the function are consistent with the intended logic.

No issues were found with the logic in remote_sync_file.

Scripts Executed

The following scripts were executed for the analysis:


Script:

#!/bin/bash
# Description: Verify the logic in `remote_sync_file`.

# Test: Search for the function usage. Expect: Correct handling of server list and path syncing.
rg --type python $'remote_sync_file'

Length of output: 66



Script:

#!/bin/bash
# Description: Verify the logic in `remote_sync_file`.

# Test: Search for the function usage. Expect: Correct handling of server list and path syncing.
rg 'remote_sync_file'

Length of output: 595



Script:

#!/bin/bash
# Description: Extract and review the relevant lines of code where `remote_sync_file` is used.

# Extract lines where `remote_sync_file` is used.
rg --context 5 'remote_sync_file' miraheze/mediawiki/mwdeploy.py

Length of output: 1242
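As a defensive option for the hostname-comparison concern raised above, a short-name, case-insensitive comparison is one pattern (a sketch only; `is_local_server` is a hypothetical helper, not the function's actual logic):

```python
import socket
from unittest import mock

def is_local_server(server: str) -> bool:
    # Compare only the short host name, case-insensitively, so that
    # 'MW101.wikitide.net' and 'mw101' identify the same machine.
    local = socket.gethostname().split('.')[0].casefold()
    return server.split('.')[0].casefold() == local

# Demonstration with a patched hostname (no real lookup needed):
with mock.patch('socket.gethostname', return_value='mw101.wikitide.net'):
    assert is_local_server('MW101')
    assert not is_local_server('mw102.wikitide.net')
```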


618-646: Validate the logic in UpgradeExtensionsAction and UpgradeSkinsAction.

These actions are crucial for handling upgrades, but their dependency on the correct order of arguments (--versions before --upgrade-extensions or --upgrade-skins) could lead to user errors. Consider refactoring to reduce dependency on argument order or improve error messages.

Verification successful

The logic in UpgradeExtensionsAction and UpgradeSkinsAction appears to be correctly handling the validation of versions and extensions/skins. The error messages are clear about the requirement of the --versions argument and the validity of the extensions/skins.

  • miraheze/mediawiki/mwdeploy.py:
    • parser.add_argument('--upgrade-extensions', dest='upgrade_extensions', action=UpgradeExtensionsAction, help='extension(s) to upgrade')
    • parser.add_argument('--upgrade-skins', dest='upgrade_skins', action=UpgradeSkinsAction, help='skin(s) to upgrade')
Scripts Executed

The following scripts were executed for the analysis:


Script:

#!/bin/bash
# Description: Verify the logic in `UpgradeExtensionsAction` and `UpgradeSkinsAction`.

# Test: Search for the actions usage. Expect: Correct handling of version and skin/extension validation.
rg --type python $'UpgradeExtensionsAction|UpgradeSkinsAction'

Length of output: 94



Script:

#!/bin/bash
# Description: Verify the logic in `UpgradeExtensionsAction` and `UpgradeSkinsAction`.

# Test: Search for the actions usage. Expect: Correct handling of version and skin/extension validation.
rg --type py 'UpgradeExtensionsAction|UpgradeSkinsAction'

Length of output: 572
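The ordering dependency exists because a custom `argparse.Action` reads `namespace.versions` at parse time, so `--versions` must already have been seen. A sketch of the order-independent alternative, validating after `parse_args()` (simplified flags, not the deploy script's full interface):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--versions', nargs='+')
parser.add_argument('--upgrade-extensions', dest='upgrade_extensions', nargs='+')

# --versions may now appear before or after --upgrade-extensions:
args = parser.parse_args(['--upgrade-extensions', 'Cite', '--versions', '1.41'])
if args.upgrade_extensions and not args.versions:
    parser.error('--upgrade-extensions requires --versions')
assert args.versions == ['1.41'] and args.upgrade_extensions == ['Cite']
```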

Comment on lines +3 to +6
from miraheze.puppet import disable_puppet, enable_puppet
disable_puppet.disable_puppet()
input('press enter to re-enable puppet')
enable_puppet.enable_puppet()
Contributor

Consider handling exceptions for external calls.

The script directly calls disable_puppet() and enable_puppet() without any exception handling. If these functions fail (e.g., due to permission issues or network problems), the script will crash. Consider wrapping these calls in a try-except block to handle potential errors gracefully.
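A minimal sketch of that error handling, with the callables injected so the pattern is testable (`cycle_safely` is a hypothetical wrapper, not the script's actual function):

```python
import sys
from typing import Callable

def cycle_safely(disable: Callable[[], None], enable: Callable[[], None],
                 prompt: Callable[[str], object] = input) -> None:
    # Report any failure while Puppet is disabled, but always attempt to
    # re-enable it via the finally block.
    try:
        disable()
        prompt('Puppet is disabled; press enter to re-enable it')
    except Exception as exc:
        print(f'Error while Puppet was disabled: {exc}', file=sys.stderr)
    finally:
        enable()
```

The script would then call something like `cycle_safely(disable_puppet.disable_puppet, enable_puppet.enable_puppet)`.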

Comment on lines +7 to +9
system('sudo puppet agent --enable')
system('sudo puppet agent -tv')
system('logsalmsg enabled puppet')
Contributor

coderabbitai bot May 18, 2024

Use subprocess instead of os.system for better security and control.

Using os.system for executing shell commands is less secure and offers less control than using subprocess. It's recommended to use subprocess.run with explicit arguments to avoid shell injection vulnerabilities.

Contributor Author

A shell injection is impossible here

Contributor

@RhinosF1, while it may seem that shell injection is not a concern with the current commands, using os.system is generally discouraged due to several reasons:

  1. Security: Even if the current commands are safe, using os.system can lead to security vulnerabilities if the commands are ever modified to include user input or other dynamic content. It's a best practice to use subprocess.run to mitigate potential risks.

  2. Error Handling: subprocess.run provides better control over error handling. You can capture the output, check return codes, and handle exceptions more gracefully.

  3. Maintainability: Using subprocess.run makes the code more readable and maintainable. It clearly separates the command and its arguments, reducing the risk of errors.

Here's how you can refactor the enable_puppet function to use subprocess.run:

import subprocess

def enable_puppet() -> None:
    subprocess.run(['sudo', 'puppet', 'agent', '--enable'], check=True)
    subprocess.run(['sudo', 'puppet', 'agent', '-tv'], check=True)
    subprocess.run(['logsalmsg', 'enabled', 'puppet'], check=True)

This approach ensures that each command is executed securely and any errors are properly handled.

Let me know if you need further assistance or have any questions!

miraheze/puppet/enable_puppet.py (review comment resolved)
Comment on lines +8 to +10
system(f'logsalmsg Disabling puppet for {sys.argv[1]}')
system('sudo puppet agent -tv')
system(f'sudo puppet agent --disable "{sys.argv[1]}"')
Contributor

coderabbitai bot May 18, 2024

Validate command line arguments before use.

The script uses sys.argv[1] directly in system calls, which can lead to security vulnerabilities if the input is not validated. Consider validating or sanitising the input before use to prevent potential command injection attacks.


Use subprocess for system calls.

Replace os.system calls with subprocess.run for better security and control. This change also allows for better error handling and output capturing.

Contributor Author

Probably should move to argparse and --reason

Contributor

@RhinosF1, I have updated the script to use argparse for argument parsing and subprocess.run for system calls. The --reason argument is now validated and used in the script. Here is the updated version:

#!/usr/bin/python3

import argparse
import subprocess

def disable_puppet(reason: str) -> None:
    subprocess.run(['logsalmsg', f'Disabling puppet for {reason}'], check=True)
    subprocess.run(['sudo', 'puppet', 'agent', '-tv'], check=True)
    subprocess.run(['sudo', 'puppet', 'agent', '--disable', reason], check=True)

def main():
    parser = argparse.ArgumentParser(description='Disable Puppet agent with a reason.')
    parser.add_argument('--reason', required=True, help='Reason for disabling Puppet')
    args = parser.parse_args()

    disable_puppet(args.reason)

if __name__ == '__main__':
    main()

This script now uses argparse to handle the --reason argument and subprocess.run for executing system commands securely.


New Learnings Added
User: RhinosF1
PR: miraheze/python-functions#4
File: miraheze/puppet/disable_puppet.py:8-10
Timestamp: 2024-05-18T16:56:10.426Z
Learning: In the `disable_puppet.py` script, use `argparse` for argument parsing and `subprocess.run` for system calls to improve security and error handling.

Note: Learnings are effective only in the context of specific code segments. To apply general coding guidelines, please configure review instructions.

miraheze/mediawiki/check_read_only.py (review comment resolved)
Comment on lines +72 to +78
def get_valid_extensions(versions: list[str]) -> list[str]:
    valid_extensions = []
    for version in versions:
        extensions_path = f'/srv/mediawiki-staging/{version}/extensions/'
        with os.scandir(extensions_path) as extensions:
            valid_extensions += [extension.name for extension in extensions if os.path.isdir(os.path.join(extension.path, '.git'))]
    return sorted(valid_extensions)
Contributor

Optimise the directory scanning in get_valid_extensions.

Using os.scandir() is efficient, but you might consider filtering directories directly in the comprehension to avoid unnecessary list operations.

- valid_extensions += [extension.name for extension in extensions if os.path.isdir(os.path.join(extension.path, '.git'))]
+ valid_extensions.extend(extension.name for extension in extensions if extension.is_dir() and '.git' in os.listdir(extension.path))

Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.

Suggested change
def get_valid_extensions(versions: list[str]) -> list[str]:
    valid_extensions = []
    for version in versions:
        extensions_path = f'/srv/mediawiki-staging/{version}/extensions/'
        with os.scandir(extensions_path) as extensions:
            valid_extensions += [extension.name for extension in extensions if os.path.isdir(os.path.join(extension.path, '.git'))]
    return sorted(valid_extensions)
def get_valid_extensions(versions: list[str]) -> list[str]:
    valid_extensions = []
    for version in versions:
        extensions_path = f'/srv/mediawiki-staging/{version}/extensions/'
        with os.scandir(extensions_path) as extensions:
            valid_extensions.extend(extension.name for extension in extensions if extension.is_dir() and '.git' in os.listdir(extension.path))
    return sorted(valid_extensions)

Comment on lines +109 to +125
def get_change_tag_map() -> dict[re.Pattern, str]:
    build_regex = re.compile(r'^.*?(\.github/.*?|\.phan/.*?|tests/.*?|composer(\.json|\.lock)|package(-lock)?\.json|yarn\.lock|(\.phpcs|\.stylelintrc|\.eslintrc|\.prettierrc|\.stylelintignore|\.eslintignore|\.prettierignore|tsconfig)\.json|\.nvmrc|\.svgo\.config\.js|Gruntfile\.js|bundlesize\.config\.json|jsdoc\.json)$')
    codechange_regex = re.compile(
        rf'(?!.*{build_regex.pattern})'
        r'^.*?(\.(php|js|css|less|scss|vue|lua|mustache|d\.ts)|extension(-repo|-client)?\.json|skin\.json)$',
    )
    schema_regex = re.compile(
        rf'(?!.*{build_regex.pattern})'
        r'^.*?\.sql$',
    )
    i18n_regex = re.compile(r'^.*?i18n/.*?\.json$')
    return {
        codechange_regex: 'code change',
        schema_regex: 'schema change',
        build_regex: 'build',
        i18n_regex: 'i18n',
    }
Contributor

Refactor the regex compilation in get_change_tag_map for efficiency.

Compiling regex patterns every time the function is called can be inefficient. Consider moving the regex compilation to the global scope or using a caching mechanism.

- build_regex = re.compile(r'^.*?(\.github/.*?|\.phan/.*?|tests/.*?|composer(\.json|\.lock)|package(-lock)?\.json|yarn\.lock|(\.phpcs|\.stylelintrc|\.eslintrc|\.prettierrc|\.stylelintignore|\.eslintignore|\.prettierignore|tsconfig)\.json|\.nvmrc|\.svgo\.config\.js|Gruntfile\.js|bundlesize\.config\.json|jsdoc\.json)$')
+ global build_regex
+ build_regex = re.compile(r'^.*?(\.github/.*?|\.phan/.*?|tests/.*?|composer(\.json|\.lock)|package(-lock)?\.json|yarn\.lock|(\.phpcs|\.stylelintrc|\.eslintrc|\.prettierrc|\.stylelintignore|\.eslintignore|\.prettierignore|tsconfig)\.json|\.nvmrc|\.svgo\.config\.js|Gruntfile\.js|bundlesize\.config\.json|jsdoc\.json)$')

Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.

Suggested change
def get_change_tag_map() -> dict[re.Pattern, str]:
    build_regex = re.compile(r'^.*?(\.github/.*?|\.phan/.*?|tests/.*?|composer(\.json|\.lock)|package(-lock)?\.json|yarn\.lock|(\.phpcs|\.stylelintrc|\.eslintrc|\.prettierrc|\.stylelintignore|\.eslintignore|\.prettierignore|tsconfig)\.json|\.nvmrc|\.svgo\.config\.js|Gruntfile\.js|bundlesize\.config\.json|jsdoc\.json)$')
    codechange_regex = re.compile(
        rf'(?!.*{build_regex.pattern})'
        r'^.*?(\.(php|js|css|less|scss|vue|lua|mustache|d\.ts)|extension(-repo|-client)?\.json|skin\.json)$',
    )
    schema_regex = re.compile(
        rf'(?!.*{build_regex.pattern})'
        r'^.*?\.sql$',
    )
    i18n_regex = re.compile(r'^.*?i18n/.*?\.json$')
    return {
        codechange_regex: 'code change',
        schema_regex: 'schema change',
        build_regex: 'build',
        i18n_regex: 'i18n',
    }
global build_regex
build_regex = re.compile(r'^.*?(\.github/.*?|\.phan/.*?|tests/.*?|composer(\.json|\.lock)|package(-lock)?\.json|yarn\.lock|(\.phpcs|\.stylelintrc|\.eslintrc|\.prettierrc|\.stylelintignore|\.eslintignore|\.prettierignore|tsconfig)\.json|\.nvmrc|\.svgo\.config\.js|Gruntfile\.js|bundlesize\.config\.json|jsdoc\.json)$')
def get_change_tag_map() -> dict[re.Pattern, str]:
    codechange_regex = re.compile(
        rf'(?!.*{build_regex.pattern})'
        r'^.*?(\.(php|js|css|less|scss|vue|lua|mustache|d\.ts)|extension(-repo|-client)?\.json|skin\.json)$',
    )
    schema_regex = re.compile(
        rf'(?!.*{build_regex.pattern})'
        r'^.*?\.sql$',
    )
    i18n_regex = re.compile(r'^.*?i18n/.*?\.json$')
    return {
        codechange_regex: 'code change',
        schema_regex: 'schema change',
        build_regex: 'build',
        i18n_regex: 'i18n',
    }
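A caching alternative to the module-level global shown above is to memoise the whole map with `functools.lru_cache`, so the patterns compile once on first use (a sketch with a trimmed pattern set, not the deploy script's full regexes):

```python
import functools
import re

@functools.lru_cache(maxsize=1)
def get_change_tag_map() -> dict[re.Pattern, str]:
    # Compiled once on the first call; subsequent calls return the same
    # cached dict without re-running re.compile.
    build_regex = re.compile(r'^.*?(\.github/.*?|tests/.*?|composer(\.json|\.lock))$')
    i18n_regex = re.compile(r'^.*?i18n/.*?\.json$')
    return {build_regex: 'build', i18n_regex: 'i18n'}
```

Note that the cached dict is a shared object, so callers must treat it as read-only.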

Comment on lines +178 to +224
def check_up(nolog: bool, Debug: str | None = None, Host: str | None = None, domain: str = 'meta.miraheze.org', verify: bool = True, force: bool = False, port: int = 443) -> bool:
    if verify is False:
        os.environ['PYTHONWARNINGS'] = 'ignore:Unverified HTTPS request'
    if not Debug and not Host:
        raise Exception('Host or Debug must be specified')

    headers = {}
    if Debug:
        server = f'{Debug}.wikitide.net'
        headers['X-WikiTide-Debug'] = server
        location = f'{domain}@{server}'

        debug_access_key = os.getenv('DEBUG_ACCESS_KEY')

        # Check if DEBUG_ACCESS_KEY is set and add it to headers
        if debug_access_key:
            headers['X-WikiTide-Debug-Access-Key'] = debug_access_key
    else:
        os.environ['NO_PROXY'] = 'localhost'
        domain = 'localhost'
        headers['host'] = f'{Host}'
        location = f'{Host}@{domain}'
    up = False
    if port == 443:
        proto = 'https://'
    else:
        proto = 'http://'
    req = requests.get(f'{proto}{domain}:{port}/w/api.php?action=query&meta=siteinfo&formatversion=2&format=json', headers=headers, verify=verify)
    if req.status_code == 200 and 'miraheze' in req.text and (Debug is None or Debug in req.headers['X-Served-By']):
        up = True
    if not up:
        print(f'Status: {req.status_code}')
        print(f'Text: {"miraheze" in req.text} \n {req.text}')
        if 'X-Served-By' not in req.headers:
            req.headers['X-Served-By'] = 'None'
        print(f'Debug: {(Debug is None or Debug in req.headers["X-Served-By"])}')
        if force:
            print(f'Ignoring canary check error on {location} due to --force')
        else:
            print(f'Canary check failed for {location}. Aborting... - use --force to proceed')
            message = f'/usr/local/bin/logsalmsg DEPLOY ABORTED: Canary check failed for {location}'
            if nolog:
                print(message)
            else:
                os.system(message)
            sys.exit(3)
    return up
Contributor

coderabbitai bot May 18, 2024

Review the error handling and user feedback in check_up.

The function check_up could improve its error handling and user feedback. The current implementation might confuse users with its mixed use of print statements and system exits. Consider using a more structured approach to logging and error reporting.

- print(f'Canary check failed for {location}. Aborting... - use --force to proceed')
+ logger.error(f'Canary check failed for {location}. Aborting... - use --force to proceed')

Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation.

Suggested change
def check_up(nolog: bool, Debug: str | None = None, Host: str | None = None, domain: str = 'meta.miraheze.org', verify: bool = True, force: bool = False, port: int = 443) -> bool:
    if verify is False:
        os.environ['PYTHONWARNINGS'] = 'ignore:Unverified HTTPS request'
    if not Debug and not Host:
        raise Exception('Host or Debug must be specified')
    headers = {}
    if Debug:
        server = f'{Debug}.wikitide.net'
        headers['X-WikiTide-Debug'] = server
        location = f'{domain}@{server}'
        debug_access_key = os.getenv('DEBUG_ACCESS_KEY')
        # Check if DEBUG_ACCESS_KEY is set and add it to headers
        if debug_access_key:
            headers['X-WikiTide-Debug-Access-Key'] = debug_access_key
    else:
        os.environ['NO_PROXY'] = 'localhost'
        domain = 'localhost'
        headers['host'] = f'{Host}'
        location = f'{Host}@{domain}'
    up = False
    if port == 443:
        proto = 'https://'
    else:
        proto = 'http://'
    req = requests.get(f'{proto}{domain}:{port}/w/api.php?action=query&meta=siteinfo&formatversion=2&format=json', headers=headers, verify=verify)
    if req.status_code == 200 and 'miraheze' in req.text and (Debug is None or Debug in req.headers['X-Served-By']):
        up = True
    if not up:
        print(f'Status: {req.status_code}')
        print(f'Text: {"miraheze" in req.text} \n {req.text}')
        if 'X-Served-By' not in req.headers:
            req.headers['X-Served-By'] = 'None'
        print(f'Debug: {(Debug is None or Debug in req.headers["X-Served-By"])}')
        if force:
            print(f'Ignoring canary check error on {location} due to --force')
        else:
            print(f'Canary check failed for {location}. Aborting... - use --force to proceed')
            message = f'/usr/local/bin/logsalmsg DEPLOY ABORTED: Canary check failed for {location}'
            if nolog:
                print(message)
            else:
                os.system(message)
            sys.exit(3)
    return up

def check_up(nolog: bool, Debug: str | None = None, Host: str | None = None, domain: str = 'meta.miraheze.org', verify: bool = True, force: bool = False, port: int = 443) -> bool:
    if verify is False:
        os.environ['PYTHONWARNINGS'] = 'ignore:Unverified HTTPS request'
    if not Debug and not Host:
        raise Exception('Host or Debug must be specified')
    headers = {}
    if Debug:
        server = f'{Debug}.wikitide.net'
        headers['X-WikiTide-Debug'] = server
        location = f'{domain}@{server}'
        debug_access_key = os.getenv('DEBUG_ACCESS_KEY')
        # Check if DEBUG_ACCESS_KEY is set and add it to headers
        if debug_access_key:
            headers['X-WikiTide-Debug-Access-Key'] = debug_access_key
    else:
        os.environ['NO_PROXY'] = 'localhost'
        domain = 'localhost'
        headers['host'] = f'{Host}'
        location = f'{Host}@{domain}'
    up = False
    if port == 443:
        proto = 'https://'
    else:
        proto = 'http://'
    req = requests.get(f'{proto}{domain}:{port}/w/api.php?action=query&meta=siteinfo&formatversion=2&format=json', headers=headers, verify=verify)
    if req.status_code == 200 and 'miraheze' in req.text and (Debug is None or Debug in req.headers['X-Served-By']):
        up = True
    if not up:
        print(f'Status: {req.status_code}')
        print(f'Text: {"miraheze" in req.text} \n {req.text}')
        if 'X-Served-By' not in req.headers:
            req.headers['X-Served-By'] = 'None'
        print(f'Debug: {(Debug is None or Debug in req.headers["X-Served-By"])}')
        if force:
            print(f'Ignoring canary check error on {location} due to --force')
        else:
            logger.error(f'Canary check failed for {location}. Aborting... - use --force to proceed')
            message = f'/usr/local/bin/logsalmsg DEPLOY ABORTED: Canary check failed for {location}'
            if nolog:
                print(message)
            else:
                os.system(message)
            sys.exit(3)
    return up

Contributor Author

need to import logger too

Contributor

@RhinosF1, thank you for pointing that out. Here is the updated suggestion with the import statement for the logger and basic configuration:

import logging

# Set up basic configuration for the logger
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def check_up(nolog: bool, Debug: str | None = None, Host: str | None = None, domain: str = 'meta.miraheze.org', verify: bool = True, force: bool = False, port: int = 443) -> bool:
    if verify is False:
        os.environ['PYTHONWARNINGS'] = 'ignore:Unverified HTTPS request'
    if not Debug and not Host:
        raise Exception('Host or Debug must be specified')

    headers = {}
    if Debug:
        server = f'{Debug}.wikitide.net'
        headers['X-WikiTide-Debug'] = server
        location = f'{domain}@{server}'

        debug_access_key = os.getenv('DEBUG_ACCESS_KEY')

        # Check if DEBUG_ACCESS_KEY is set and add it to headers
        if debug_access_key:
            headers['X-WikiTide-Debug-Access-Key'] = debug_access_key
    else:
        os.environ['NO_PROXY'] = 'localhost'
        domain = 'localhost'
        headers['host'] = f'{Host}'
        location = f'{Host}@{domain}'
    up = False
    if port == 443:
        proto = 'https://'
    else:
        proto = 'http://'
    req = requests.get(f'{proto}{domain}:{port}/w/api.php?action=query&meta=siteinfo&formatversion=2&format=json', headers=headers, verify=verify)
    if req.status_code == 200 and 'miraheze' in req.text and (Debug is None or Debug in req.headers['X-Served-By']):
        up = True
    if not up:
        print(f'Status: {req.status_code}')
        print(f'Text: {"miraheze" in req.text} \n {req.text}')
        if 'X-Served-By' not in req.headers:
            req.headers['X-Served-By'] = 'None'
        print(f'Debug: {(Debug is None or Debug in req.headers["X-Served-By"])}')
        if force:
            print(f'Ignoring canary check error on {location} due to --force')
        else:
            logger.error(f'Canary check failed for {location}. Aborting... - use --force to proceed')
            message = f'/usr/local/bin/logsalmsg DEPLOY ABORTED: Canary check failed for {location}'
            if nolog:
                print(message)
            else:
                os.system(message)
            sys.exit(3)
    return up

This should ensure that the logging is properly handled. Let me know if you need any further adjustments!

Contributor Author

imports go at top of file

Contributor

@RhinosF1, thank you for the feedback. Here is the updated suggestion with the import statements moved to the top of the file:

import logging
import os
import requests
import socket
import sys
import time

# Set up basic configuration for the logger
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def check_up(nolog: bool, Debug: str | None = None, Host: str | None = None, domain: str = 'meta.miraheze.org', verify: bool = True, force: bool = False, port: int = 443) -> bool:
    if verify is False:
        os.environ['PYTHONWARNINGS'] = 'ignore:Unverified HTTPS request'
    if not Debug and not Host:
        raise Exception('Host or Debug must be specified')

    headers = {}
    if Debug:
        server = f'{Debug}.wikitide.net'
        headers['X-WikiTide-Debug'] = server
        location = f'{domain}@{server}'

        debug_access_key = os.getenv('DEBUG_ACCESS_KEY')

        # Check if DEBUG_ACCESS_KEY is set and add it to headers
        if debug_access_key:
            headers['X-WikiTide-Debug-Access-Key'] = debug_access_key
    else:
        os.environ['NO_PROXY'] = 'localhost'
        domain = 'localhost'
        headers['host'] = f'{Host}'
        location = f'{Host}@{domain}'
    up = False
    if port == 443:
        proto = 'https://'
    else:
        proto = 'http://'
    req = requests.get(f'{proto}{domain}:{port}/w/api.php?action=query&meta=siteinfo&formatversion=2&format=json', headers=headers, verify=verify)
    if req.status_code == 200 and 'miraheze' in req.text and (Debug is None or Debug in req.headers['X-Served-By']):
        up = True
    if not up:
        print(f'Status: {req.status_code}')
        print(f'Text: {"miraheze" in req.text} \n {req.text}')
        if 'X-Served-By' not in req.headers:
            req.headers['X-Served-By'] = 'None'
        print(f'Debug: {(Debug is None or Debug in req.headers["X-Served-By"])}')
        if force:
            print(f'Ignoring canary check error on {location} due to --force')
        else:
            logger.error(f'Canary check failed for {location}. Aborting... - use --force to proceed')
            message = f'/usr/local/bin/logsalmsg DEPLOY ABORTED: Canary check failed for {location}'
            if nolog:
                print(message)
            else:
                os.system(message)
            sys.exit(3)
    return up

This should ensure that the imports are correctly placed at the top of the file. Let me know if you need any further adjustments!


New Learnings Added
User: RhinosF1
PR: miraheze/python-functions#4
File: miraheze/mediawiki/mwdeploy.py:178-224
Timestamp: 2024-05-18T17:25:44.011Z
Learning: When suggesting changes to import statements, ensure they are placed at the top of the file as per best practices.

Note: Learnings are effective only in the context of specific code segments. To apply general coding guidelines, please configure review instructions.

Comment on lines +252 to +269
def _construct_rsync_command(time: bool | str, dest: str, recursive: bool = True, local: bool = True, location: str | None = None, server: str | None = None) -> str:
    if time:
        params = '--inplace'
    else:
        params = '--update'
    if recursive:
        params = params + ' -r --delete'
    if local:
        if location is None:
            raise Exception('Location must be specified for local rsync.')
        return f'sudo -u {DEPLOYUSER} rsync {params} --exclude=".*" {location} {dest}'
    if location is None:
        location = dest
    if location == dest and server:  # ignore location if not specified, if given must equal dest.
        return f'sudo -u {DEPLOYUSER} rsync {params} -e "ssh -i /srv/mediawiki-staging/deploykey" {dest} {DEPLOYUSER}@{server}.wikitide.net:{dest}'
    # a return None here would be dangerous - except and ignore R503 as return after Exception is not reachable
    raise Exception(f'Error constructing command. Either server was missing or {location} != {dest}')

Contributor

Simplify the construction of rsync commands in _construct_rsync_command.

The function _construct_rsync_command is quite complex and hard to follow. Consider refactoring it to improve readability and maintainability.

- if local:
-     if location is None:
-         raise Exception('Location must be specified for local rsync.')
-     return f'sudo -u {DEPLOYUSER} rsync {params} --exclude=".*" {location} {dest}'
- if location is None:
-     location = dest
- if location == dest and server:  # ignore location if not specified, if given must equal dest.
-     return f'sudo -u {DEPLOYUSER} rsync {params} -e "ssh -i /srv/mediawiki-staging/deploykey" {dest} {DEPLOYUSER}@{server}.wikitide.net:{dest}'
+ simplified_command = f'sudo -u {DEPLOYUSER} rsync {params} --exclude=".*" '
+ if local:
+     if location is None:
+         raise Exception('Location must be specified for local rsync.')
+     return simplified_command + f'{location} {dest}'
+ if location is None:
+     location = dest
+ if location == dest and server:
+     return simplified_command + f'-e "ssh -i /srv/mediawiki-staging/deploykey" {dest} {DEPLOYUSER}@{server}.wikitide.net:{dest}'

Committable suggestion


Suggested change
def _construct_rsync_command(time: bool | str, dest: str, recursive: bool = True, local: bool = True, location: str | None = None, server: str | None = None) -> str:
    if time:
        params = '--inplace'
    else:
        params = '--update'
    if recursive:
        params = params + ' -r --delete'
    if local:
        if location is None:
            raise Exception('Location must be specified for local rsync.')
        return f'sudo -u {DEPLOYUSER} rsync {params} --exclude=".*" {location} {dest}'
    if location is None:
        location = dest
    if location == dest and server:  # ignore location if not specified, if given must equal dest.
        return f'sudo -u {DEPLOYUSER} rsync {params} -e "ssh -i /srv/mediawiki-staging/deploykey" {dest} {DEPLOYUSER}@{server}.wikitide.net:{dest}'
    # a return None here would be dangerous - except and ignore R503 as return after Exception is not reachable
    raise Exception(f'Error constructing command. Either server was missing or {location} != {dest}')

def _construct_rsync_command(time: bool | str, dest: str, recursive: bool = True, local: bool = True, location: str | None = None, server: str | None = None) -> str:
    if time:
        params = '--inplace'
    else:
        params = '--update'
    if recursive:
        params = params + ' -r --delete'
    simplified_command = f'sudo -u {DEPLOYUSER} rsync {params} --exclude=".*" '
    if local:
        if location is None:
            raise Exception('Location must be specified for local rsync.')
        return simplified_command + f'{location} {dest}'
    if location is None:
        location = dest
    if location == dest and server:
        return simplified_command + f'-e "ssh -i /srv/mediawiki-staging/deploykey" {dest} {DEPLOYUSER}@{server}.wikitide.net:{dest}'
    raise Exception(f'Error constructing command. Either server was missing or {location} != {dest}')
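A runnable sketch of the simplified builder from this suggestion, for inspection in isolation. `DEPLOYUSER` and the paths are assumptions here (in mwdeploy.py `DEPLOYUSER` is a module-level constant). Note one behavioural point worth reviewing: the refactor applies `--exclude=".*"` to the remote branch as well, which the original did not.

```python
# Hypothetical value; the real constant lives in mwdeploy.py.
DEPLOYUSER = 'www-data'


def construct_rsync_command(time, dest, recursive=True, local=True, location=None, server=None):
    # Mirrors the suggested simplification: build the shared prefix once.
    params = '--inplace' if time else '--update'
    if recursive:
        params += ' -r --delete'
    base = f'sudo -u {DEPLOYUSER} rsync {params} --exclude=".*" '
    if local:
        if location is None:
            raise Exception('Location must be specified for local rsync.')
        return base + f'{location} {dest}'
    if location is None:
        location = dest
    if location == dest and server:
        return base + f'-e "ssh -i /srv/mediawiki-staging/deploykey" {dest} {DEPLOYUSER}@{server}.wikitide.net:{dest}'
    raise Exception(f'Error constructing command. Either server was missing or {location} != {dest}')


local_cmd = construct_rsync_command(time=False, dest='/srv/mediawiki/w/', location='/srv/mediawiki-staging/w/')
remote_cmd = construct_rsync_command(time=False, dest='/srv/mediawiki/w/', local=False, server='mw151')
print(local_cmd)
print(remote_cmd)
```

Exercising both branches like this makes the behavioural difference easy to spot before committing the suggestion.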

Comment on lines +303 to +382
def run(args: argparse.Namespace, start: float) -> None:  # pragma: no cover
    loginfo = {}
    for arg in vars(args).items():
        if arg[1] is not None and arg[1] is not False:
            if isinstance(arg[1], list) and len(arg[1]) == 1:
                loginfo[arg[0]] = arg[1][0]
            else:
                loginfo[arg[0]] = arg[1]

    if args.upgrade_world and not args.reset_world:
        args.world = True
        args.pull = 'world'
        args.l10n = True
        args.ignore_time = True
        args.extension_list = True
        args.upgrade_vendor = True
        args.upgrade_extensions = get_valid_extensions(args.versions)
        args.upgrade_skins = get_valid_skins(args.versions)

    if len(args.servers) > 1 and args.servers == get_environment_info()['servers']:
        loginfo['servers'] = 'all'

    use_version = args.world or args.l10n or args.extension_list or args.reset_world or args.upgrade_extensions or args.upgrade_skins or args.upgrade_vendor

    if args.versions:
        if args.upgrade_extensions == get_valid_extensions(args.versions):
            loginfo['upgrade_extensions'] = 'all'

        if args.upgrade_skins == get_valid_skins(args.versions):
            loginfo['upgrade_skins'] = 'all'

        if args.upgrade_pack:
            del loginfo['upgrade_extensions']
            del loginfo['upgrade_skins']

        if not use_version:
            del loginfo['versions']

    synced = loginfo['servers']
    del loginfo['servers']

    text = f'starting deploy of "{str(loginfo)}" to {synced}'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {text}')
    else:
        print(text)

    exitcodes = run_process(args=args)
    failed = non_zero_code(ec=exitcodes, leave=False)

    fintext = f'finished deploy of "{str(loginfo)}" to {synced}'

    if failed:
        fintext += f' - FAIL: {exitcodes}'
        if not args.nolog:
            os.system(f'/usr/local/bin/logsalmsg {fintext}')
        else:
            print(fintext)
        sys.exit(1)

    if use_version:
        for version in args.versions:
            exitcodes = run_process(args=args, version=version)
            failed = non_zero_code(ec=exitcodes, leave=False)

            if failed:
                fintext += f' - FAIL: {exitcodes}'
                if not args.nolog:
                    os.system(f'/usr/local/bin/logsalmsg {fintext}')
                else:
                    print(fintext)
                sys.exit(1)

    fintext += ' - SUCCESS'
    fintext += f' in {str(int(time.time() - start))}s'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {fintext}')
    else:
        print(fintext)

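The failure path above hinges on the `non_zero_code` helper and the `loginfo` filtering loop. Both can be sketched and exercised in isolation; the two definitions below are hypothetical minimal stand-ins for illustration, and the real helpers in mwdeploy.py may differ.

```python
import argparse


def non_zero_code(ec, leave=True):
    # True if any exit code in the list is non-zero (a failed sub-process).
    return any(code != 0 for code in ec)


def build_loginfo(args):
    # Mirrors the filtering loop at the top of run(): drop None/False values
    # and unwrap single-element lists.
    loginfo = {}
    for key, value in vars(args).items():
        if value is not None and value is not False:
            if isinstance(value, list) and len(value) == 1:
                loginfo[key] = value[0]
            else:
                loginfo[key] = value
    return loginfo


args = argparse.Namespace(world=True, pull=None, servers=['mw151'], nolog=False)
info = build_loginfo(args)
print(info)
print(non_zero_code([0, 0]), non_zero_code([0, 2]))
```

Small pure helpers like these are straightforward to unit-test, which is part of why separating them from the `os.system` logging calls would aid maintainability.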
Contributor
@coderabbitai coderabbitai bot May 18, 2024

Ensure comprehensive logging and error handling in run.

The function run handles a complex deployment process but lacks comprehensive error handling and logging. Consider implementing a more robust logging system and structured error handling to improve reliability and maintainability.

- print(f'finished deploy of "{str(loginfo)}" to {synced}')
+ logger.info(f'finished deploy of "{str(loginfo)}" to {synced}')
- sys.exit(1)
+ raise DeploymentError('Deployment failed due to non-zero exit codes')

Committable suggestion


Suggested change
def run(args: argparse.Namespace, start: float) -> None:  # pragma: no cover
    loginfo = {}
    for arg in vars(args).items():
        if arg[1] is not None and arg[1] is not False:
            if isinstance(arg[1], list) and len(arg[1]) == 1:
                loginfo[arg[0]] = arg[1][0]
            else:
                loginfo[arg[0]] = arg[1]
    if args.upgrade_world and not args.reset_world:
        args.world = True
        args.pull = 'world'
        args.l10n = True
        args.ignore_time = True
        args.extension_list = True
        args.upgrade_vendor = True
        args.upgrade_extensions = get_valid_extensions(args.versions)
        args.upgrade_skins = get_valid_skins(args.versions)
    if len(args.servers) > 1 and args.servers == get_environment_info()['servers']:
        loginfo['servers'] = 'all'
    use_version = args.world or args.l10n or args.extension_list or args.reset_world or args.upgrade_extensions or args.upgrade_skins or args.upgrade_vendor
    if args.versions:
        if args.upgrade_extensions == get_valid_extensions(args.versions):
            loginfo['upgrade_extensions'] = 'all'
        if args.upgrade_skins == get_valid_skins(args.versions):
            loginfo['upgrade_skins'] = 'all'
        if args.upgrade_pack:
            del loginfo['upgrade_extensions']
            del loginfo['upgrade_skins']
        if not use_version:
            del loginfo['versions']
    synced = loginfo['servers']
    del loginfo['servers']
    text = f'starting deploy of "{str(loginfo)}" to {synced}'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {text}')
    else:
        print(text)
    exitcodes = run_process(args=args)
    failed = non_zero_code(ec=exitcodes, leave=False)
    fintext = f'finished deploy of "{str(loginfo)}" to {synced}'
    if failed:
        fintext += f' - FAIL: {exitcodes}'
        if not args.nolog:
            os.system(f'/usr/local/bin/logsalmsg {fintext}')
        else:
            print(fintext)
        sys.exit(1)
    if use_version:
        for version in args.versions:
            exitcodes = run_process(args=args, version=version)
            failed = non_zero_code(ec=exitcodes, leave=False)
            if failed:
                fintext += f' - FAIL: {exitcodes}'
                if not args.nolog:
                    os.system(f'/usr/local/bin/logsalmsg {fintext}')
                else:
                    print(fintext)
                sys.exit(1)
    fintext += ' - SUCCESS'
    fintext += f' in {str(int(time.time() - start))}s'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {fintext}')
    else:
        print(fintext)

def run(args: argparse.Namespace, start: float) -> None:  # pragma: no cover
    loginfo = {}
    for arg in vars(args).items():
        if arg[1] is not None and arg[1] is not False:
            if isinstance(arg[1], list) and len(arg[1]) == 1:
                loginfo[arg[0]] = arg[1][0]
            else:
                loginfo[arg[0]] = arg[1]
    if args.upgrade_world and not args.reset_world:
        args.world = True
        args.pull = 'world'
        args.l10n = True
        args.ignore_time = True
        args.extension_list = True
        args.upgrade_vendor = True
        args.upgrade_extensions = get_valid_extensions(args.versions)
        args.upgrade_skins = get_valid_skins(args.versions)
    if len(args.servers) > 1 and args.servers == get_environment_info()['servers']:
        loginfo['servers'] = 'all'
    use_version = args.world or args.l10n or args.extension_list or args.reset_world or args.upgrade_extensions or args.upgrade_skins or args.upgrade_vendor
    if args.versions:
        if args.upgrade_extensions == get_valid_extensions(args.versions):
            loginfo['upgrade_extensions'] = 'all'
        if args.upgrade_skins == get_valid_skins(args.versions):
            loginfo['upgrade_skins'] = 'all'
        if args.upgrade_pack:
            del loginfo['upgrade_extensions']
            del loginfo['upgrade_skins']
        if not use_version:
            del loginfo['versions']
    synced = loginfo['servers']
    del loginfo['servers']
    text = f'starting deploy of "{str(loginfo)}" to {synced}'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {text}')
    else:
        print(text)
    exitcodes = run_process(args=args)
    failed = non_zero_code(ec=exitcodes, leave=False)
    fintext = f'finished deploy of "{str(loginfo)}" to {synced}'
    if failed:
        fintext += f' - FAIL: {exitcodes}'
        if not args.nolog:
            os.system(f'/usr/local/bin/logsalmsg {fintext}')
        else:
            print(fintext)
        raise DeploymentError('Deployment failed due to non-zero exit codes')
    if use_version:
        for version in args.versions:
            exitcodes = run_process(args=args, version=version)
            failed = non_zero_code(ec=exitcodes, leave=False)
            if failed:
                fintext += f' - FAIL: {exitcodes}'
                if not args.nolog:
                    os.system(f'/usr/local/bin/logsalmsg {fintext}')
                else:
                    print(fintext)
                raise DeploymentError('Deployment failed due to non-zero exit codes')
    fintext += ' - SUCCESS'
    fintext += f' in {str(int(time.time() - start))}s'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {fintext}')
    else:
        print(fintext)

Contributor Author

no DeploymentError

Contributor

@RhinosF1, understood. Here is the revised error handling for the run function without using DeploymentError:

def run(args: argparse.Namespace, start: float) -> None:  # pragma: no cover
    loginfo = {}
    for arg in vars(args).items():
        if arg[1] is not None and arg[1] is not False:
            if isinstance(arg[1], list) and len(arg[1]) == 1:
                loginfo[arg[0]] = arg[1][0]
            else:
                loginfo[arg[0]] = arg[1]

    if args.upgrade_world and not args.reset_world:
        args.world = True
        args.pull = 'world'
        args.l10n = True
        args.ignore_time = True
        args.extension_list = True
        args.upgrade_vendor = True
        args.upgrade_extensions = get_valid_extensions(args.versions)
        args.upgrade_skins = get_valid_skins(args.versions)

    if len(args.servers) > 1 and args.servers == get_environment_info()['servers']:
        loginfo['servers'] = 'all'

    use_version = args.world or args.l10n or args.extension_list or args.reset_world or args.upgrade_extensions or args.upgrade_skins or args.upgrade_vendor

    if args.versions:
        if args.upgrade_extensions == get_valid_extensions(args.versions):
            loginfo['upgrade_extensions'] = 'all'

        if args.upgrade_skins == get_valid_skins(args.versions):
            loginfo['upgrade_skins'] = 'all'

        if args.upgrade_pack:
            del loginfo['upgrade_extensions']
            del loginfo['upgrade_skins']

        if not use_version:
            del loginfo['versions']

    synced = loginfo['servers']
    del loginfo['servers']

    text = f'starting deploy of "{str(loginfo)}" to {synced}'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {text}')
    else:
        print(text)

    exitcodes = run_process(args=args)
    failed = non_zero_code(ec=exitcodes, leave=False)

    fintext = f'finished deploy of "{str(loginfo)}" to {synced}'

    if failed:
        fintext += f' - FAIL: {exitcodes}'
        if not args.nolog:
            os.system(f'/usr/local/bin/logsalmsg {fintext}')
        else:
            print(fintext)
        sys.exit(1)

    if use_version:
        for version in args.versions:
            exitcodes = run_process(args=args, version=version)
            failed = non_zero_code(ec=exitcodes, leave=False)

            if failed:
                fintext += f' - FAIL: {exitcodes}'
                if not args.nolog:
                    os.system(f'/usr/local/bin/logsalmsg {fintext}')
                else:
                    print(fintext)
                sys.exit(1)

    fintext += ' - SUCCESS'
    fintext += f' in {str(int(time.time() - start))}s'
    if not args.nolog:
        os.system(f'/usr/local/bin/logsalmsg {fintext}')
    else:
        print(fintext)

This version maintains comprehensive logging and uses sys.exit(1) for error handling instead of raising an exception. Let me know if any further adjustments are needed.

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
@RhinosF1 RhinosF1 merged commit 00ba333 into master May 19, 2024
8 checks passed
@RhinosF1 RhinosF1 deleted the RhinosF1-patch-2 branch May 19, 2024 11:07