Tutorial

New updates and improvements to Macfleet.

Important Notice

The code examples and scripts provided in these tutorials are for educational purposes only. Macfleet is not responsible for any issues, damage, or security vulnerabilities that may result from using, modifying, or implementing these examples. Always review and test code in a safe environment before deploying it on production systems.

File and Folder Copy Management on macOS

Efficiently manage file and folder copying operations across your MacFleet deployment with enterprise-grade safety features, integrity verification, and comprehensive audit capabilities. This tutorial transforms basic cp commands into robust data distribution solutions.

Understanding Enterprise File Copy Operations

Enterprise file copying goes beyond basic duplication, requiring:

  • Safety validation to prevent accidental overwrites
  • Integrity verification to ensure data consistency
  • Permission preservation for security compliance
  • Progress monitoring for large operations
  • Rollback capabilities for failed transfers
  • Audit logging for compliance requirements
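The first two requirements can be sketched in a few lines of standard shell before building the full system below (the function name and messages are illustrative, not part of the MacFleet tooling):

```shell
# Minimal safe-copy sketch: refuse to overwrite, preserve metadata, verify bytes.
safe_copy() {
    src="$1"; dst="$2"
    [ -f "$src" ] || { echo "missing source: $src" >&2; return 1; }
    [ -e "$dst" ] && { echo "refusing to overwrite: $dst" >&2; return 1; }
    cp -p "$src" "$dst" || return 1     # -p preserves permissions and timestamps
    # Byte-for-byte comparison as a basic integrity check
    cmp -s "$src" "$dst" || { rm -f "$dst"; echo "verification failed: $dst" >&2; return 1; }
    echo "copied and verified: $dst"
}
```

The enterprise script later in this tutorial layers logging, backups, and policy checks on top of this same core idea.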

Core Copy Operations

Basic File Copy

#!/bin/bash

# Simple file copy with validation
copy_file() {
    local source="$1"
    local destination="$2"
    
    # Validate source exists
    if [[ ! -f "$source" ]]; then
        echo "Error: Source file '$source' not found"
        return 1
    fi
    
    # Create destination directory if needed
    local dest_dir=$(dirname "$destination")
    mkdir -p "$dest_dir"
    
    # Copy file
    if cp "$source" "$destination"; then
        echo "Successfully copied '$source' to '$destination'"
        return 0
    else
        echo "Failed to copy '$source' to '$destination'"
        return 1
    fi
}

# Usage example
# copy_file "/Users/admin/document.pdf" "/Users/shared/documents/document.pdf"

Basic Directory Copy

#!/bin/bash

# Recursive directory copy with verification
copy_directory() {
    local source="$1"
    local destination="$2"
    
    # Validate source directory exists
    if [[ ! -d "$source" ]]; then
        echo "Error: Source directory '$source' not found"
        return 1
    fi
    
    # Copy directory recursively
    if cp -R "$source" "$destination"; then
        echo "Successfully copied directory '$source' to '$destination'"
        return 0
    else
        echo "Failed to copy directory '$source' to '$destination'"
        return 1
    fi
}

# Usage example
# copy_directory "/Users/admin/project" "/Users/shared/projects/"

Enterprise Copy Management System

#!/bin/bash

# MacFleet Enterprise File Copy Management System
# Comprehensive file and folder copying with enterprise features

# Configuration
SCRIPT_NAME="MacFleet Copy Manager"
VERSION="1.0.0"
LOG_FILE="/var/log/macfleet_copy_operations.log"
TEMP_DIR="/tmp/macfleet_copy"
MAX_FILE_SIZE="10G"
ALLOWED_EXTENSIONS=(".pdf" ".docx" ".xlsx" ".pptx" ".txt" ".png" ".jpg" ".gif" ".mp4" ".mov")
RESTRICTED_PATHS=("/System" "/usr/bin" "/usr/sbin" "/private/var")
BUSINESS_HOURS_START=9
BUSINESS_HOURS_END=17

# Create necessary directories
mkdir -p "$TEMP_DIR"
mkdir -p "$(dirname "$LOG_FILE")"

# Logging function
log_operation() {
    local level="$1"
    local message="$2"
    local timestamp=$(date '+%Y-%m-%d %H:%M:%S')
    echo "[$timestamp] [$level] $message" | tee -a "$LOG_FILE"
}

# Check if current time is within business hours
is_business_hours() {
    local current_hour=$(date +%H)
    if [[ $current_hour -ge $BUSINESS_HOURS_START && $current_hour -lt $BUSINESS_HOURS_END ]]; then
        return 0
    else
        return 1
    fi
}

# Validate file extension
is_allowed_extension() {
    local file="$1"
    local basename="${file##*/}"
    # Files without a dot have no extension and are never in the allowed list
    [[ "$basename" == *.* ]] || return 1
    local extension=".${basename##*.}"
    
    for allowed in "${ALLOWED_EXTENSIONS[@]}"; do
        if [[ "$extension" == "$allowed" ]]; then
            return 0
        fi
    done
    return 1
}

# Check if path is restricted
is_restricted_path() {
    local path="$1"
    for restricted in "${RESTRICTED_PATHS[@]}"; do
        if [[ "$path" == "$restricted"* ]]; then
            return 0
        fi
    done
    return 1
}

# Get file size in human readable format
get_file_size() {
    local file="$1"
    if [[ -f "$file" ]]; then
        stat -f%z "$file" 2>/dev/null
    elif [[ -d "$file" ]]; then
        du -sk "$file" 2>/dev/null | awk '{print $1 * 1024}'
    else
        echo "0"
    fi
}

# Convert bytes to human readable format
format_size() {
    local bytes="$1"
    local sizes=("B" "KB" "MB" "GB" "TB")
    local unit=0
    
    while [[ $bytes -ge 1024 && $unit -lt 4 ]]; do
        bytes=$((bytes / 1024))
        ((unit++))
    done
    
    echo "${bytes}${sizes[$unit]}"
}

# Calculate checksum for integrity verification
calculate_checksum() {
    local file="$1"
    if [[ -f "$file" ]]; then
        shasum -a 256 "$file" 2>/dev/null | awk '{print $1}'
    else
        echo ""
    fi
}

# Verify copy integrity
verify_copy_integrity() {
    local source="$1"
    local destination="$2"
    
    if [[ -f "$source" && -f "$destination" ]]; then
        local source_checksum=$(calculate_checksum "$source")
        local dest_checksum=$(calculate_checksum "$destination")
        
        if [[ "$source_checksum" == "$dest_checksum" ]]; then
            log_operation "INFO" "Integrity verification passed for: $destination"
            return 0
        else
            log_operation "ERROR" "Integrity verification failed for: $destination"
            return 1
        fi
    elif [[ -d "$source" && -d "$destination" ]]; then
        # For directories, compare file counts as a lightweight consistency check
        local source_count=$(find "$source" -type f | wc -l)
        local dest_count=$(find "$destination" -type f | wc -l)
        
        if [[ $source_count -eq $dest_count ]]; then
            log_operation "INFO" "Directory integrity verification passed for: $destination"
            return 0
        else
            log_operation "ERROR" "Directory integrity verification failed for: $destination"
            return 1
        fi
    else
        log_operation "ERROR" "Cannot verify integrity - source or destination missing"
        return 1
    fi
}

# Create backup before overwrite
create_backup() {
    local file="$1"
    local backup_dir="/var/backups/macfleet/copy_operations"
    local timestamp=$(date '+%Y%m%d_%H%M%S')
    
    mkdir -p "$backup_dir"
    
    if [[ -e "$file" ]]; then
        local filename=$(basename "$file")
        local backup_file="$backup_dir/${filename}_backup_$timestamp"
        
        if cp -R "$file" "$backup_file"; then
            log_operation "INFO" "Backup created: $backup_file"
            echo "$backup_file"
            return 0
        else
            log_operation "ERROR" "Failed to create backup for: $file"
            return 1
        fi
    fi
}

# Enhanced file copy with enterprise features
enterprise_copy_file() {
    local source="$1"
    local destination="$2"
    local preserve_permissions="${3:-true}"
    local verify_integrity="${4:-true}"
    local create_backup_flag="${5:-true}"
    
    log_operation "INFO" "Starting file copy operation: $source -> $destination"
    
    # Pre-flight validations
    if [[ ! -f "$source" ]]; then
        log_operation "ERROR" "Source file not found: $source"
        return 1
    fi
    
    # Check if source is restricted
    if is_restricted_path "$source"; then
        log_operation "ERROR" "Source path is restricted: $source"
        return 1
    fi
    
    # Check if destination is restricted
    if is_restricted_path "$destination"; then
        log_operation "ERROR" "Destination path is restricted: $destination"
        return 1
    fi
    
    # Validate file extension
    if ! is_allowed_extension "$source"; then
        log_operation "WARNING" "File extension not in allowed list: $source"
    fi
    
    # Check file size
    local file_size=$(get_file_size "$source")
    local max_size_bytes=$(echo "$MAX_FILE_SIZE" | sed 's/G//' | awk '{print $1 * 1024 * 1024 * 1024}')
    
    if [[ $file_size -gt $max_size_bytes ]]; then
        log_operation "ERROR" "File too large: $(format_size $file_size) > $MAX_FILE_SIZE"
        return 1
    fi
    
    # Business hours check for large files (>100MB)
    if [[ $file_size -gt 104857600 ]] && ! is_business_hours; then
        log_operation "WARNING" "Large file copy outside business hours: $(format_size $file_size)"
    fi
    
    # Create destination directory
    local dest_dir=$(dirname "$destination")
    if ! mkdir -p "$dest_dir"; then
        log_operation "ERROR" "Failed to create destination directory: $dest_dir"
        return 1
    fi
    
    # Create backup if destination exists
    local backup_file=""
    if [[ -f "$destination" && "$create_backup_flag" == "true" ]]; then
        backup_file=$(create_backup "$destination")
        if [[ $? -ne 0 ]]; then
            log_operation "ERROR" "Failed to create backup for existing file: $destination"
            return 1
        fi
    fi
    
    # Perform the copy operation
    local copy_options=""
    if [[ "$preserve_permissions" == "true" ]]; then
        copy_options="-p"
    fi
    
    if cp $copy_options "$source" "$destination"; then
        log_operation "INFO" "File copied successfully: $(format_size $file_size)"
        
        # Verify integrity if requested
        if [[ "$verify_integrity" == "true" ]]; then
            if ! verify_copy_integrity "$source" "$destination"; then
                log_operation "ERROR" "Copy integrity verification failed, removing destination"
                rm -f "$destination"
                
                # Restore backup if it exists
                if [[ -n "$backup_file" && -f "$backup_file" ]]; then
                    cp "$backup_file" "$destination"
                    log_operation "INFO" "Restored backup file: $destination"
                fi
                return 1
            fi
        fi
        
        # Log successful operation
        log_operation "SUCCESS" "File copy completed: $source -> $destination"
        return 0
    else
        log_operation "ERROR" "Copy operation failed: $source -> $destination"
        
        # Restore backup if it exists
        if [[ -n "$backup_file" && -f "$backup_file" ]]; then
            cp "$backup_file" "$destination"
            log_operation "INFO" "Restored backup file: $destination"
        fi
        return 1
    fi
}

# Enhanced directory copy with enterprise features
enterprise_copy_directory() {
    local source="$1"
    local destination="$2"
    local preserve_permissions="${3:-true}"
    local verify_integrity="${4:-true}"
    local create_backup_flag="${5:-true}"
    
    log_operation "INFO" "Starting directory copy operation: $source -> $destination"
    
    # Pre-flight validations
    if [[ ! -d "$source" ]]; then
        log_operation "ERROR" "Source directory not found: $source"
        return 1
    fi
    
    # Check if source is restricted
    if is_restricted_path "$source"; then
        log_operation "ERROR" "Source path is restricted: $source"
        return 1
    fi
    
    # Check if destination is restricted
    if is_restricted_path "$destination"; then
        log_operation "ERROR" "Destination path is restricted: $destination"
        return 1
    fi
    
    # Calculate directory size
    local dir_size=$(get_file_size "$source")
    local file_count=$(find "$source" -type f | wc -l)
    
    log_operation "INFO" "Directory stats - Size: $(format_size $dir_size), Files: $file_count"
    
    # Business hours check for large directories (>1GB or >1000 files)
    if [[ $dir_size -gt 1073741824 || $file_count -gt 1000 ]] && ! is_business_hours; then
        log_operation "WARNING" "Large directory copy outside business hours"
    fi
    
    # Create backup if destination exists
    local backup_dir=""
    if [[ -d "$destination" && "$create_backup_flag" == "true" ]]; then
        backup_dir=$(create_backup "$destination")
        if [[ $? -ne 0 ]]; then
            log_operation "ERROR" "Failed to create backup for existing directory: $destination"
            return 1
        fi
    fi
    
    # Perform the copy operation
    local copy_options="-R"
    if [[ "$preserve_permissions" == "true" ]]; then
        copy_options="-Rp"
    fi
    
    if cp $copy_options "$source" "$destination"; then
        log_operation "INFO" "Directory copied successfully: $(format_size $dir_size)"
        
        # Verify integrity if requested
        if [[ "$verify_integrity" == "true" ]]; then
            if ! verify_copy_integrity "$source" "$destination"; then
                log_operation "ERROR" "Directory copy integrity verification failed, removing destination"
                rm -rf "$destination"
                
                # Restore backup if it exists
                if [[ -n "$backup_dir" && -d "$backup_dir" ]]; then
                    cp -R "$backup_dir" "$destination"
                    log_operation "INFO" "Restored backup directory: $destination"
                fi
                return 1
            fi
        fi
        
        # Log successful operation
        log_operation "SUCCESS" "Directory copy completed: $source -> $destination"
        return 0
    else
        log_operation "ERROR" "Directory copy operation failed: $source -> $destination"
        
        # Restore backup if it exists
        if [[ -n "$backup_dir" && -d "$backup_dir" ]]; then
            cp -R "$backup_dir" "$destination"
            log_operation "INFO" "Restored backup directory: $destination"
        fi
        return 1
    fi
}

# Bulk copy operations with progress monitoring
bulk_copy_operation() {
    local operation_type="$1"  # "file" or "directory"
    local source_list="$2"     # File containing source paths
    local destination_base="$3" # Base destination directory
    
    if [[ ! -f "$source_list" ]]; then
        log_operation "ERROR" "Source list file not found: $source_list"
        return 1
    fi
    
    local total_items=$(wc -l < "$source_list")
    local current_item=0
    local success_count=0
    local failure_count=0
    
    log_operation "INFO" "Starting bulk copy operation - Total items: $total_items"
    
    while IFS= read -r source_path; do
        ((current_item++))
        
        # Skip empty lines and comments
        [[ -z "$source_path" || "$source_path" =~ ^#.* ]] && continue
        
        local filename=$(basename "$source_path")
        local destination="$destination_base/$filename"
        
        echo "Processing [$current_item/$total_items]: $filename"
        
        if [[ "$operation_type" == "file" ]]; then
            if enterprise_copy_file "$source_path" "$destination"; then
                ((success_count++))
            else
                ((failure_count++))
            fi
        elif [[ "$operation_type" == "directory" ]]; then
            if enterprise_copy_directory "$source_path" "$destination"; then
                ((success_count++))
            else
                ((failure_count++))
            fi
        fi
        
        # Progress update
        local progress=$((current_item * 100 / total_items))
        echo "Progress: $progress% ($success_count successful, $failure_count failed)"
        
    done < "$source_list"
    
    log_operation "SUCCESS" "Bulk copy completed - Success: $success_count, Failed: $failure_count"
    return $failure_count
}

# Generate copy operation report
generate_copy_report() {
    local report_file="/tmp/macfleet_copy_report_$(date +%Y%m%d_%H%M%S).txt"
    
    {
        echo "MacFleet Copy Operations Report"
        echo "Generated: $(date)"
        echo "Hostname: $(hostname)"
        echo "=============================="
        echo ""
        
        echo "Recent Copy Operations (Last 24 hours):"
        if [[ -f "$LOG_FILE" ]]; then
            local yesterday=$(date -v-1d '+%Y-%m-%d')
            grep "$yesterday\|$(date '+%Y-%m-%d')" "$LOG_FILE" | tail -50
        else
            echo "No log file found"
        fi
        
        echo ""
        echo "System Information:"
        echo "Available Space:"
        df -h | grep -E "^/dev/"
        
        echo ""
        echo "Copy Operation Statistics:"
        if [[ -f "$LOG_FILE" ]]; then
            echo "Total Operations: $(grep -c "copy operation" "$LOG_FILE" 2>/dev/null || echo "0")"
            echo "Successful: $(grep -c "SUCCESS.*copy completed" "$LOG_FILE" 2>/dev/null || echo "0")"
            echo "Failed: $(grep -c "ERROR.*copy.*failed" "$LOG_FILE" 2>/dev/null || echo "0")"
        fi
        
    } > "$report_file"
    
    echo "Copy operations report saved to: $report_file"
    log_operation "INFO" "Copy report generated: $report_file"
}

# Main copy management function
main() {
    local action="${1:-help}"
    
    case "$action" in
        "copy-file")
            local source="$2"
            local destination="$3"
            local preserve_perms="${4:-true}"
            local verify="${5:-true}"
            local backup="${6:-true}"
            
            if [[ -z "$source" || -z "$destination" ]]; then
                echo "Usage: $0 copy-file <source> <destination> [preserve_permissions] [verify_integrity] [create_backup]"
                exit 1
            fi
            
            enterprise_copy_file "$source" "$destination" "$preserve_perms" "$verify" "$backup"
            ;;
        "copy-directory")
            local source="$2"
            local destination="$3"
            local preserve_perms="${4:-true}"
            local verify="${5:-true}"
            local backup="${6:-true}"
            
            if [[ -z "$source" || -z "$destination" ]]; then
                echo "Usage: $0 copy-directory <source> <destination> [preserve_permissions] [verify_integrity] [create_backup]"
                exit 1
            fi
            
            enterprise_copy_directory "$source" "$destination" "$preserve_perms" "$verify" "$backup"
            ;;
        "bulk-files")
            local source_list="$2"
            local destination="$3"
            
            if [[ -z "$source_list" || -z "$destination" ]]; then
                echo "Usage: $0 bulk-files <source_list_file> <destination_directory>"
                exit 1
            fi
            
            bulk_copy_operation "file" "$source_list" "$destination"
            ;;
        "bulk-directories")
            local source_list="$2"
            local destination="$3"
            
            if [[ -z "$source_list" || -z "$destination" ]]; then
                echo "Usage: $0 bulk-directories <source_list_file> <destination_directory>"
                exit 1
            fi
            
            bulk_copy_operation "directory" "$source_list" "$destination"
            ;;
        "report")
            generate_copy_report
            ;;
        "help"|*)
            echo "$SCRIPT_NAME v$VERSION"
            echo "Enterprise File and Folder Copy Management"
            echo ""
            echo "Usage: $0 <action> [options]"
            echo ""
            echo "Actions:"
            echo "  copy-file <source> <destination>       - Copy single file"
            echo "  copy-directory <source> <destination>  - Copy directory recursively"
            echo "  bulk-files <list_file> <destination>   - Bulk copy files from list"
            echo "  bulk-directories <list_file> <dest>    - Bulk copy directories from list"
            echo "  report                                  - Generate operations report"
            echo "  help                                    - Show this help message"
            echo ""
            echo "Features:"
            echo "  • Safety validation and integrity verification"
            echo "  • Automatic backup creation before overwrite"
            echo "  • Business hours compliance checking"
            echo "  • Comprehensive audit logging"
            echo "  • Permission preservation"
            echo "  • Progress monitoring for bulk operations"
            ;;
    esac
}

# Execute main function with all arguments
main "$@"

Quick Reference Commands

Single File Operations

# Copy file with full enterprise features
./copy_manager.sh copy-file "/Users/admin/document.pdf" "/Users/shared/documents/document.pdf"

# Copy file without backup creation
./copy_manager.sh copy-file "/source/file.txt" "/destination/file.txt" true true false

# Copy file without integrity verification (faster)
./copy_manager.sh copy-file "/source/file.txt" "/destination/file.txt" true false true

Directory Operations

# Copy directory with all subdirectories
./copy_manager.sh copy-directory "/Users/admin/project" "/Users/shared/projects/"

# Copy directory preserving permissions
./copy_manager.sh copy-directory "/source/folder" "/destination/" true true true

Bulk Operations

# Create file list for bulk operations
echo "/Users/admin/doc1.pdf" > /tmp/files_to_copy.txt
echo "/Users/admin/doc2.pdf" >> /tmp/files_to_copy.txt
echo "/Users/admin/doc3.pdf" >> /tmp/files_to_copy.txt

# Execute bulk file copy
./copy_manager.sh bulk-files "/tmp/files_to_copy.txt" "/Users/shared/documents/"

# Execute bulk directory copy
./copy_manager.sh bulk-directories "/tmp/dirs_to_copy.txt" "/Users/shared/projects/"
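Source lists do not have to be written by hand; a find query can generate them. A small helper along these lines (the function name and the .pdf filter are illustrative):

```shell
# Build a bulk-copy list of all PDFs under a directory, one path per line,
# and report how many files were queued.
build_copy_list() {
    search_dir="$1"; list_file="$2"
    find "$search_dir" -type f -name '*.pdf' > "$list_file"
    wc -l < "$list_file" | tr -d ' '
}
```

The resulting file can be passed directly to the bulk-files action shown above.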

Integration Examples

JAMF Pro Integration

#!/bin/bash

# JAMF Pro script for enterprise file distribution
# Parameters: $4 = source_path, $5 = destination_path, $6 = operation_type

SOURCE_PATH="$4"
DESTINATION_PATH="$5"
OPERATION_TYPE="$6"

# Download copy manager if not present
if [[ ! -f "/usr/local/bin/macfleet_copy_manager.sh" ]]; then
    curl -o "/usr/local/bin/macfleet_copy_manager.sh" "https://scripts.macfleet.com/copy_manager.sh"
    chmod +x "/usr/local/bin/macfleet_copy_manager.sh"
fi

# Execute copy operation
case "$OPERATION_TYPE" in
    "file")
        /usr/local/bin/macfleet_copy_manager.sh copy-file "$SOURCE_PATH" "$DESTINATION_PATH"
        ;;
    "directory")
        /usr/local/bin/macfleet_copy_manager.sh copy-directory "$SOURCE_PATH" "$DESTINATION_PATH"
        ;;
    *)
        echo "Invalid operation type: $OPERATION_TYPE"
        exit 1
        ;;
esac

# Generate report
/usr/local/bin/macfleet_copy_manager.sh report

exit $?

Configuration Profile for Copy Policies

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadContent</key>
    <array>
        <dict>
            <key>PayloadType</key>
            <string>com.macfleet.copy.policy</string>
            <key>PayloadIdentifier</key>
            <string>com.macfleet.copy.policy.main</string>
            <key>PayloadDisplayName</key>
            <string>MacFleet Copy Policy</string>
            <key>MaxFileSize</key>
            <string>10G</string>
            <key>AllowedExtensions</key>
            <array>
                <string>.pdf</string>
                <string>.docx</string>
                <string>.xlsx</string>
                <string>.pptx</string>
            </array>
            <key>BusinessHoursOnly</key>
            <true/>
            <key>RequireIntegrityCheck</key>
            <true/>
            <key>CreateBackups</key>
            <true/>
        </dict>
    </array>
</dict>
</plist>

Security and Compliance Features

File System Monitoring

# Monitor copy destinations with fswatch (FSEvents-based; install via Homebrew).
# -x appends event flags such as "Created" after each path.
fswatch -x -r /Users/shared/ | while read -r path flags; do
    if [[ "$flags" == *Created* ]]; then
        log_operation "MONITOR" "File created: $path"
    fi
done

Compliance Reporting

# Generate compliance report for auditors
generate_compliance_report() {
    local report_file="/var/reports/copy_compliance_$(date +%Y%m%d).csv"
    
    echo "Timestamp,User,Source,Destination,Size,Status,Checksum" > "$report_file"
    
    # Parse log entries (format: "[timestamp] [LEVEL] message"); the user, size,
    # and checksum columns would need richer logging to populate
    grep "copy completed\|copy.*failed" "$LOG_FILE" | while read -r line; do
        echo "$line" | sed -E 's/^\[([^]]+)\] \[([^]]+)\] /\1,\2,/' >> "$report_file"
    done
    
    echo "Compliance report generated: $report_file"
}

Troubleshooting

Common Issues and Solutions

Issue                  | Cause                   | Solution
-----------------------|-------------------------|------------------------------------------
Permission denied      | Insufficient privileges | Run with sudo or check file permissions
Disk space error       | Destination full        | Check available space with df -h
Integrity check failed | Corrupt copy            | Re-run copy operation, check disk health
Source not found       | Incorrect path          | Verify source path exists
Operation timeout      | Large file/slow disk    | Increase timeout or split operation

Log Analysis

# View recent copy operations
tail -f /var/log/macfleet_copy_operations.log

# Search for failed operations
grep "ERROR" /var/log/macfleet_copy_operations.log

# Count operations by status
grep -c "SUCCESS" /var/log/macfleet_copy_operations.log
grep -c "ERROR" /var/log/macfleet_copy_operations.log
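A quick success-rate summary can be derived from the same log. A small sketch (assumes the "[timestamp] [LEVEL] message" format produced by the log_operation function above; the function name is illustrative):

```shell
# Summarize copy outcomes from the operations log and print a success rate.
copy_success_rate() {
    log="$1"
    ok=$(grep -c '\[SUCCESS\]' "$log" 2>/dev/null) || ok=0
    bad=$(grep -c '\[ERROR\]' "$log" 2>/dev/null) || bad=0
    total=$((ok + bad))
    if [ "$total" -gt 0 ]; then
        echo "Success rate: $((ok * 100 / total))% ($ok of $total)"
    else
        echo "No completed operations logged."
    fi
}
```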

Best Practices

  1. Always test on single device before fleet deployment
  2. Monitor disk space before large copy operations
  3. Use business hours restrictions for resource-intensive operations
  4. Enable integrity verification for critical data
  5. Maintain regular backups of important destinations
  6. Review logs regularly for failed operations
  7. Set appropriate file size limits based on network capacity
  8. Use bulk operations for efficiency at scale
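Point 2 can be automated with a preflight check before any large transfer (a sketch; it compares kilobyte sizes and ignores filesystem overhead, and the function name is illustrative):

```shell
# Preflight disk-space check: does the destination volume have room for the source?
has_space_for() {
    src="$1"; dst_dir="$2"
    need_kb=$(du -sk "$src" | awk '{print $1}')          # size of source tree in KB
    free_kb=$(df -Pk "$dst_dir" | awk 'NR==2 {print $4}') # free KB on destination volume
    [ "$free_kb" -gt "$need_kb" ]
}
```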

This enterprise copy management system provides comprehensive safety, monitoring, and compliance features while maintaining the simplicity of basic copy operations for day-to-day fleet management tasks.


Configuring a GitHub Actions Runner on a Mac Mini (Apple Silicon)

GitHub Actions Runner

GitHub Actions is a powerful CI/CD platform that lets you automate your software development workflows. While GitHub offers hosted runners, self-hosted runners give you greater control over and customization of your CI/CD setup. This tutorial walks you through setting up, configuring, and connecting a self-hosted runner on a Mac mini to run macOS pipelines.

Prerequisites

Before you begin, make sure you have:

  • A Mac mini (sign up with Macfleet)
  • A GitHub repository with administrator rights
  • A package manager installed (preferably Homebrew)
  • Git installed on your system

Step 1: Create a Dedicated User Account

First, create a dedicated user account for the GitHub Actions runner:

# Create the 'gh-runner' user account
sudo dscl . -create /Users/gh-runner
sudo dscl . -create /Users/gh-runner UserShell /bin/bash
sudo dscl . -create /Users/gh-runner RealName "GitHub runner"
sudo dscl . -create /Users/gh-runner UniqueID "1001"
sudo dscl . -create /Users/gh-runner PrimaryGroupID 20
sudo dscl . -create /Users/gh-runner NFSHomeDirectory /Users/gh-runner

# Set the user's password
sudo dscl . -passwd /Users/gh-runner your_password

# Add 'gh-runner' to the 'admin' group
sudo dscl . -append /Groups/admin GroupMembership gh-runner

Switch to the new user account:

su gh-runner

Step 2: Install Required Software

Install Git and Rosetta 2 (if you are using Apple Silicon):

# Install Git if not already installed
brew install git

# Install Rosetta 2 on Apple Silicon Macs
softwareupdate --install-rosetta

Step 3: Configure the GitHub Actions Runner

  1. Go to your GitHub repository
  2. Navigate to Settings > Actions > Runners
  3. Click "New self-hosted runner" (https://github.com/<username>/<repository>/settings/actions/runners/new)
  4. Select macOS as the runner image and ARM64 as the architecture
  5. Follow the commands provided there to download and configure the runner
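The commands GitHub displays on that page follow this general shape (a sketch only; <VERSION> and <TOKEN> are placeholders, so copy the exact commands from the "New self-hosted runner" page rather than this block):

```shell
# Download and unpack the runner (ARM64 build; <VERSION> and <TOKEN> are placeholders)
mkdir actions-runner && cd actions-runner
curl -o actions-runner-osx-arm64-<VERSION>.tar.gz -L \
  "https://github.com/actions/runner/releases/download/v<VERSION>/actions-runner-osx-arm64-<VERSION>.tar.gz"
tar xzf actions-runner-osx-arm64-<VERSION>.tar.gz

# Register the runner against your repository using the token from the settings page
./config.sh --url https://github.com/<username>/<repository> --token <TOKEN>
```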

Create a .env file in the runner's _work directory:

# _work/.env file
ImageOS=macos15
XCODE_15_DEVELOPER_DIR=/Applications/Xcode.app/Contents/Developer

  6. Run the run.sh script in your runner directory to complete the setup.
  7. Verify in the terminal that the runner is active and waiting for jobs, and check the repository's runner settings on GitHub for the runner assignment and its idle status.


Step 4: Configure sudoers (Optional)

If your actions require root privileges, configure the sudoers file:

sudo visudo

Add the following line:

gh-runner ALL=(ALL) NOPASSWD: ALL
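Granting passwordless sudo for everything is convenient but broad. If you know which privileged commands your workflows actually run, a narrower rule limits the blast radius (illustrative only; the command list below is an assumption about your workloads, not a recommendation from GitHub or Macfleet):

```
# Allow only specific privileged commands instead of NOPASSWD: ALL
gh-runner ALL=(ALL) NOPASSWD: /usr/sbin/installer, /usr/sbin/softwareupdate
```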

Step 5: Use the Runner in Workflows

Configure your GitHub Actions workflow to use the self-hosted runner:

name: Example workflow

on:
  workflow_dispatch:

jobs:
  build:
    runs-on: [self-hosted, macOS, ARM64]
    steps:
      - name: Install NodeJS
        run: brew install node

The runner is authenticated against your repository and labeled with self-hosted, macOS, and ARM64. Use it in your workflows by specifying these labels in the runs-on field:

runs-on: [self-hosted, macOS, ARM64]

Best Practices

  • Keep your runner software up to date
  • Monitor runner logs regularly for problems
  • Use specific labels for different runner types
  • Implement appropriate security measures
  • Consider using multiple runners for load distribution
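One way to act on these points: the runner package ships a svc.sh helper that installs the runner as a launchd service, so it restarts after reboots instead of depending on an interactive run.sh session. Run these from the runner directory (sketch; requires the runner to already be configured):

```shell
# Install, start, and check the runner as a launchd service
./svc.sh install
./svc.sh start
./svc.sh status
```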

Troubleshooting

Common issues and solutions:

  1. Runner does not connect:

    • Check the network connection
    • Verify that the GitHub token is valid
    • Ensure appropriate permissions
  2. Build failures:

    • Check the Xcode installation
    • Verify required dependencies
    • Review the workflow logs
  3. Permission problems:

    • Check user permissions
    • Review the sudoers configuration
    • Check file system permissions

Conclusion

You have now configured a self-hosted GitHub Actions runner on your Mac mini. This setup gives you more control over your CI/CD environment and lets you run macOS-specific workflows efficiently.

Remember to maintain your runner regularly and keep it current with the latest security patches and software versions.

Native App

Macfleet native app

Macfleet Installation Guide

Macfleet is a powerful fleet management solution built specifically for cloud-hosted Mac mini environments. As a Mac mini cloud hosting provider, you can use Macfleet to monitor, manage, and optimize your entire fleet of virtualized Mac instances.

This installation guide walks you through setting up Macfleet monitoring on macOS, Windows, and Linux systems to ensure comprehensive visibility into your cloud infrastructure.

🍎 macOS

  • Download the .dmg file for Mac here
  • Double-click the downloaded .dmg file
  • Drag the Macfleet app into the Applications folder
  • Eject the .dmg file
  • Open System Settings > Security & Privacy
    • Privacy tab > Accessibility
    • Enable Macfleet to allow monitoring
  • Launch Macfleet from Applications
  • Tracking starts automatically

🪟 Windows

  • Download the .exe file for Windows here
  • Right-click the .exe file > "Run as administrator"
  • Follow the installation wizard
  • Accept the terms and conditions
  • Allow the app in Windows Defender when prompted
  • Grant application monitoring permissions
  • Launch Macfleet from the Start menu
  • The application starts tracking automatically

🐧 Linux

  • Download the .deb package (Ubuntu/Debian) or .rpm (CentOS/RHEL) here
  • Install it with your package manager
    • Ubuntu/Debian: sudo dpkg -i Macfleet-linux.deb
    • CentOS/RHEL: sudo rpm -ivh Macfleet-linux.rpm
  • Allow X11 access permissions when prompted
  • Add the user to the appropriate groups if necessary
  • Launch Macfleet from the application menu
  • The application starts tracking automatically

Note: After installation on any system, sign in with your Macfleet credentials to sync data with your dashboard.