How to archive IIS logs automatically?
Jul 30, 2025, 12:35 AM

To archive IIS logs automatically, set a log rollover schedule, compress old logs with a PowerShell script, and run that script on a schedule with Task Scheduler. 1. In IIS Manager, set the log file rollover schedule; rolling over daily, or by size (for example 10 MB to 100 MB), keeps individual files small and easy to process later. 2. Write a PowerShell script that finds log files under a given path older than a set number of days (for example 7), compresses them into an archive directory, and deletes the originals. 3. Create a basic task in Task Scheduler, set its trigger frequency (for example daily), run it with highest privileges, and pass the -ExecutionPolicy Bypass parameter so the script runs reliably. Beyond that, define a clear log retention policy, back up the archived data regularly, and monitor script runs to keep the automated process reliable.
If you run an IIS-based website, its log files grow continuously over time. They not only consume disk space but can also affect server performance. Archiving them by hand is tedious and easy to forget, so a common question is: how can IIS logs be archived automatically?

The answer is not complicated. With Windows' built-in tools and a short script, the whole process is easy to set up.
Set the log rollover schedule
IIS can roll over log files by time or by size, which is the first step toward automated archiving.

- In IIS Manager, select your site and open the "Logging" feature
- Under "Log File Rollover", set the schedule; Daily is usually recommended
- If file size matters more, choose "Maximum file size (in bytes)" instead and set a cap, such as 10 MB or 100 MB
With this in place, logs are split into many small files that are easy to process and archive later, instead of piling up into one huge file.
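The same rollover settings can also be applied from the command line instead of the GUI. Below is a minimal sketch using appcmd; it sets the server-wide default via siteDefaults.logFile, so adjust it per site if you only want to change one:

```powershell
# Roll over log files daily for all sites (server-wide default).
& "$env:windir\system32\inetsrv\appcmd.exe" set config `
    -section:system.applicationHost/sites `
    "/siteDefaults.logFile.period:Daily" /commit:apphost

# Or roll over by size: set the period to MaxSize plus a cap in bytes (100 MB here).
& "$env:windir\system32\inetsrv\appcmd.exe" set config `
    -section:system.applicationHost/sites `
    "/siteDefaults.logFile.period:MaxSize" `
    "/siteDefaults.logFile.truncateSize:104857600" /commit:apphost
```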
Compress old logs with a batch file or PowerShell script
As log files accumulate, the space they occupy keeps growing. A common practice is to compress old logs regularly with a script.

You can write a simple PowerShell script, for example:
$LogPath = "C:\inetpub\logs\LogFiles" $ArchivePath = "D:\iis_logs_archive" $DaysToKeep = 7 Get-ChildItem -Path $LogPath -Recurse -File -Include *.log | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$DaysToKeep) } | ForEach-Object { Compress-Archive -Path $_.FullName -DestinationPath "$ArchivePath\$($_.Name).zip" -Force Remove-Item $_.FullName -Force }
This script does three things:
- Finds log files under the specified path that are older than 7 days
- Compresses each of them into another directory
- Deletes the original file once compression succeeds
Save the script as a .ps1 file and run it on a schedule through Task Scheduler, for example once every morning.
Run the script automatically with Task Scheduler
To automate the whole process end to end, use Windows Task Scheduler.
The steps are:
- Open Task Scheduler
- Create a basic task and set its trigger frequency (for example daily or weekly)
- Choose "Start a program" as the action type
- Browse to your PowerShell script and enable "Run with highest privileges"
A few things to watch:
- PowerShell's default execution policy may block the script from running; add -ExecutionPolicy Bypass to the task's arguments
- Use absolute paths throughout, so the script does not fail to find files because of a relative path
With this in place, the system compresses and cleans up the logs for you at the scheduled time.
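The same task can also be created from the command line instead of the GUI. Here is a minimal sketch using schtasks; the script path C:\Scripts\Archive-IisLogs.ps1 and the 3:00 AM start time are illustrative assumptions, so substitute your own:

```powershell
# Create a daily task that runs the archive script as SYSTEM with highest privileges.
# The script path and start time below are placeholders; adjust them to your setup.
schtasks /Create /TN "Archive IIS Logs" `
    /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Archive-IisLogs.ps1" `
    /SC DAILY /ST 03:00 /RU SYSTEM /RL HIGHEST
```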
Notes and optimization suggestions
- Make the retention policy explicit: not every log needs to be kept forever; set the number of retention days to match business needs.
- Back up archived data: copy the compressed logs to another location regularly, so a disk failure does not wipe them out.
- Monitor script runs: write each run's result to a text log so problems are easy to trace (see the sketch after this list).
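As one way to do that, here is a minimal logging wrapper, assuming the archive script from earlier is saved at the hypothetical path C:\Scripts\Archive-IisLogs.ps1 (both paths below are illustrative):

```powershell
$ErrorActionPreference = 'Stop'  # make errors terminating so the catch block fires
$RunLog = "D:\iis_logs_archive\archive-runs.log"  # illustrative path

try {
    # Run the archive script and record a success line with a timestamp.
    & "C:\Scripts\Archive-IisLogs.ps1"
    Add-Content -Path $RunLog -Value "$(Get-Date -Format s) OK"
} catch {
    # Record the failure and its error message for later troubleshooting.
    Add-Content -Path $RunLog -Value "$(Get-Date -Format s) FAILED: $_"
}
```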
That covers the main methods. The process itself is not complicated; the key points are a stable script and accurate task scheduling. Configured properly, IIS log management becomes much easier.