1: Background
I had used Kafka on Linux before, and Kafka itself never gave me any trouble. However, a project I am currently working on has to run on both Windows and Linux, and when I ran Kafka on Windows I found that it is nowhere near as well behaved there as it is on Linux.
2: Problem
When testing Kafka on Windows, I found that after running for a while the broker failed with the following error:
Another program is using the file and the process cannot access it.
[2021-07-06 09:06:10,800] ERROR Failed to clean up log for __consumer_offsets-42 in dir C:\tmp\kafka-logs due to IOException (kafka.server.LogDirFailureChannel)
java.nio.file.FileSystemException: C:\tmp\kafka-logs\__consumer_offsets-42\00000000000000000000.timeindex.cleaned -> C:\tmp\kafka-logs\__consumer_offsets-42\00000000000000000000.timeindex.swap: Another program is using the file and the process cannot access it.
    at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
    at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
    at sun.nio.fs.WindowsFileCopy.move(WindowsFileCopy.java:387)
    at sun.nio.fs.WindowsFileSystemProvider.move(WindowsFileSystemProvider.java:287)
    at java.nio.file.Files.move(Files.java:1395)
    at org.apache.kafka.common.utils.Utils.atomicMoveWithFallback(Utils.java:904)
    at kafka.log.AbstractIndex.renameTo(AbstractIndex.scala:210)
    at kafka.log.LazyIndex$IndexValue.renameTo(LazyIndex.scala:155)
    at kafka.log.LazyIndex.$anonfun$renameTo$1(LazyIndex.scala:79)
    at kafka.log.LazyIndex.renameTo(LazyIndex.scala:79)
    at kafka.log.LogSegment.changeFileSuffixes(LogSegment.scala:496)
    at kafka.log.Log.$anonfun$replaceSegments$4(Log.scala:2402)
    at kafka.log.Log.$anonfun$replaceSegments$4$adapted(Log.scala:2402)
    at scala.collection.immutable.List.foreach(List.scala:333)
    at kafka.log.Log.replaceSegments(Log.scala:2402)
    at kafka.log.Cleaner.cleanSegments(LogCleaner.scala:613)
    at kafka.log.Cleaner.$anonfun$doClean$6(LogCleaner.scala:538)
    at kafka.log.Cleaner.$anonfun$doClean$6$adapted(LogCleaner.scala:537)
    at scala.collection.immutable.List.foreach(List.scala:333)
    at kafka.log.Cleaner.doClean(LogCleaner.scala:537)
    at kafka.log.Cleaner.clean(LogCleaner.scala:511)
    at kafka.log.LogCleaner$CleanerThread.cleanLog(LogCleaner.scala:380)
    at kafka.log.LogCleaner$CleanerThread.cleanFilthiestLog(LogCleaner.scala:352)
    at kafka.log.LogCleaner$CleanerThread.tryCleanFilthiestLog(LogCleaner.scala:332)
    at kafka.log.LogCleaner$CleanerThread.doWork(LogCleaner.scala:321)
    at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:96)
    Suppressed: java.nio.file.FileSystemException: C:\tmp\kafka-logs\__consumer_offsets-42\00000000000000000000.timeindex.cleaned -> C:\tmp\kafka-logs\__consumer_offsets-42\00000000000000000000.timeindex.swap: Another program is using the file and the process cannot access it.
        at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
        at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
        at sun.nio.fs.WindowsFileCopy.move(WindowsFileCopy.java:301)
        at sun.nio.fs.WindowsFileSystemProvider.move(WindowsFileSystemProvider.java:287)
        at java.nio.file.Files.move(Files.java:1395)
        at org.apache.kafka.common.utils.Utils.atomicMoveWithFallback(Utils.java:901)
        ... 20 more
3: Cause
When Kafka's log cleaning policy is triggered, the cleaner renames log segment and index files that it still holds open. Windows does not allow a file to be renamed while it is open (Linux does), so the rename fails with an IOException and brings Kafka down.
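To make the cause concrete, here is a minimal, self-contained Java sketch of my own (it is not Kafka code, and the file names are arbitrary) showing the rename-while-memory-mapped behaviour that the stack trace above runs into:

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

// Standalone reproduction sketch: try to rename a file that the same process
// still has memory-mapped, the way Kafka maps its .index / .timeindex files.
public class RenameWhileMapped {
    public static void main(String[] args) throws IOException {
        Path cleaned = Paths.get("00000000000000000000.timeindex.cleaned");
        Path swap = Paths.get("00000000000000000000.timeindex.swap");
        Files.write(cleaned, new byte[4096]);

        try (FileChannel channel = FileChannel.open(cleaned,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            // Keep an active mapping while attempting the rename, roughly what
            // the log cleaner does when it swaps cleaned segments in.
            MappedByteBuffer mapped = channel.map(FileChannel.MapMode.READ_WRITE, 0, 4096);
            mapped.put(0, (byte) 1);

            Files.move(cleaned, swap); // FileSystemException here on Windows
            System.out.println("rename succeeded (expected on Linux)");
        } catch (IOException e) {
            System.out.println("rename failed (expected on Windows): " + e);
        }
        // Cleanup is omitted on purpose: while the mapping is alive, Windows may
        // refuse to delete the files too, which is exactly the point of the demo.
    }
}

On Linux the move call returns normally; on Windows it typically fails with the same kind of FileSystemException seen in the broker log, because the active mapping keeps the file locked.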
The most common fix found online is to "delete Kafka's log files and restart Kafka", which is obviously not workable in a production environment. So at first I came up with the following two workarounds:
Solution 1: Change the log cleanup policy so that logs are never deleted (retention set to -1), i.e. keep Kafka's data logs forever (a sketch of the relevant configuration follows this list).
Disadvantage: (1) disk usage will keep growing.
Solution 2: Run a virtual machine on Windows (Docker would also work, but is more hassle) and deploy Kafka inside it.
Disadvantages: (1) operations staff need some Linux administration knowledge; (2) extra memory consumption.
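For reference, Solution 1 amounts to a couple of broker-level retention settings in config\server.properties. This is only a sketch of what "keep everything forever" looks like; note that the stack trace above comes from compaction of __consumer_offsets rather than time-based deletion, so retention settings alone may not stop every occurrence of the error:

# never expire data by time or by size (topic-level settings can still override this)
log.retention.ms=-1
log.retention.bytes=-1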
Obviously, the drawbacks of both options were unacceptable to me, so I turned to the Kafka community for help.
The community discussion of this issue is linked in the original post.
From that discussion it is clear that the Kafka-on-Windows problem is not as easy to fix as one might imagine, and there is no official solution so far. In other words: don't run Kafka on Windows!
Solution 3: Even so, the need to run Kafka on Windows still exists, so plenty of community members keep following the issue and proposing fixes. One of them patched the Kafka source code for this problem, and in my testing his patch really did solve the issue and made Kafka usable on Windows. However, a Kafka contributor commented that the patch may not be safe, which is why it has not been merged into the official release.
4: Solution (a download link for the compiled Kafka package is at the end)
Although the Kafka maintainers consider the patch potentially unsafe, they have offered nothing official so far, and the need is real. So after some deliberation I decided to go with Solution 3 and settle the problem once and for all: apply the patch and recompile Kafka.
4.1: Download the patched Kafka
Download the author's patched version of Kafka (link in the original post).
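How the patch is distributed may vary; assuming it comes as a plain .patch file against the Kafka 2.3.0 source tree (the file name below is a placeholder I made up), applying it from PowerShell or Git Bash would look roughly like this:

# fetch the 2.3.0 source and apply the patch; "windows-fix.patch" is a placeholder name
git clone https://github.com/apache/kafka.git
cd kafka
git checkout 2.3.0
git apply --check windows-fix.patch
git apply windows-fix.patch

git apply --check is a dry run: it prints nothing if the patch applies cleanly.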
4.2: Compile the patched Kafka
Kafka is built with Gradle, so Gradle has to be installed and configured first.
For how to install and configure Gradle on Windows, see the article linked in the original post.
For how to compile Kafka with Gradle, see the build instructions in Kafka's README on GitHub (link in the original post).
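For the impatient, the README of that Kafka generation boils the build down to roughly the following commands, run from the source root in PowerShell (a sketch, assuming Gradle and a JDK are already on the PATH):

# bootstrap the Gradle wrapper using the locally installed Gradle
gradle
# build a binary release archive; it ends up under core\build\distributions
.\gradlew releaseTarGz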
After compiling, you can get a Kafka package that can be used under Windows.
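To sanity-check the build on Windows, unpack the archive and start it with the standard Windows scripts that ship with Kafka (assuming the default config files, each command in its own PowerShell window):

# start ZooKeeper first (Kafka 2.3 still requires it)
.\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties
# then start the broker
.\bin\windows\kafka-server-start.bat .\config\server.properties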
In the spirit of saving you the work (compiling the source is fairly time-consuming), here is a download link for the Kafka package I compiled, which you can download and use directly.
Download link for the crash-free kafka_2.12-2.3.0_window build: (link in the original post).
Original article link: (visible after logging in on the source site).