I'm running a program to test how quickly it can find and iterate over all the files in a folder containing a very large number of files. The slowest part of the process is creating the one million files in the first place. I'm using a fairly naive approach to create them:
Console.Write("Creating {0:N0} file(s) of size {1:N0} bytes... ",
    options.FileCount, options.FileSize);
var createTimer = Stopwatch.StartNew();
var fileNames = new List<string>();
for (long i = 0; i < options.FileCount; i++)
{
    var filename = Path.Combine(options.Directory.FullName,
        CreateFilename(i, options.FileCount));
    using (var file = new FileStream(filename, FileMode.CreateNew,
        FileAccess.Write, FileShare.None, 4096,
        FileOptions.WriteThrough))
    {
        // There is an option to write some data to the files, but it
        // isn't being used here. That's why the using block is empty.
    }
    fileNames.Add(filename);
}
createTimer.Stop();
Console.WriteLine("Done.");
// Other code appears here.....
Console.WriteLine("Time to CreateFiles: {0:N3}sec ({1:N2} files/sec, 1 in {2:N4}ms)",
    createTimer.Elapsed.TotalSeconds,
    (double)total / createTimer.Elapsed.TotalSeconds,
    createTimer.Elapsed.TotalMilliseconds / (double)options.FileCount);
Output:
Creating 1,000,000 file(s) of size 0 bytes... Done.
Time to CreateFiles: 9,182.283sec (1,089.05 files/sec, 1 in 9.1823ms)
Is there anything obviously better than this? I'm hoping to test several orders of magnitude beyond one million, and creating the files takes a day!
I haven't tried any kind of parallelism, tried tuning any filesystem options, or changed the order in which the files are created.
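On the "filesystem options" front, one NTFS setting that is often mentioned for directories with millions of files is 8.3 short-name generation: maintaining the short-name index adds per-file-creation cost. This is a speculative suggestion, not something tested in the question; the commands below are for an elevated Windows command prompt, and disabling short names affects other software that relies on them.

```shell
:: Query whether 8.3 short-name generation is disabled
:: (0 = enabled, 1 = disabled system-wide).
fsutil behavior query disable8dot3

:: Disable 8.3 short-name generation system-wide (requires admin).
:: Affects newly created files only; existing short names remain.
fsutil behavior set disable8dot3 1
```

A reboot (or remount) may be needed before the change takes effect on a given volume.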
For completeness, here is CreateFilename():
public static string CreateFilename(long i, long totalFiles)
{
    if (totalFiles < 0)
        throw new ArgumentOutOfRangeException("totalFiles",
            totalFiles, "totalFiles must be positive");

    // This tries to keep filenames to the 8.3 format as much as possible.
    if (totalFiles < 99999999)
        // No extension.
        return String.Format("{0:00000000}", i);
    else if (totalFiles >= 100000000 && totalFiles < 9999999999)
    {
        // Extend numbers into the extension.
        long rem = 0;
        long div = Math.DivRem(i, 1000, out rem);
        return String.Format("{0:00000000}", div) + "." +
            String.Format("{0:000}", rem);
    }
    else
        // Doesn't fit in 8.3, so just ToString the long.
        return i.ToString();
}
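For readers following along, the three naming regimes of CreateFilename() can be exercised like this (the wrapper program and sample inputs are mine, not from the question; the helper body is reproduced verbatim so the example is self-contained):

```csharp
using System;

class FilenameDemo
{
    // Same logic as the question's CreateFilename(), copied here
    // so this sketch compiles on its own.
    public static string CreateFilename(long i, long totalFiles)
    {
        if (totalFiles < 0)
            throw new ArgumentOutOfRangeException("totalFiles",
                totalFiles, "totalFiles must be positive");
        if (totalFiles < 99999999)
            return String.Format("{0:00000000}", i);      // 8 digits, no extension
        else if (totalFiles >= 100000000 && totalFiles < 9999999999)
        {
            long rem = 0;
            long div = Math.DivRem(i, 1000, out rem);
            return String.Format("{0:00000000}", div) + "." +
                String.Format("{0:000}", rem);            // low digits spill into the extension
        }
        else
            return i.ToString();                          // too big for 8.3
    }

    static void Main()
    {
        Console.WriteLine(CreateFilename(123456, 1000000));       // 00123456
        Console.WriteLine(CreateFilename(123456789, 500000000));  // 00123456.789
        Console.WriteLine(CreateFilename(5, 20000000000));        // 5
    }
}
```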
Update
Tried parallelizing with Parallel.For(), per StriplingWarrior's suggestion. The result: about 30 threads thrashing my disk, and a net slowdown!
var fileNames = new ConcurrentBag<string>();
var opts = new ParallelOptions();
opts.MaxDegreeOfParallelism = 1;  // 1 thread turns out to be fastest.
Parallel.For(0L, options.FileCount, opts,
    () => new { Files = new List<string>() },
    (i, parState, state) =>
    {
        var filename = Path.Combine(options.Directory.FullName,
            CreateFilename(i, options.FileCount));
        using (var file = new FileStream(filename, FileMode.CreateNew,
            FileAccess.Write, FileShare.None,
            4096, FileOptions.WriteThrough))
        {
        }
        fileNames.Add(filename);
        return state;
    },
    state =>
    {
        foreach (var f in state.Files)
        {
            fileNames.Add(f);
        }
    });
createTimer.Stop();
Console.WriteLine("Done.");
I found that changing the FileOptions passed to the FileStream improved performance by about 50%. It seems FileOptions.WriteThrough had been turning off any write caching. The result:
new FileStream(filename, FileMode.CreateNew,
FileAccess.Write, FileShare.None,
4096, FileOptions.None)
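The roughly 50% difference can be sanity-checked on other hardware with a small A/B timing sketch along these lines (the directory names and the 10,000-file sample size are arbitrary choices of mine, not from the question; scale the count up for a more realistic test):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class FileOptionsBench
{
    // Creates `count` empty files under `dir` with the given FileOptions
    // and returns the elapsed wall-clock time.
    static TimeSpan TimeCreates(string dir, int count, FileOptions opts)
    {
        Directory.CreateDirectory(dir);
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < count; i++)
        {
            using (new FileStream(Path.Combine(dir, i.ToString("00000000")),
                FileMode.CreateNew, FileAccess.Write, FileShare.None,
                4096, opts))
            {
            }
        }
        sw.Stop();
        return sw.Elapsed;
    }

    static void Main()
    {
        const int count = 10000;  // small sample; raise for a longer run
        string baseDir = Path.Combine(Path.GetTempPath(), "filecreate-bench");
        var wt = TimeCreates(Path.Combine(baseDir, "writethrough"), count,
            FileOptions.WriteThrough);
        var none = TimeCreates(Path.Combine(baseDir, "none"), count,
            FileOptions.None);
        Console.WriteLine("WriteThrough: {0:N3}s  None: {1:N3}s",
            wt.TotalSeconds, none.TotalSeconds);
    }
}
```

Note that timings depend heavily on the disk, the filesystem, and OS caching state, so run each variant more than once.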
Other ideas are still welcome.
Answer 0: (score: 9)
The biggest bottleneck here is undoubtedly your hard drive. In some quick tests, I was able to see a significant performance improvement (but not orders of magnitude) by taking advantage of parallelism:
// File.Create returns an open FileStream; dispose it so handles aren't leaked.
Parallel.For(1, 10000,
    i => File.Create(Path.Combine(path, i.ToString())).Dispose());
Interestingly, at least on my machine, an SSD doesn't seem to make much of a difference for this operation.
Answer 1: (score: 3)
The fastest method I found was a simple loop around File.Create():
IEnumerable<string> filenames = GetFilenames();
foreach (var filename in filenames)
{
    // Dispose the FileStream that File.Create returns.
    File.Create(filename).Dispose();
}
Which is equivalent to (and is what I actually used in my code):
IEnumerable<string> filenames = GetFilenames();
foreach (var filename in filenames)
{
    new FileStream(filename, FileMode.CreateNew,
        FileAccess.Write, FileShare.None,
        4096, FileOptions.None).Dispose();
}
If you actually want to write something to the files:
IEnumerable<string> filenames = GetFilenames();
foreach (var filename in filenames)
{
    using (var fs = new FileStream(filename, FileMode.CreateNew,
        FileAccess.Write, FileShare.None,
        4096, FileOptions.None))
    {
        // Write something to your file.
    }
}
Things that did not seem to help:
Parallelism, whether in the form of Parallel.ForEach() or Parallel.For(). It produced a net slowdown that got worse as the number of threads increased.