Problem description
I'm not sure where to start. With the code below I can download all files from an HTTP server. It checks whether a file has already been downloaded and, if so, does not fetch it from the site again. I only want to download some of the files, so I'm looking for a simple way to do one of the following:
- Get the last-modified date or creation time of a file on the HTTP server. I know how to do this for a local folder, but I don't want to download the file first and then check; I need to do it on the server side. On the local PC it would be
FileInfo infoSource = new FileInfo(sourceDir);
and then infoSource.CreationTime, where sourceDir is the file path. Is there something similar over HTTP?
- Get only the latest 10 files from the server site — not just the newest one, but the newest 10.
- Monitor the server site, so that as soon as a file named MyFileName_Version is placed there, the latest file following that naming convention is fetched.
Any of these approaches would work for me, but I'm still new to all of them, so please bear with me. At the moment I have the following code:
using System;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Security.Cryptography.X509Certificates;
using System.Text.RegularExpressions;
using Topshelf;

namespace AutomaticUpgrades
{
    class Program
    {
        static void Main(string[] args)
        {
            // URL of the site's listing page (placeholder), e.g. the binary-release/ directory
            string url = "HTTP://LOCALHOUST:1000000";
            DownloadDataFromArtifactory(url);
        }

        private static void DownloadDataFromArtifactory(string url)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                // Scrape the directory listing HTML for file links
                string html = reader.ReadToEnd();
                Regex regex = new Regex(GetDirectoryListingRegexForUrl(url));
                MatchCollection matches = regex.Matches(html);
                if (matches.Count > 0)
                {
                    using (WebClient webClient = new WebClient())
                    {
                        foreach (Match match in matches)
                        {
                            if (match.Success)
                            {
                                Console.WriteLine(match.Groups["name"]);
                                // Download only files not already present locally
                                if (match.Groups["name"].Length > 5
                                    && DupeFile(match.Groups["name"].ToString(),
                                                "C:\\Users\\RLEbedEVS\\Desktop\\sourceFolder\\Http-server Download"))
                                {
                                    webClient.DownloadFile(
                                        "HTTP://LOCALHOUST:1000000" + match.Groups["name"],
                                        "C:\\Users\\RLEbedEVS\\Desktop\\sourceFolder\\Http-server Download\\" + match.Groups["name"]);
                                }
                            }
                        }
                    }
                }
            }
        }

        public static string GetDirectoryListingRegexForUrl(string url)
        {
            if (url.Equals("HTTP://LOCALHOUST:1000000"))
            {
                return "<a href=\".*\">(?<name>.*)</a>";
            }
            throw new NotSupportedException();
        }

        // Returns true when the file is NOT yet present in folderLocation,
        // i.e. when it still needs to be downloaded.
        private static bool DupeFile(string httpFile, string folderLocation)
        {
            string[] files = Directory.GetFiles(folderLocation);
            foreach (string s in files)
            {
                if (Path.GetFileName(s) == httpFile)
                {
                    return false;
                }
            }
            return true;
        }
    }
}
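For the first bullet — reading a file's timestamp without downloading it — HTTP itself offers this: a HEAD request returns the same headers as a GET, including Last-Modified, but no body. A minimal sketch using the same HttpWebRequest API as the code above (the URL is a placeholder, and not every server populates the header):

```csharp
using System;
using System.Net;

class RemoteTimestamp
{
    // Issues a HEAD request: only headers cross the wire, the file
    // body itself is never transferred.
    public static DateTime GetLastModified(string fileUrl)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(fileUrl);
        request.Method = "HEAD";
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // HttpWebResponse parses the Last-Modified header for us;
            // if the server omits it, this property defaults to the current time.
            return response.LastModified;
        }
    }
}
```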
Solution
After a few days in "HTTP server mode" I found a working solution to my problem, so I'm posting it here. You learn as you go: I now understand how the API works, and I realize the question as asked wasn't very clear.
public async Task GetPackages(string Feed, string Path, string filter, string retrievePath)
{
    // Constrain the year and month the file was put on the HTTP site.
    // {Feed} and {Path} select the API storage location; {retrievePath}
    // is where the latest zip file is downloaded to.
    int yearCheck = DateTime.Now.Year;
    int monthCheck = DateTime.Now.Month - 2;

    string uri = $"{_httpClient.BaseAddress}/api/storage/{Feed}/{Path}";
    var artifacts = new List<Artifact>();
    var filteredList = new List<Artifact>();

    // Call the REST API to get the list of all files in the {Feed} directory.
    string responseText = await _httpClient.GetStringAsync(uri);
    var response = JsonConvert.DeserializeObject<GetPackagesResponse>(responseText);
    if (response.Children.Count < 1)
        return;

    // Loop through the Children array to find all zip files from the last 3 months.
    foreach (var item in response.Children)
    {
        if (item.Folder)
            continue;

        var package = item.Uri.TrimStart('/');
        var fullPath = $"{uri}{item.Uri}";
        // The URI used for downloading this particular .zip file.
        var downloadUri = $"{_httpClient.BaseAddress}/{Feed}/{Path}/";

        // Filter by last-modified time: check each element's lastModified
        // field against the criteria — current year, and created within
        // the window defined by the variables above.
        var lastModified = await GetLastModified(fullPath);
        if (lastModified.Year == yearCheck && lastModified.Month > monthCheck)
            artifacts.Add(new Artifact(package, downloadUri, lastModified));
    }

    // Keep only the files needed for the update.
    foreach (var artifact in artifacts)
    {
        if (artifact.Package.ToString().Contains(filter))
        {
            filteredList.Add(artifact);
        }
    }

    // Sort by the LastModified field, newest first.
    List<Artifact> sortedList = filteredList.OrderByDescending(o => o.LastModified).ToList();
    if (sortedList.Count == 0)
        return;

    // Download only the first element of the list, which is the latest one available.
    ArtifactRetrieve.DownloadDataArtifactory(sortedList[0].DownloadUri, sortedList[0].Package, retrievePath);
}
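GetLastModified is not shown above; here is a sketch of how it could work, assuming an Artifactory-style storage API whose file-info JSON carries a lastModified field in ISO 8601 form (the field name is an assumption based on that API, and _httpClient is the same injected client used above):

```csharp
private async Task<DateTime> GetLastModified(string storageUri)
{
    // The storage API returns metadata JSON for a single file;
    // "lastModified" is assumed to hold an ISO 8601 timestamp.
    string json = await _httpClient.GetStringAsync(storageUri);
    var metadata = JObject.Parse(json);
    return (DateTime)metadata["lastModified"];
}
```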
My conclusions:
- There is no way to monitor the site other than adding a loop to the script; in other words, if I understand correctly, System.IO.FileSystemWatcher cannot be mimicked against an HTTP server (this assumption may be wrong).
- I can check dates and sort the retrieved entries in a list. This gives more control over which data I'm downloading, whether the newest or the oldest.
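The loop from the first conclusion can be sketched as a simple poller standing in for what FileSystemWatcher would do locally; the interval and the delegate passed in are placeholders:

```csharp
using System;
using System.Threading.Tasks;

class HttpSitePoller
{
    // Polls the HTTP site on a fixed interval, since there is no
    // push-based FileSystemWatcher equivalent for a plain HTTP server.
    public static async Task PollAsync(Func<Task> checkForNewFiles, TimeSpan interval)
    {
        while (true)
        {
            // e.g. () => GetPackages(feed, path, "MyFileName_Version", downloadDir)
            await checkForNewFiles();
            await Task.Delay(interval);
        }
    }
}
```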