I'm trying to automate a task. I have to repeat the task 14 times and want my application to do it for me. My application runs once without a problem. Can anyone explain how I can do this without duplicating code?
My application scrapes HTML from a URL and then saves it to a location on my computer (the HTML is an email signature). I need the application to do this for 14 different people.
Example:
// person 1
string urlAddress1 = "http://www.url.com/person1";
string filePath1 = @"C:\Users\ellio\Desktop\test\person1.htm";
// person 2
string urlAddress2 = "http://www.url.com/person2";
string filePath2 = @"C:\Users\ellio\Desktop\test\person2.htm";
// etc.
using System;
using System.Net;
using System.Text;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

namespace GetSignatureHtml
{
    class Program
    {
        static void Main(string[] args)
        {
            string urlAddress = "http://www.url.com/person1";
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(urlAddress);
            HttpWebResponse response = (HttpWebResponse)request.GetResponse();

            if (response.StatusCode == HttpStatusCode.OK)
            {
                Stream receiveStream = response.GetResponseStream();
                StreamReader readStream = null;

                if (response.CharacterSet == null)
                {
                    readStream = new StreamReader(receiveStream);
                }
                else
                {
                    readStream = new StreamReader(receiveStream, Encoding.GetEncoding(response.CharacterSet));
                }

                string data = readStream.ReadToEnd();

                using (FileStream fs = new FileStream(@"C:\Users\ellio\Desktop\test\person1.htm", FileMode.Create))
                using (StreamWriter w = new StreamWriter(fs, Encoding.UTF8))
                {
                    w.WriteLine(data);
                }

                Console.Write(data);
                Console.ReadKey(true);

                response.Close();
                readStream.Close();
            }
        }
    }
}
I'd suggest passing an argument into the Main method which is a file containing all the URLs to fetch. You can then read all those URLs into a list or array (e.g. with File.ReadAllLines), and infer the output file from each URL. Move most of your current code into a method that just accepts the URL, so the code will look something like this:
class Program
{
    static void Main(string[] args)
    {
        if (args.Length != 1)
        {
            // Display some error message here
            return;
        }
        string[] urls = File.ReadAllLines(args[0]);
        foreach (var url in urls)
        {
            DownloadUrl(url);
        }
    }

    private static void DownloadUrl(string url)
    {
        // Put most of your current code in here.
        // You need to infer the name of the file to save -
        // consider using new Uri(url), then Uri.LocalPath
    }
}
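Filled in, DownloadUrl might look something like this. This is only a sketch: it reuses the request/encoding logic from the question, assumes the same output directory as the question's hard-coded path, and builds the file name by appending ".htm" to the last segment of the URL path.

    private static void DownloadUrl(string url)
    {
        // For "http://www.url.com/person1", Uri.LocalPath is "/person1",
        // so Path.GetFileName gives us "person1".
        string name = Path.GetFileName(new Uri(url).LocalPath);
        // Assumed output directory, taken from the question.
        string outputPath = Path.Combine(@"C:\Users\ellio\Desktop\test", name + ".htm");

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Fall back to the default encoding when the response doesn't declare one.
            Encoding encoding = response.CharacterSet == null
                ? Encoding.UTF8
                : Encoding.GetEncoding(response.CharacterSet);
            using (Stream receiveStream = response.GetResponseStream())
            using (StreamReader reader = new StreamReader(receiveStream, encoding))
            {
                File.WriteAllText(outputPath, reader.ReadToEnd(), Encoding.UTF8);
            }
        }
    }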
Then you just need to put the URLs into a text file, and specify that when you run the code.
Note that this could all be done with just one URL per command line argument, but by the time you've got 14 URLs on a command line, it's going to be a bit painful to check. I tend to find that using a file for the data makes life easier.
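The input file is then just one URL per line — for instance, a hypothetical urls.txt passed as the single command-line argument:

    http://www.url.com/person1
    http://www.url.com/person2
    ...
    http://www.url.com/person14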
As asides on the rest of the code, you should use a using statement for the response, and I'd encourage you to use File.WriteAllText as a simpler way of creating the file. You might want to look at using HttpClient as an alternative approach to the whole thing, although I'm not sure whether it applies the response encoding in the same way that you do.
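An HttpClient version might look like the sketch below. Note the caveat above: GetStringAsync picks the encoding from the response headers itself, which may not match the question's manual CharacterSet handling. The output directory is again an assumption carried over from the question.

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    class Downloader
    {
        // HttpClient is intended to be created once and reused.
        private static readonly HttpClient client = new HttpClient();

        static async Task DownloadUrlAsync(string url)
        {
            // GetStringAsync decodes the body using the response's declared encoding.
            string data = await client.GetStringAsync(url);
            string name = Path.GetFileName(new Uri(url).LocalPath);
            File.WriteAllText(Path.Combine(@"C:\Users\ellio\Desktop\test", name + ".htm"), data);
        }
    }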