Thursday, August 31, 2017

OneDrive Problems and Windows 10 Creator Update


Another departure from my normal blog posts (which are usually developer-related, about .NET, C# and database stuff). The title of this post is a bit misleading ... there is probably nothing at all wrong with the Creator Update and OneDrive, but I panicked and over-reacted. Imagine that!

First, just in case you're lazy and don't want to bother reading my lovely narrative below ...

TL;DR 
How to get your OneDrive "re-set" if things get screwed up:
  1. Exit OneDrive from the right-click options.
  2. Go to the C:\Users\YourUserName\AppData\Local\Microsoft\OneDrive\settings folder. Make a copy of the settings folder somewhere, just in case something gets screwed up.
  3. Delete everything in the folder (so, you still want the settings folder, but it should be empty).
  4. Start OneDrive back up again in one of two ways:
    • Search for OneDrive in the Start Menu and run it, which I didn't do.
    • Go to C:\Users\YourUserName\AppData\Local\Microsoft\OneDrive\Update and run the OneDriveSetup.exe. This is what I did (since it was the first thing suggested to me). I'm guessing that running OneDrive.exe would probably run the OneDriveSetup.exe anyway, since it knows that there's nothing in the settings folder!
  5. Once OneDrive has started:
    • It will ask you for your OneDrive login credentials.
    • It will ask you where your local OneDrive folder is (don't worry, it didn't lose it or wipe it out or anything). None of your files get lost.
    • Then, you get to the part where you specify the folders to sync.
    • And everything works fine after that!
OK, now that we got *that* out of the way, here's what happened to cause this whole mess:

I had updated (unknowingly) to the Creator Update over the weekend. The description for the update is totally misleading, because it just says "Update for Windows 10 Version 1607 for x64-based Systems (KBxxxxxxx)".  The Creator Update is version 1703, so you would *think* that the description would be something like "Update for Windows 10 to Version 1703" ... but Noooo.   I don't remember the KB number for that particular update (I subsequently got another Update today, Monday, that is apparently also the Creator Update, which I have not yet applied ... it's KB4033637, but good luck finding anything about it).

The Update totally got my OneDrive all confused (or so I thought), and rather than try and figure out how to fix it, I panicked and just went back to my previous Windows version. You can easily do that by going to Settings | Update and Security | Recovery and then choosing Get Started under the heading "Go back to an earlier build" (be careful with this one, though ... if it's not done soon after the update, any changes you've made to your system since then will be lost when it reverts to the earlier build). Everything looked fine, so I just went on about my business (well, messing around with my computer on a Saturday isn’t quite business).

Later in the day, I got an email from OneDrive about a lot of files in my Recycle Bin.  I went to OneDrive online and saw that at least 800 files had been plopped in the online Recycle Bin!!! These are photos (I take a lot of pictures; I'm a photography hobbyist).  Well, I was sure that it had happened because of the hijinks of the Creator Update and so, silly me, I moved them out of the Recycle Bin and back where they belonged. And then it was time for dinner and a movie.

Several hours later, I checked it before I went to bed and OneDrive was still downloading files. I went to look at the files locally and saw that I was getting dangerously low on C: drive space. What happened?!?  It was downloading *EVERYTHING*!!!!!!  And I had *NOT* had it set up that way previously. Under my Photography folder (almost 6000 files in 38 folders), I had OneDrive set up to sync only this year’s photos (fewer than 800 files). I immediately Paused the Sync and went to bed. Deal with it in the morning …

So now it's Sunday morning ... I figured that I could just go to the Settings and put things back the way they were (only syncing the current year's photo folders).  Nope!!  I couldn’t access Settings while Syncing was paused. OK, let’s try again.  Un-paused the Sync, went to the Settings and tried to uncheck one lousy folder … but the OK button stubbornly remained disabled. I made sure that I did this on one of the folders that had already synced overnight, just in case that was the problem. It still didn’t work. OK, let's go to File Explorer and access the Settings via right-clicks. But, still no joy ... there was absolutely *NO* OneDrive-related *anything* in File Explorer (such as “View on OneDrive.com”, etc. … nothing!!).

I finally got tired of Googling for an answer and asked around on a few MVP-related resources. I almost instantly got a reply from one of my fellow MVPs, Mike Halsey (@MikeHalsey on Twitter … or Google him if you're interested, you'll get a lot of hits). He suggested I try the following steps and *it worked*!!! I'll repeat them here, even though they're at the top of the post, because I've added to them just a bit:
  1. Exit OneDrive from the right-click options.
  2. Go to the C:\Users\YourUserName\AppData\Local\Microsoft\OneDrive\settings folder. Make a copy of the settings folder somewhere, just in case something gets screwed up.
  3. Delete everything in the folder (so, you still want the settings folder, but it should be empty).
  4. Start OneDrive back up again in one of two ways:
    • Search for OneDrive in the Start Menu and run it, which I didn't do.
    • Go to C:\Users\YourUserName\AppData\Local\Microsoft\OneDrive\Update and run the OneDriveSetup.exe. This is what I did (since Mike suggested that first). I'm guessing that running OneDrive.exe would probably run the OneDriveSetup.exe anyway, since it knows that there's nothing in the settings folder!
  5. Once OneDrive has started:
    • It will ask you for your OneDrive login credentials.
    • It will ask you where your local OneDrive folder is (don't worry, it didn't lose it or wipe it out or anything). None of your files get lost. This is the part that freaked me out when the Creator Update started asking me this right off the bat. I really thought that all was lost and, like I said, I panicked. Silly me.
    • Then, you get to the part where you specify the folders to sync.
    • And everything works fine after that!
Pretty easy, although it was a bit nerve-wracking. ;0)  Thanks, Mike!!

I will probably go ahead and try the Creator Update again, because I *really* want the OneDrive Placeholders, like we had back in Windows 8.1. I think the weird stuff I was seeing really wasn't a problem. Although, I won't swear to that!

Sunday, May 28, 2017

Accessing Oracle Databases in .NET

We recently went through hell trying to find an Oracle provider that actually works and isn't a huge set of files that are a pain to install. We are primarily a SQL Server "shop" and hadn't had to use Oracle in a long, long time. So, this was quite an ordeal. I've been using SQL Server for 17 years (no expert, but I know my way around) ... but Oracle always mystifies me. It's just so different!

Anyway, I was not the one looking for a valid provider; it was my husband (we work together) ... and he finally found something that worked. And, luckily, he found it before he pulled out all his hair (and he *does* have a lot of it)!  You'll need to use this NuGet package (it's only a 2.5 MB download and quite easy to install, as most NuGet packages are):

https://www.nuget.org/packages/Oracle.ManagedDataAccess/

If you go to the above link, you'll see that all you have to do is install the package from the NuGet Package Manager Console command line. The Package Manager Console is built into Visual Studio (2012 and later); find it under Tools | NuGet Package Manager | Package Manager Console. If, for some reason, you do not see the NuGet stuff under Tools (either because you have a VS earlier than 2012 or it just wasn't installed when you installed VS), then it can be installed manually as shown here:

https://docs.microsoft.com/en-us/nuget/guides/install-nuget#nuget-package-manager-in-visual-studio

I should also mention that when you add the Oracle.ManagedDataAccess NuGet package, it'll add a section to the config file that contains a sample data source entry:

<oracle.manageddataaccess.client>
  <version number="*">
    <dataSources>
      <dataSource alias="SampleDataSource" descriptor="(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=localhost)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=ORCL)))"/>
    </dataSources>
  </version>
</oracle.manageddataaccess.client>

We added the following <connectionStrings> setting in the config:

<connectionStrings>
  <add name="Oracle" connectionString="DATA SOURCE=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=XXX.XXX.XXX.XXX)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=MyDatabase)));User id=MyID;Password=MyPwd;"/>
</connectionStrings>

Obviously, change the XXX.XXX.XXX.XXX to your server's IP address, and change MyDatabase, MyID and MyPwd to your own settings.
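
Once the package is installed (Install-Package Oracle.ManagedDataAccess from the Package Manager Console) and the connection string is in your config, using it from C# looks almost exactly like the SqlClient code you're probably used to. Here's a minimal sketch ... the table name MYTABLE is just a placeholder, and you'll need a reference to System.Configuration for ConfigurationManager:

using System;
using System.Configuration;                // for ConfigurationManager
using Oracle.ManagedDataAccess.Client;     // from the NuGet package

class OracleDemo
{
    static void Main()
    {
        // Pull the "Oracle" connection string we added to the config file above
        string connString = ConfigurationManager.ConnectionStrings["Oracle"].ConnectionString;

        using (OracleConnection conn = new OracleConnection(connString))
        {
            conn.Open();

            // MYTABLE is just a made-up name ... use one of your own tables here
            using (OracleCommand cmd = new OracleCommand("select * from MYTABLE", conn))
            using (OracleDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    Console.WriteLine(dr[0]);
                }
            }
        }
    }
}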

That should do the trick! Happy coding!  =0)

Friday, March 31, 2017

Fire and Forget

Lately, I've run across a few questions in the Forums about being able to spin off a thread that might contain a long-running process that doesn't need to be monitored in any way. It's what's called a "Fire and Forget" process. This is useful for processes where the caller doesn't need any direct feedback from the process; it just needs to initiate that process and then move on. I can see where this could be quite useful for running some SQL scripts or Stored Procedures on a database.

For example, say that you have a UI (WinForms or WPF) and a button click (or whatever) to start the Processes. You don't want to block the UI thread, so that's an important thing to keep in mind. Let's also say that we have another class we use that contains all the processes that we want to Fire and Forget.  An excellent way to deal with all this is to initiate Fire and Forget threads using Task.Run().

In order to simulate this, I'm going to write to a TextBox from the UI thread, and use a Console.WriteLine() in the Fire and Forget processes to show the progress of each call. When running your application from Visual Studio, you can see the output from the Console.WriteLine() in the Output window (look for it under "Debug | Windows | Output" if you don't already have it show up when you're debugging).

So, first look at this code:

Utils util = new Utils();
Console.WriteLine("Run Tasks in 'for' loop ...");
for (int i = 0; i < 10; i++)
{
    Task.Run(() =>
    {
        util.FireAndForget(i);
    });

    this.TextBox1.Text += string.Format("Started Thread {0} ...\r\n", i);
}

And here is the FireAndForget() method in the Utils class:

public void FireAndForget(int i)
{
    Console.WriteLine("Starting Task #{0}", i);
    // Simulate long running thread, but make them random time periods
    var rand = new Random();
    Thread.Sleep(rand.Next(10000)); // 10 seconds or less
    Console.WriteLine("Completed Task #{0}", i);
}

Running this code, you can see in the UI that the TextBox sequentially lists all 10 threads as having started, and if you watch the Output window, you can see that the TextBox shows all 10 before the Output window does (in other words, each task is running on a separate thread). And they are Completed at different times.

But, wait ... something is wrong!!  Notice in the Output window that you will often see Tasks with duplicate numbers! And, if you comment out the line for setting the TextBox1.Text, you will see every single Task has the number 10!! Why is this happening?

It has to do with something called Closures. The i variable used in Task.Run(() => { util.FireAndForget(i); }) is, by design, read when the delegate actually runs, so it uses the current value of i (10 by then), not the value of i when the delegate was created (0 thru 9). "Closures close over variables, not over values." That's a quote from Eric Lippert's blog post, which will probably explain things a lot better than I can. See it here (and note that he has a link to Part 2 also): https://blogs.msdn.microsoft.com/ericlippert/2009/11/12/closing-over-the-loop-variable-considered-harmful/
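
Here's a tiny stand-alone illustration of what "closing over the variable" means (my own little sketch, not from Eric's post):

int x = 1;
Action print = () => Console.WriteLine(x);  // the lambda captures the variable x, not its current value
x = 2;
print();  // prints 2, because x is read when the lambda actually runs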

In case you were wondering, the reason that setting the TextBox1.Text in the UI seems to lessen the effect is simply the extra delay that updating the UI adds to each pass through the loop ...

There are two ways around this:

for (int i = 0; i < 10; i++)
{
    // By creating a new variable each time through the for loop
    // and using that instead, you can avoid the problem
    int ii = i;
    Task.Run(() =>
    {
        util.FireAndForget(ii);
    });

    this.TextBox1.Text += string.Format("Started Thread {0} ...\r\n", i);
}

If you've looked at the link to Eric's blog post, you'll see that Microsoft decided to fix this issue in C# 5, but *only* in the foreach, not in the for. So, in the foreach version, you could do it like this with no problems:

Console.WriteLine("Run Tasks in 'foreach' loop ...");
int[] iArray = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 };
foreach (int i in iArray)
{
    Task.Run(() =>
    {
        util.FireAndForget(i);
    });
    this.TextBox1.Text += string.Format("Started Thread {0} ...\r\n", i);
}

So, I diverged a bit from the original intent of this post, showing both the Fire And Forget process, and the little "gotcha" that you might have encountered had you used a for instead of a foreach.
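
One more thing worth keeping in mind with Fire and Forget: since nothing ever awaits the Task, any exception thrown inside it won't surface anywhere useful. A simple way to handle that (just a sketch ... adapt the logging to whatever you use) is to wrap the call in a try/catch:

foreach (int i in iArray)
{
    Task.Run(() =>
    {
        try
        {
            util.FireAndForget(i);
        }
        catch (Exception ex)
        {
            // Nobody is watching this Task, so at least write the problem somewhere
            Console.WriteLine("Task #{0} failed: {1}", i, ex.Message);
        }
    });
}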

Happy coding!  =0)

Sunday, February 26, 2017

It's All About The Data

Every application has to deal with *some* kind of data. Where it's stored externally and how it's used in the application can vary widely ... but in this blog post, I will deal with SQL Server database storage and reading the data into either a DataSet, DataTable or a List of objects. I will *not* talk about Entity Framework nor LINQ-to-SQL, mainly because I pretty much only use DataSets.

I have seen many posts on Forums, blogs and elsewhere recommending the use of the Load() method of a DataSet or DataTable. The Load() method takes an IDataReader parameter (so, in this case, I'd use a SqlDataReader). According to the many times that I've seen this recommended, it is supposed to be the fastest way to do this.

Personally, I have always used a SqlDataAdapter and its Fill() method to put the result set(s) returned from a SQL call into a DataTable or a DataSet. And I always thought that this was the fastest (although I had never tested that).

A couple of days ago, I saw this Load() method recommended again, several times ... and again I wondered about the performance of the Load() vs the Fill(), so I decided that it's about time to run some performance tests to put the question to rest, once and for all. And, while I was at it, I also decided to throw in a test putting data into a List<T> in addition to the DataSet/DataTable tests (not for my benefit, since I seldom use List<T> in this manner; but for you, Dear Reader).

So, first, here is the benchmarking code:

private void TestLoadvsFillvsList()
{
    // Requires: using System; using System.Collections.Generic; using System.Data;
    //           using System.Data.SqlClient; using System.Diagnostics;
    DataTable dtGlobal;
    DataSet dsGlobal;
    decimal LoadMilli;
    decimal FillMilli;
    decimal ListMilli;

    Stopwatch oWatch = new Stopwatch();

    // The DataTable.Load(IDataReader) method:
    using (SqlConnection conn = new SqlConnection(this.ConnectionString))
    {
        SqlCommand sc = new SqlCommand("select * from Logdata", conn);
        conn.Open();
        dtGlobal = new DataTable();

        oWatch.Start();
        SqlDataReader dr = sc.ExecuteReader();
        dtGlobal.Load(dr);
        oWatch.Stop();

        // If you have multiple SELECTs in your SqlCommand, you can put the multiple result sets
        // into a DataSet with syntax similar to the following examples:
        //dsGlobal.Load(dr, LoadOption.OverwriteChanges, dtLogData, dtMessage);
        //dsGlobal.Load(dr, LoadOption.OverwriteChanges, "Table1", "Table2");
    }
    LoadMilli = oWatch.ElapsedMilliseconds;
    oWatch.Reset();
    Console.WriteLine("Load Time {0} milliseconds, Row Count {1}", LoadMilli, dtGlobal.Rows.Count);

    // The DataAdapter.Fill(DataSet/DataTable) method:
    using (SqlConnection conn = new SqlConnection(this.ConnectionString))
    {
        SqlCommand sc = new SqlCommand("select * from Logdata", conn);
        conn.Open();
        dsGlobal = new DataSet();

        oWatch.Start();
        SqlDataAdapter da = new SqlDataAdapter(sc);
        da.Fill(dsGlobal); // could also Fill(dtGlobal)
        oWatch.Stop();
    }
    FillMilli = oWatch.ElapsedMilliseconds;
    oWatch.Reset();
    Console.WriteLine("Fill Time {0} milliseconds, Row Count {1}", FillMilli, dsGlobal.Tables[0].Rows.Count);

    // The while (dr.Read()) method:
    List<LogData> logList = new List<LogData>();
    using (SqlConnection conn = new SqlConnection(this.ConnectionString))
    {
        SqlCommand sc = new SqlCommand("select * from Logdata", conn);
        conn.Open();

        LogData oLog;
        oWatch.Start();
        SqlDataReader dr = sc.ExecuteReader();
        while (dr.Read())
        {
            oLog = new LogData();
            oLog.logdatakey = (long)dr["logdatakey"];
            oLog.logdatetime = (DateTime)dr["logdatetime"];
            oLog.message = dr["message"].ToString();
            oLog.category = dr["category"].ToString();
            logList.Add(oLog);
        }
        oWatch.Stop();
    }
    ListMilli = oWatch.ElapsedMilliseconds;
    oWatch.Reset();
    Console.WriteLine("List Time {0} milliseconds, Row Count {1}", ListMilli, logList.Count);
    Console.WriteLine("------------------------------------------------");

    if (LoadMilli > FillMilli)
        Console.WriteLine("Fill is {0:0.00} times faster than Load", LoadMilli / FillMilli);
    else
        Console.WriteLine("Load is {0:0.00} times faster than Fill", FillMilli / LoadMilli);

    Console.WriteLine("A List of objects is faster than either one!");
    Console.WriteLine("List is {0:0.00} times faster than Load", LoadMilli / ListMilli);
    Console.WriteLine("List is {0:0.00} times faster than Fill", FillMilli / ListMilli);
    Console.WriteLine("------------------------------------------------");
}

public class LogData
{
    public long logdatakey { get; set; }
    public DateTime logdatetime { get; set; }
    public string message { get; set; }
    public string category { get; set; }
}

OK, so now notice in the last set of Console.WriteLine() statements above where I state that using a List<T> is *always* faster than using a DataSet/DataTable (and by a lot, as I'll get to in a minute). What this tells me is that if you have no use for DataSets at all, then you'll do just fine using a SqlDataReader to add your data to a List<T>, and then use your List elsewhere in your processing.

I used a sample size of 425,376 rows in the database table. The average times for the 3 methods were approximately as follows, in milliseconds:

Load 4040
Fill 3030
List 2020

So, doing the math:
The Fill is about 1.33 times faster than the Load.
The List is about 2 times faster than the Load.
The List is about 1.5 times faster than the Fill.

I feel vindicated! I can now safely reply to these forum posts that, indeed, the .Fill() is significantly faster than the .Load()!!

Happy coding!! :0)

Tuesday, January 17, 2017

A Lot of Misbehaving - Bad Computer!!

Last week, both of my computers decided to misbehave at almost the same time. Windows Defender was the first to misbehave by hogging 40% of my CPU, which left my fan running constantly and my computer getting pretty warm ... this was on my Surface Pro 4, running Windows 10. Next, Windows Update did nearly the same thing, again hogging CPU and causing a constantly running fan and excess heat … this was on my Dell Venue 11 Pro, running Windows 8.1. Luckily for me, the Dell didn’t start having problems until after I had resolved the problems on the Surface, but I only got a day of breathing room. Talk about a frustrating week!!! So, how did I fix these naughty computers? Read on …

Windows Defender

The weird thing about this problem was that this computer wasn’t even idle at the time … Defender isn’t supposed to be running any kind of Scan while the computer is in use, and I was *definitely* using it when this mess started! But, I left it running, assuming it would finish up soon … 4 hours later, it was still running!  I realized later (when trying a Quick Scan on my other computer), that my scans only take about 3-8 minutes, not hours and hours and hours!! That should have been my first clue. I Googled and saw several suggestions, none of which helped:

  • I tried rebooting, didn’t help.
  • I set it to exclude scanning the file MsMPEng.exe, didn’t help.
  • I tried scanning with sfc (System File Check, using /verifyonly rather than /scannow), it told me I had no problems.

I hibernated overnight, because I didn’t want it overheating during the night. Obviously, when I brought it back from hibernation the next morning, it continued running at 40% CPU, as expected, and I shut it down again.

Later that day, I ran Defender from its UI on my *other* computer and I noticed that it shows a progress bar and a running count of the files it is scanning (I guess I never bothered to run it from the UI before). So, I decided to try a Quick Scan on *this* computer, even though Defender was still scanning (the 40% CPU). Lo and behold, the UI showed that it got stuck after scanning 12,471 files … number 12,472 just stayed up there and now my CPU usage rose to nearly 80%!  Yikes!!  I shut it down again, and worked the rest of the day from my other computer (I do all of my development work on VM’s that are hosted somewhere else on our network, so I could actually work on stuff from either computer).

Over the course of the day, I periodically started it up just to check my email or to try something I found while Googling (and hoping that just the act of restarting it might eventually magically make it stop), and then would shut it down again because no magic happened and the CPU was still running high. The only thing different that I did before shutting it down that night was to Disable the Scheduled Scan in the Task Scheduler, so it would at least not keep trying to run a scan.

The next morning after startup, it continued misbehaving and I continued shutting it down and starting it back up again to check my email. After a couple of rounds of this up and down, finally magic *did* happen! All of a sudden it started working. I think that Disabling the Scheduled Scan definitely helped. But the real kicker was my discovery from a fellow MVP (Derek Knight) as to *why* I probably had the problem in the first place.

Back in the day (ending with Windows 7), a full ShutDown was always better than a Restart because, basically, when you started back up from a full ShutDown it was starting up totally “from scratch”, like a cold boot. A Restart didn’t shut everything down before it came back up and didn’t do all of the “checks” that a full start up did. But now, starting with Windows 8, it’s totally the opposite behavior! And the problem that I was having was most likely related to always doing a ShutDown. Derek says that “Defender has an annoying habit of getting hung up on scanning a non existent driver that doesn't exist any longer. A full Restart loads a new instance of the drivers. That normally solves it. This can also happen when you shut down, if Fast Startup is enabled. After 3 or 4 start ups, some existing drivers don't fully load so Defender sees them as corrupt. They are in memory but not physically able to be scanned.”

Fast Startup was most likely the culprit … just prior to all hell breaking loose with my Defender problems, I had been having a lot of issues/warnings about a corrupted bluetooth mouse driver. Restarting seemed to solve the problem, but sometimes it didn’t (probably when I ShutDown instead of Restarted). Anyway, read this very interesting article from “How-To Geek” about Fast Startup. I turned mine off and have not had any problems since:

http://www.howtogeek.com/243901/the-pros-and-cons-of-windows-10s-fast-startup-mode/

Windows Update

No sooner had I fixed the Defender problem on my Surface than my Dell Venue started misbehaving with Windows Update. Arrrrgggghhhh!!!   =0(   Apparently, Update was getting stuck while “Checking for Updates”. Maybe it was stuck on a file download or maybe it was something else; I don’t know what. I only know that it would run my CPU up to about 25-30% and never stop!

So, I ended up shutting off Windows Update until I could find a solution. To turn it off, go to Control Panel | System and Security | Windows Update | Change Settings and choose “Never check for updates”.

I had found a few suggestions to try while Googling, like running the built-in Update Troubleshooter, which you can find by typing “troubleshooter” in the Start | Search box and clicking the Troubleshooter result. Then start Troubleshooting by clicking on System and Security | Fix Problems With Windows Update.

It didn’t work for me, but it’s a good starting place.

Next, it was “How-to Geek” to the rescue again:

http://www.howtogeek.com/247380/how-to-fix-windows-update-when-it-gets-stuck/

The How-To Geek offers several things you can try to fix the problem. I had already tried one or two of them that I had found from my previous Googling. I decided that I was so sick and tired of trying every little suggestion I found when none of them ever worked, that I was going to skip to the end of the article and try the sure-fire, easy way (according to How-To Geek, it has always worked for them in the past when none of the other solutions did) … and that is using the third-party tool called WSUS Offline Update. The link to download it is provided in the above article, as well as instructions as to what to do with it once you’ve downloaded it. What WSUS Offline Update does is download a bunch of Windows updates, depending on which Windows version you’ve got and which updates you’ve already installed. Be forewarned that it may take a while to download if you’re way behind on updates. Mine took all night to download (but my Internet connection stinks … so YMMV). Installing the updates took a *lot* less time than downloading them. But, it’s the end result that matters and when all is said and done, this *worked*!!!! No more stalling when it’s trying to Check for Updates!

Oh, and don’t forget to change the Windows Updates settings back to “Install Updates Automatically”.

Wednesday, December 28, 2016

Always Use Environment.SpecialFolders

Never, *ever*, hard-code a path to directories like User Documents or AppData!! As far back as Windows XP, a user could move those directories off of the C drive (from C:\Users\MyName\……) and put them on another drive. This is mainly done for two reasons:

  1. To save space on the C drive.
  2. To keep data separate from the OS and other installed programs.

So, in your application, if you hard-coded a path to C:\Users\MyName\Documents you could be in trouble if you tried to write a file there and the user had “re-mapped” the folder elsewhere … because you’d get a “Directory Not Found” exception. Instead, use the Environment.SpecialFolder enum. (Please note that this probably does NOT apply to UWP apps … and since I don’t write those currently, I can’t speak authoritatively about the correct way to determine these folders in a UWP app.)

Here’s an example:

string DocFolder = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
string MyFile = "MyTestFile.txt";
File.WriteAllText(Path.Combine(DocFolder, MyFile), "This is my test data...");

The above code-snippet will result in the file being written where a user expects it to be written.

There are a number of these SpecialFolders, such as:

Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
Environment.GetFolderPath(Environment.SpecialFolder.CommonDocuments);

You can explore them yourself using Intellisense to see what else there is.

Another handy File/Directory trick to have up your sleeve is the use of the Path.GetTempPath() and Path.GetTempFileName() methods.  There is typically a Temp directory in the User’s local AppData folder, so again it’s very important to use Path.GetTempPath() if you intend to read/write a file in this Temp directory.
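
For example (the file name here is just something I made up):

string tempDir = Path.GetTempPath();   // typically C:\Users\YourUserName\AppData\Local\Temp\
string scratchFile = Path.Combine(tempDir, "MyScratchData.txt");   // hypothetical file name
File.WriteAllText(scratchFile, "This is my temporary data...");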

The Path.GetTempFileName() method is great if you don’t really care about the name of the file you want to write, if it truly is a quickie temp file that you will use right away and then delete. What the GetTempFileName() method does is create a zero-byte file for you and return the path to it, including the randomly-generated name it used. Here’s an example of how you might use it:

string temp = Path.GetTempFileName();
try
{
    File.WriteAllText(temp, "This is data in a temporary file!");
    // Then do something with that file, whatever your temporary needs are
}
finally
{
    File.Delete(temp);
}

One last thing that I should mention. I have not found a way to access the Downloads folder using the Environment.SpecialFolder enum or any other built-in .NET method. A fellow named Ray Koopa has a Code Project article in which he has written a wrapper class to access all the Special Folders, Downloads included. He also has created a NuGet package for his wrapper. You can find it all here:

https://www.codeproject.com/articles/878605/getting-all-special-folders-in-net 
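
If you'd rather not take on another dependency, you can also get at the Downloads folder by calling the Windows shell API directly via P/Invoke. Here's a minimal sketch ... the GUID below is, as far as I know, the well-known FOLDERID_Downloads value, but verify it against the Windows SDK before relying on it:

using System;
using System.Runtime.InteropServices;

static class KnownFolders
{
    // FOLDERID_Downloads (verify this GUID against KnownFolders.h in the Windows SDK)
    private static readonly Guid DownloadsFolderId = new Guid("374DE290-123F-4565-9164-39C4925E467B");

    [DllImport("shell32.dll")]
    private static extern int SHGetKnownFolderPath(
        [MarshalAs(UnmanagedType.LPStruct)] Guid rfid, uint dwFlags, IntPtr hToken, out IntPtr ppszPath);

    public static string GetDownloadsPath()
    {
        IntPtr pathPtr = IntPtr.Zero;
        try
        {
            int hr = SHGetKnownFolderPath(DownloadsFolderId, 0, IntPtr.Zero, out pathPtr);
            if (hr != 0)
                Marshal.ThrowExceptionForHR(hr);
            return Marshal.PtrToStringUni(pathPtr);
        }
        finally
        {
            if (pathPtr != IntPtr.Zero)
                Marshal.FreeCoTaskMem(pathPtr);
        }
    }
}

Usage would just be string downloads = KnownFolders.GetDownloadsPath(); ... but honestly, Ray's package is the easier route if you need more than just Downloads.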

Happy Coding! =0)

Tuesday, November 29, 2016

Microsoft MVP Summit 2016

I've been a Microsoft MVP since 2003. I still consider myself basically a C# MVP, but Microsoft has rolled up a lot of developer categories under one umbrella now: Visual Studio and Development Technologies. As you may know, being a Microsoft MVP means helping the community, in my case, the .NET community of developers. I do this with my blog posts and the many questions that I try to answer on the MSDN forums: https://social.msdn.microsoft.com/Forums/vstudio/en-US/home.

Many of my readers have probably heard of the annual Summit in which a few thousand MVPs from around the world descend on the Microsoft campus for 4-5 days of geekiness. I have attended every Summit since I've been an MVP. This year's Summit was interesting though ... since Microsoft has gone Open Source with most of .NET's source code, not everything at the Summit this year was under NDA (Non-Disclosure Agreement). It was hard to keep track of what was and what wasn't, so I decided it was easier just to not talk about anything when I got back.

But we *were* told this: that a lot of what we were hearing about at the Summit would become public the following week at the Connect conference in NYC. And so it did ... the conference was streamed live at the time, but the videos are available at https://connectevent.microsoft.com/ ... watch and be amazed!

On the last day that I was at the Summit, I participated in the MVP Hackathon, where we get to write code and explore new features alongside Microsoft engineers. Kinda geeky and fun!  Here's a public link about the event: https://blogs.msdn.microsoft.com/webdev/2016/11/22/mvp-hackathon-2016/  I'm not in the group picture that was taken at the end of the day because I had to leave after lunch (pizza!) to drive home. It's not a long drive (100 miles), but I wanted to get home before dark (and the sun sets early this time of year) ... I really don't like to drive at night. During the Hackathon, I was learning how to write extensions to Visual Studio ... not a new thing, but something that I hadn't done before, so it was new to me. The other people in my group doing extensions wrote some awesome stuff: Nico Vermier, David Gardiner, Cecilia Wiren and Terje Sandstrom. Their stuff is all on GitHub, so take a look at the Hackathon link above and check out some of the great extensions that they wrote. I never finished my extension ... because I had to leave early and didn't really have anything specific in mind that I wanted to accomplish, I just wanted to learn how to do it. I should have finished my experimentation when I got home, but life and work get in the way sometimes. ;)

Happy coding ...