private static async Task ApplyAsync(CancellationToken cancellationToken, List<IFilePatchAction> actions, Action<DirectoryPatchPhaseProgress> progressCallback)
{
    // Initialize progress-related variables
    var progress = new DirectoryPatchPhaseProgress();
    var secondPhaseProgress = new DirectoryPatchPhaseProgress();
    secondPhaseProgress.SetTotals(actions.Count(m => m.PatchSize == 0), actions.Count(m => m.PatchSize == 0));
    secondPhaseProgress.State = DirectoryPatchPhaseProgress.States.Started;
    progress.SetTotals(actions.Count, (from a in actions select a.PatchSize).Sum());
    progress.State = DirectoryPatchPhaseProgress.States.Started;
    progressCallback(progress);

    // Start every action and report progress as each one completes
    var list = new List<Task>();
    foreach (var action in actions)
    {
        list.Add(action.ExecuteAsync().ContinueWith(task =>
        {
            progress.AdvanceItem(action.PatchSize);
            progressCallback(progress);
        }, cancellationToken));
    }

    // Wait for every action (and its progress continuation) to finish
    await Task.WhenAll(list);

    // We're done here; update our State and report progress
    progress.State = DirectoryPatchPhaseProgress.States.Finished;
    progressCallback(progress);
}
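// The ContinueWith continuations above run on the thread pool, so AdvanceItem and the
// progress callback can fire concurrently. The following is a minimal sketch, not code
// from this project, of the same fan-out with progress updates serialized behind a lock,
// assuming DirectoryPatchPhaseProgress is not itself thread-safe. ApplyWithLockedProgressAsync
// is a hypothetical name; ExecuteAsync, PatchSize, SetTotals, AdvanceItem and States come
// from the surrounding code.
private static async Task ApplyWithLockedProgressAsync(CancellationToken cancellationToken, List<IFilePatchAction> actions, Action<DirectoryPatchPhaseProgress> progressCallback)
{
    var progress = new DirectoryPatchPhaseProgress();
    progress.SetTotals(actions.Count, actions.Sum(a => a.PatchSize));
    progress.State = DirectoryPatchPhaseProgress.States.Started;
    progressCallback(progress);

    var progressLock = new object();
    var tasks = actions.Select(async action =>
    {
        cancellationToken.ThrowIfCancellationRequested();
        await action.ExecuteAsync();

        // Several actions can complete at the same time; serialize the shared progress updates
        lock (progressLock)
        {
            progress.AdvanceItem(action.PatchSize);
            progressCallback(progress);
        }
    }).ToList();

    await Task.WhenAll(tasks);

    progress.State = DirectoryPatchPhaseProgress.States.Finished;
    progressCallback(progress);
}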
internal async Task Analyze(CancellationToken cancellationToken, Action<IFilePatchAction> callback, Action<DirectoryPatchPhaseProgress> progressCallback, string instructions_hash)
{
    // Download instructions
    await PatchSource.Load("instructions.json", instructions_hash, cancellationToken, (done, total, totalThreads) => { });

    // Open the downloaded instructions.json and copy its contents into headerFileContents
    string headerFileContents;
    using (var file = File.Open(PatchSource.GetSystemPath("instructions.json"), FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    using (var streamReader = new StreamReader(file, Encoding.UTF8))
    {
        headerFileContents = streamReader.ReadToEnd();
    }

    // Deserialize the JSON data in headerFileContents
    List<FilePatchInstruction> instructions = JsonConvert.DeserializeObject<List<FilePatchInstruction>>(headerFileContents);

    // Sort instructions with a non-empty OldHash to the bottom of the list;
    // these act on complete files and must run last. Ordering before the sizes
    // are computed keeps each instruction paired with its own file size below.
    instructions = instructions.OrderBy(x => x.OldHash != "").ToList();

    // Initialize progress-related variables
    var progress = new DirectoryPatchPhaseProgress();
    var paths = instructions.Select(i => Path.Combine(_targetPath, i.Path));
    var sizes = paths.Select(p => !File.Exists(p) ? 0 : new FileInfo(p).Length).ToList();
    progress.SetTotals(instructions.Count, sizes.Sum());
    progress.State = DirectoryPatchPhaseProgress.States.Started;
    progressCallback(progress);

    // Process each instruction in instructions.json
    foreach (var pair in instructions.Zip(sizes, (i, s) => new { Instruction = i, Size = s }))
    {
        var instruction = pair.Instruction;
        cancellationToken.ThrowIfCancellationRequested();

        string targetFilePath = Path.Combine(_targetPath, instruction.Path);

        // Determine the action(s) to take based on the instruction; any new actions get passed to the callback
        await BuildFilePatchAction(instruction, targetFilePath, callback);

        // Update progress
        progress.AdvanceItem(pair.Size);
        progressCallback(progress);
    }

    // We're done here; update our State and report progress
    progress.State = DirectoryPatchPhaseProgress.States.Finished;
    progressCallback(progress);
}
private static async Task Apply(List<IFilePatchAction> actions, Action<DirectoryPatchPhaseProgress> progressCallback)
{
    // Initialize progress-related variables
    var progress = new DirectoryPatchPhaseProgress();
    progress.SetTotals(actions.Count, (from a in actions select a.PatchSize).Sum());
    progress.State = DirectoryPatchPhaseProgress.States.Started;
    progressCallback(progress);

    // Execute every patch action sequentially
    foreach (var action in actions)
    {
        // Execute action
        await action.Execute();

        // Update progress
        progress.AdvanceItem(action.PatchSize);
        progressCallback(progress);
    }

    // We're done here; update our State and report progress
    progress.State = DirectoryPatchPhaseProgress.States.Finished;
    progressCallback(progress);
}
internal async Task AnalyzeAsync(CancellationToken cancellationToken, Action<IFilePatchAction> callback, Action<DirectoryPatchPhaseProgress> progressCallback, string instructions_hash)
{
    // Download instructions
    await PatchSource.LoadAsync("instructions.json", instructions_hash, cancellationToken, (done, total, totalThreads) => { });

    // Read the downloaded instructions.json into headerFileContents
    string headerFileContents;
    using (var file = File.Open(PatchSource.GetSystemPath("instructions.json"), FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    using (var streamReader = new StreamReader(file, Encoding.UTF8))
    {
        headerFileContents = streamReader.ReadToEnd();
    }

    // Deserialize the JSON instruction list
    List<FilePatchInstruction> instructions = JsonConvert.DeserializeObject<List<FilePatchInstruction>>(headerFileContents);

    // Initialize progress-related variables
    var progress = new DirectoryPatchPhaseProgress();
    var paths = instructions.Select(i => Path.Combine(_targetPath, i.Path));
    var sizes = paths.Select(p => !File.Exists(p) ? 0 : new FileInfo(p).Length).ToList();
    progress.SetTotals(instructions.Count, sizes.Sum());
    progress.State = DirectoryPatchPhaseProgress.States.Started;
    progressCallback(progress);

    // Process each instruction
    foreach (var pair in instructions.Zip(sizes, (i, s) => new { Instruction = i, Size = s }))
    {
        var instruction = pair.Instruction;
        var size = pair.Size;
        cancellationToken.ThrowIfCancellationRequested();

        string targetFilePath = Path.Combine(_targetPath, instruction.Path);

        // Determine the action(s) to take for this instruction; new actions are passed to the callback
        await BuildFilePatchAction(instruction, targetFilePath, callback);

        // Update progress
        progress.AdvanceItem(size);
        progressCallback(progress);
    }

    // We're done here; update our State and report progress
    progress.State = DirectoryPatchPhaseProgress.States.Finished;
    progressCallback(progress);
}
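// A minimal usage sketch, not code from this project: collect the IFilePatchAction
// instances that AnalyzeAsync hands to its callback, then pass them to ApplyAsync.
// It assumes both methods live on the same class (something like a DirectoryPatcher);
// PatchAllAsync and ReportProgress are hypothetical names.
internal async Task PatchAllAsync(CancellationToken cancellationToken, string instructionsHash)
{
    var actions = new List<IFilePatchAction>();

    // Phase 1: download and analyze instructions.json, queuing one action per file that needs work
    await AnalyzeAsync(cancellationToken,
        action => actions.Add(action),
        phaseProgress => ReportProgress("analyze", phaseProgress),
        instructionsHash);

    // Phase 2: execute every queued action
    await ApplyAsync(cancellationToken, actions, phaseProgress => ReportProgress("apply", phaseProgress));
}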
public DirectoryPatchPhaseProgressItem(DirectoryPatchPhaseProgress parent)
{
    _parent = parent;
    _done = 0;
    _total = 0;
}
public DirectoryPatchPhaseProgressItem(DirectoryPatchPhaseProgress parent)
{
    Parent = parent;
    _Done = 0;
    _Total = 0;
}
private static async Task Apply(CancellationToken cancellationToken, List<IFilePatchAction> actions, Action<DirectoryPatchPhaseProgress> progressCallback)
{
    List<IFilePatchAction> tmpActions = actions.ToList(); // Create a copy of our list
    List<BackgroundWorker> bgWorkers = new List<BackgroundWorker>();
    var guid = Guid.NewGuid();

    // Initialize progress-related variables; secondPhaseProgress counts the zero-size actions
    var progress = new DirectoryPatchPhaseProgress();
    var secondPhaseProgress = new DirectoryPatchPhaseProgress();
    secondPhaseProgress.SetTotals(tmpActions.Count(m => m.PatchSize == 0), tmpActions.Count(m => m.PatchSize == 0));
    secondPhaseProgress.State = DirectoryPatchPhaseProgress.States.Started;
    progress.SetTotals(actions.Count, (from a in actions select a.PatchSize).Sum());
    progress.State = DirectoryPatchPhaseProgress.States.Started;
    progressCallback(progress);

    // Create our workers, but clamp the count to MaximumDeltaThreads; we don't want to cook people's PCs.
    // (Testing with all cores maxed out a 24-core server, but it did patch in about 30 seconds!)
    Logger.Instance.Write(
        $"Spawning Background Workers - Detected {Environment.ProcessorCount} processors, using {(Environment.ProcessorCount > MaximumDeltaThreads ? MaximumDeltaThreads : Environment.ProcessorCount)} of them");
    for (var i = 0; i < (Environment.ProcessorCount > MaximumDeltaThreads ? MaximumDeltaThreads : Environment.ProcessorCount); i++)
    {
        Logger.Instance.Write($"Spawning new background worker for task with an ID of {i}");
        bgWorkers.Add(new BackgroundWorker() { WorkerSupportsCancellation = true });
    }

    foreach (var backgroundWorker in bgWorkers)
    {
        Logger.Instance.Write("Assigning DoWork method to background worker");
        backgroundWorker.DoWork += async (sender, args) =>
        {
            // Keep working while incomplete actions remain and cancellation has not been requested
            while (tmpActions.Any(checker => !checker.IsComplete) && !backgroundWorker.CancellationPending)
            {
                IFilePatchAction thisAction;

                // Lock our tmpActions variable so we have unique access to it now
                lock (tmpActions)
                {
                    if (tmpActions.Any(filePatchAction => !filePatchAction.IsComplete && !filePatchAction.IsActive && filePatchAction.PatchSize > 0))
                    {
                        // Grab an action that is not complete, not active and has a non-zero patch size
                        thisAction = tmpActions.DefaultIfEmpty(null).FirstOrDefault(filePatchAction => !filePatchAction.IsComplete && !filePatchAction.IsActive && filePatchAction.PatchSize > 0);
                    }
                    else if (tmpActions.Any(l => !l.IsComplete && !l.IsActive && l.PatchSize == 0))
                    {
                        // No actions above zero patch size are left; take a zero-size action that still needs doing
                        thisAction = tmpActions.DefaultIfEmpty(null).FirstOrDefault(filePatchAction => !filePatchAction.IsComplete && !filePatchAction.IsActive && filePatchAction.PatchSize == 0);
                    }
                    else
                    {
                        // We're done, break out of our loop to close this thread
                        break;
                    }

                    if (thisAction == null)
                    {
                        continue;
                    }

                    thisAction.IsActive = true;
                }

                Logger.Instance.Write($"Starting action with file size of {thisAction.PatchSize}");

                try
                {
                    await thisAction.Execute();
                }
                catch (Exception ex)
                {
                    Logger.Instance.Write($"Error while attempting to apply a patch, error:\r\n{ex.Message}\r\n{ex.StackTrace}");

                    // Throw a new exception to let the GUI know
                    throw new Exception($"Unable to apply patch, we ran into an error\r\n{ex.Message}");
                }

                // Complete this action
                lock (tmpActions)
                {
                    // PatchSize will ONLY equal ZERO when the instruction expects the file to already be present,
                    // e.g. moving the file, renaming the file or setting its metadata timestamp.
                    // We have to check for zero to ensure that the asynchronous execution of this action
                    // does not attempt to move/rename/etc. a file before it has actually been downloaded.
                    if (thisAction.PatchSize == 0)
                    {
                        secondPhaseProgress.AdvanceItem(1);
                        progressCallback(secondPhaseProgress);
                    }
                    else
                    {
                        progress.AdvanceItem(thisAction.PatchSize);
                        progressCallback(progress);
                    }

                    thisAction.IsComplete = true;
                    thisAction.IsActive = false;
                }
            }

            Logger.Instance.Write("Background worker terminated");
        };

        backgroundWorker.RunWorkerAsync();
    }

    // Poll until every action has completed, cancelling the workers if requested
    while (tmpActions.Any(action => !action.IsComplete))
    {
        await Task.Delay(1000);

        if (cancellationToken.IsCancellationRequested)
        {
            foreach (var bgWorker in bgWorkers)
            {
                bgWorker.CancelAsync();
            }

            break;
        }
    }

    // Dispose of all of our workers
    foreach (var b in bgWorkers)
    {
        b.Dispose();
    }

    // We're done here; update our State and report progress
    progress.State = DirectoryPatchPhaseProgress.States.Finished;
    progressCallback(progress);
}