/// <summary>
/// Removes a value for the given key.
/// Note: removal has a cost! For some reason this function also seems to get slower with every removal (why?).
/// </summary>
/// <param name="key">Key whose value should be removed.</param>
/// <param name="value">Value to remove.</param>
public void Remove(TKey key, TValue value)
{
    TValue data;
    NativeMultiHashMapIterator<TKey> iterator;
    if (nativeMultiHashMap.TryGetFirstValue(key, out data, out iterator))
    {
        do
        {
            if (data.Equals(value))
            {
                // Remove the entry.
                nativeMultiHashMap.Remove(iterator);
                var cnt = useKeyDict[key] - 1;
                if (cnt == 0)
                {
                    useKeyDict.Remove(key);
                }
                else
                {
                    // Write the decremented count back.
                    useKeyDict[key] = cnt;
                }
                break;
            }
        } while (nativeMultiHashMap.TryGetNextValue(out data, ref iterator));
    }
    nativeLength = NativeCount;
}
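The remove-by-value pattern above (walk the values stored under one key, delete the first match, drop the key when its count hits zero) is not specific to Unity's NativeMultiHashMap. As a hedged cross-language sketch, the same idea over a plain dict-of-lists multimap in Python (function and variable names here are made up for illustration):

```python
def multimap_remove(multimap, key, value):
    """Remove one occurrence of (key, value); drop the key when its list empties."""
    values = multimap.get(key)
    if values is None:
        return False
    try:
        values.remove(value)  # removes the first matching occurrence only
    except ValueError:
        return False  # value not present under this key
    if not values:
        del multimap[key]  # mirrors useKeyDict.Remove(key) when the count reaches zero
    return True

m = {10: [0, 1, 2], 20: [2, 1]}
multimap_remove(m, 10, 1)  # m becomes {10: [0, 2], 20: [2, 1]}
```

Like the C# version, this removes only the first matching occurrence and leaves any duplicates under the same key untouched.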
public void NativeMultiHashMap_RemoveKeyAndValue()
{
    var hashMap = new NativeMultiHashMap<int, long>(1, Allocator.Temp);

    hashMap.Add(10, 0);
    hashMap.Add(10, 1);
    hashMap.Add(10, 2);

    hashMap.Add(20, 2);
    hashMap.Add(20, 2);
    hashMap.Add(20, 1);
    hashMap.Add(20, 2);
    hashMap.Add(20, 1);

    hashMap.Remove(10, 1L);
    ExpectValues(hashMap, 10, new[] { 0L, 2L });
    ExpectValues(hashMap, 20, new[] { 1L, 1L, 2L, 2L, 2L });

    hashMap.Remove(20, 2L);
    ExpectValues(hashMap, 10, new[] { 0L, 2L });
    ExpectValues(hashMap, 20, new[] { 1L, 1L });

    hashMap.Remove(20, 1L);
    ExpectValues(hashMap, 10, new[] { 0L, 2L });
    ExpectValues(hashMap, 20, new long[0]);

    hashMap.Dispose();
}
internal void RemoveState(TStateKey stateKey)
{
    var predecessorQueue = new NativeQueue<TStateKey>(Allocator.Temp);

    // State Info
    StateInfoLookup.Remove(stateKey);

    // Actions
    if (ActionLookup.TryGetFirstValue(stateKey, out var actionKey, out var actionIterator))
    {
        do
        {
            var stateActionPair = new StateActionPair<TStateKey, TActionKey>(stateKey, actionKey);

            // Action Info
            ActionInfoLookup.Remove(stateActionPair);

            // Results
            if (ResultingStateLookup.TryGetFirstValue(stateActionPair, out var resultingStateKey, out var resultIterator))
            {
                do
                {
                    // Remove Predecessor Link
                    if (PredecessorGraph.TryGetFirstValue(resultingStateKey, out var predecessorKey, out var predecessorIterator))
                    {
                        predecessorQueue.Clear();

                        do
                        {
                            if (!stateKey.Equals(predecessorKey))
                            {
                                predecessorQueue.Enqueue(predecessorKey);
                            }
                        } while (PredecessorGraph.TryGetNextValue(out predecessorKey, ref predecessorIterator));

                        // Reset Predecessors
                        PredecessorGraph.Remove(resultingStateKey);

                        // Requeue Predecessors
                        while (predecessorQueue.TryDequeue(out var queuedPredecessorKey))
                        {
                            PredecessorGraph.Add(resultingStateKey, queuedPredecessorKey);
                        }
                    }

                    // Action Result Info
                    StateTransitionInfoLookup.Remove(new StateTransition<TStateKey, TActionKey>(stateKey, stateActionPair.ActionKey, resultingStateKey));
                } while (ResultingStateLookup.TryGetNextValue(out resultingStateKey, ref resultIterator));

                ResultingStateLookup.Remove(stateActionPair);
            }
        } while (ActionLookup.TryGetNextValue(out actionKey, ref actionIterator));

        ActionLookup.Remove(stateKey);
    }

    // Predecessors
    PredecessorGraph.Remove(stateKey);

    predecessorQueue.Dispose();
}
public void NativeMultiHashMap_RemoveOnEmptyMap_DoesNotThrow()
{
    var hashMap = new NativeMultiHashMap<int, int>(0, Allocator.Temp);

    Assert.DoesNotThrow(() => hashMap.Remove(0));
    Assert.DoesNotThrow(() => hashMap.Remove(-425196));
    Assert.DoesNotThrow(() => hashMap.Remove(0, 0));
    Assert.DoesNotThrow(() => hashMap.Remove(-425196, 0));

    hashMap.Dispose();
}
public void NativeMultiHashMap_RemoveKeyValueDoesntDeallocate()
{
    var hashMap = new NativeMultiHashMap<int, int>(1, Allocator.Temp);
    hashMap.Add(5, 1);
    hashMap.Remove(5, 5);

    GCAllocRecorder.ValidateNoGCAllocs(() =>
    {
        hashMap.Remove(5, 1);
    });

    hashMap.Dispose();
}
private void RemoveFew(out int newSize)
{
    Assert.IsTrue(_keySize > 1);

    _nativeMultiHashMap.Remove(_keyData[0]);
    _checkingResults.Remove(_keyData[0]);
    _nativeMultiHashMap.Remove(_keyData[_keySize - 1]);
    _checkingResults.Remove(_keyData[_keySize - 1]);

    newSize = (_keySize - 2) * _valueSize;
    Assert.AreEqual(newSize, _nativeMultiHashMap.Length);
    Assert.AreEqual(newSize, Length(_checkingResults));
}
/// <summary>
/// Destroys all entities described in the <see cref="EntityChangeSet"/>.
/// </summary>
/// <remarks>
/// Since rebuilding the <see cref="NativeMultiHashMap{TEntityGuidComponent, Entity}"/> for the entire world is expensive,
/// this method incrementally updates the map based on the destroyed entities.
/// </remarks>
private static unsafe void ApplyDestroyEntities(
    EntityManager entityManager,
    EntityChangeSet changeSet,
    NativeMultiHashMap<int, Entity> packedEntities,
    NativeMultiHashMap<EntityGuid, Entity> entityGuidToEntity)
{
    for (var i = changeSet.Entities.Length - changeSet.DestroyedEntityCount; i < changeSet.Entities.Length; i++)
    {
        if (!packedEntities.TryGetFirstValue(i, out var entity, out var iterator))
        {
            continue;
        }

        do
        {
            // Perform incremental updates on the entityGuidToEntity map to avoid a full rebuild.
            // @NOTE We do NOT remove from the `entityToEntityGuid` map here since the LinkedEntityGroup removal will need it to map back groups.
            entityGuidToEntity.Remove(changeSet.Entities[i], entity);

            if (entityManager.EntityComponentStore->Exists(entity))
            {
                entityManager.DestroyEntity(entity);
            }
            else
            {
                Debug.LogWarning($"DestroyEntity({entity}) but it does not exist.");
            }
        } while (packedEntities.TryGetNextValue(out entity, ref iterator));
    }
}
public static NativeArray<CollisionTriggerData>? CollectCollisionTriggerData(int frame, ref NativeMultiHashMap<Entity, CollisionTriggerData> nativeMultiHashMap, Entity e)
{
    var count = nativeMultiHashMap.CountValuesForKey(e);
    NativeArray<CollisionTriggerData> collisions = new NativeArray<CollisionTriggerData>(count, Allocator.Temp);
    int idx = 0;

    if (nativeMultiHashMap.TryGetFirstValue(e, out var collInfo1, out var it))
    {
        do
        {
            if (collInfo1.Frame < frame)
            {
                collInfo1.State = CollisionState.Exit;
                nativeMultiHashMap.Remove(it);
            }
            else if (collInfo1.Frame == frame)
            {
                // MBRIAU: What are we trying to clean up here?
                collInfo1.State = collInfo1.State == CollisionState.Stay ? CollisionState.Stay : CollisionState.Enter;
            }
            collisions[idx++] = collInfo1;
        } while (nativeMultiHashMap.TryGetNextValue(out collInfo1, ref it));
    }

    return collisions;
}
public void NativeMultiHashMap_RemoveKeyValueThrowsInvalidParam()
{
    var hashMap = new NativeMultiHashMap<int, long>(1, Allocator.Temp);
    Assert.Throws<ArgumentException>(() => hashMap.Remove(5, 5));
    hashMap.Dispose();
}
public void Execute()
{
    // Gather selected states together with predecessor input states
    if (SelectedStatesByHorizon.TryGetFirstValue(Horizon, out var stateKey, out var iterator))
    {
        do
        {
            PredecessorInputStates.TryAdd(stateKey, default);
        } while (SelectedStatesByHorizon.TryGetNextValue(out stateKey, ref iterator));
    }

    // Write to output list
    OutputStates.Clear();
    var keys = PredecessorInputStates.GetKeyArray(Allocator.Temp);
    for (int i = 0; i < keys.Length; i++)
    {
        OutputStates.Add(keys[i]);
    }
    keys.Dispose();

    // Clear out containers
    PredecessorInputStates.Clear();
    SelectedStatesByHorizon.Remove(Horizon);
}
internal void Prune()
{
    var minimumReachableDepthMap = new NativeHashMap<TStateKey, int>(PlanGraph.Size, Allocator.Temp);
    using (var queue = new NativeQueue<StateHorizonPair<TStateKey>>(Allocator.Temp))
    {
        PlanGraph.GetReachableDepthMap(RootStateKey, minimumReachableDepthMap, queue);
    }

    var stateKeyArray = PlanGraph.StateInfoLookup.GetKeyArray(Allocator.Temp);
    foreach (var stateKey in stateKeyArray)
    {
        if (!minimumReachableDepthMap.TryGetValue(stateKey, out _))
        {
            // Graph containers
            PlanGraph.RemoveState(stateKey);
            StateDepthLookup.Remove(stateKey);
            BinnedStateKeyLookup.Remove(stateKey.GetHashCode(), stateKey);

            // State data for the key
            m_StateManager.DestroyState(stateKey);
        }
    }

    stateKeyArray.Dispose();
    minimumReachableDepthMap.Dispose();
}
public bool Remove(int id)
{
    if (!IdIndex.TryGetValue(id, out MeshSource source))
    {
        return false;
    }

    NativeList<int2> tileCoords = NativeBuildUtitls.GetOverlappingTiles(BuildSettings, source.Info.Bounds);
    for (int i = 0; i < tileCoords.Length; i++)
    {
        int2 coord = tileCoords[i];
        int sourceId;
        NativeMultiHashMapIterator<int2> iterator;
        bool found = TileSources.TryGetFirstValue(coord, out sourceId, out iterator);
        while (found && sourceId == id)
        {
            TileSources.Remove(iterator);
            found = TileSources.TryGetNextValue(out sourceId, ref iterator);
        }
    }

    IdIndex.Remove(source.Id);
    CustomIdToIdIndex.Remove(source.Info.CustomData);
    source.Dispose();
    return true;
}
public void NativeMultiHashMap_ForEach_Throws_When_Modified()
{
    using (var container = new NativeMultiHashMap<int, int>(32, Allocator.TempJob))
    {
        for (int i = 0; i < 30; ++i)
        {
            container.Add(i, 30 + i);
            container.Add(i, 60 + i);
        }

        Assert.Throws<ObjectDisposedException>(() =>
        {
            foreach (var kv in container)
            {
                container.Add(10, 10);
            }
        });

        Assert.Throws<ObjectDisposedException>(() =>
        {
            foreach (var kv in container)
            {
                container.Remove(1);
            }
        });
    }
}
public void NativeMultiHashMap_ForEach_Throws_When_Modified()
{
    using (var container = new NativeMultiHashMap<int, int>(32, Allocator.TempJob))
    {
        for (int i = 0; i < 30; ++i)
        {
            container.Add(i, 30 + i);
            container.Add(i, 60 + i);
        }

#if UNITY_2020_2_OR_NEWER
        Assert.Throws<ObjectDisposedException>(() =>
#else
        Assert.Throws<InvalidOperationException>(() =>
#endif
        {
            foreach (var kv in container)
            {
                container.Add(10, 10);
            }
        });

#if UNITY_2020_2_OR_NEWER
        Assert.Throws<ObjectDisposedException>(() =>
#else
        Assert.Throws<InvalidOperationException>(() =>
#endif
        {
            foreach (var kv in container)
            {
                container.Remove(1);
            }
        });
    }
}
internal static void UnregisterSubScene(Scene gameObjectScene, SubScene subScene)
{
    if (!SubSceneLookup.IsCreated)
    {
        return;
    }

    SubSceneLookup.Remove(gameObjectScene.handle, new SubSceneData(subScene.SceneGUID, subScene.AutoLoadScene));
}
public static bool Remove<K, T>(this NativeMultiHashMap<K, T> HashMap, K Key, T Value)
    where T : struct, IEquatable<T>
    where K : struct, IEquatable<K>
{
    if (HashMap.SelectIterator(Key, Value, out var It))
    {
        HashMap.Remove(It);
        return true;
    }
    return false;
}
public static void UpdateEntityRegion(Entity _entity, NativeMultiHashMap<int, Entity> _regionQuadrant, int _currentHashRegion, float3 _newPosition, int _mapWidth)
{
    int _newHashRegion = Mathf.RoundToInt(_newPosition.x * _mapWidth + _newPosition.y);
    if (_currentHashRegion != _newHashRegion)
    {
        _regionQuadrant.Remove(_currentHashRegion, _entity);
        _currentHashRegion = _newHashRegion;
        _regionQuadrant.Add(_currentHashRegion, _entity);
    }
}
public void RemoveFoodTypeFromZone(int zone, int foodIndex, DataLocation location)
{
    if (organismsByFoodTypeInZones.TryGetFirstValue(new int2(zone, foodIndex), out DataLocation value, out var iterator))
    {
        do
        {
            if (value.dataType == location.dataType && value.dataIndex == location.dataIndex)
            {
                organismsByFoodTypeInZones.Remove(iterator);
                return;
            }
        } while (organismsByFoodTypeInZones.TryGetNextValue(out value, ref iterator));
    }
}
public void NativeHashMap_RemoveFromMultiHashMap()
{
    var hashMap = new NativeMultiHashMap<int, int>(16, Allocator.Temp);
    int iSquared;

    // Make sure inserting values works
    for (int i = 0; i < 8; ++i)
    {
        hashMap.Add(i, i * i);
    }
    for (int i = 0; i < 8; ++i)
    {
        hashMap.Add(i, i);
    }
    Assert.AreEqual(16, hashMap.Capacity, "HashMap grew larger than expected");

    // Make sure reading the inserted values works
    for (int i = 0; i < 8; ++i)
    {
        NativeMultiHashMapIterator<int> it;
        Assert.IsTrue(hashMap.TryGetFirstValue(i, out iSquared, out it), "Failed to get value from hash table");
        Assert.AreEqual(iSquared, i, "Got the wrong value from the hash table");
        Assert.IsTrue(hashMap.TryGetNextValue(out iSquared, ref it), "Failed to get value from hash table");
        Assert.AreEqual(iSquared, i * i, "Got the wrong value from the hash table");
    }

    for (int rm = 0; rm < 8; ++rm)
    {
        Assert.AreEqual(2, hashMap.Remove(rm));
        NativeMultiHashMapIterator<int> it;
        Assert.IsFalse(hashMap.TryGetFirstValue(rm, out iSquared, out it), "Failed to remove value from hash table");
        for (int i = rm + 1; i < 8; ++i)
        {
            Assert.IsTrue(hashMap.TryGetFirstValue(i, out iSquared, out it), "Failed to get value from hash table");
            Assert.AreEqual(iSquared, i, "Got the wrong value from the hash table");
            Assert.IsTrue(hashMap.TryGetNextValue(out iSquared, ref it), "Failed to get value from hash table");
            Assert.AreEqual(iSquared, i * i, "Got the wrong value from the hash table");
        }
    }

    // Make sure entries were freed
    for (int i = 0; i < 8; ++i)
    {
        hashMap.Add(i, i * i);
    }
    for (int i = 0; i < 8; ++i)
    {
        hashMap.Add(i, i);
    }
    Assert.AreEqual(16, hashMap.Capacity, "HashMap grew larger than expected");

    hashMap.Dispose();
}
public void RemoveEdge(Node child, Node parent)
{
    var found = _nodeToParent.TryGetFirstValue(child, out var edge, out var it);
    while (found)
    {
        if (edge.Parent.Equals(parent))
        {
            _nodeToParent.Remove(it);
            return;
        }
        found = _nodeToParent.TryGetNextValue(out edge, ref it);
    }
}
public void RemoveReference(int index, int numRefs = 1)
{
    if (index == 0)
    {
        return;
    }

    var newCount = m_SharedComponentRefCount[index] -= numRefs;
    Assert.IsTrue(newCount >= 0);
    if (newCount != 0)
    {
        return;
    }

    var typeIndex = m_SharedComponentType[index];
    var hashCode = GetHashCodeFast(m_SharedComponentData[index], typeIndex);

    object sharedComponent = m_SharedComponentData[index];
    (sharedComponent as IDisposable)?.Dispose();

    m_SharedComponentData[index] = null;
    m_SharedComponentType[index] = -1;
    m_SharedComponentVersion[index] = m_FreeListIndex;
    m_FreeListIndex = index;

    int itemIndex;
    NativeMultiHashMapIterator<int> iter;
    if (!m_HashLookup.TryGetFirstValue(hashCode, out itemIndex, out iter))
    {
#if ENABLE_UNITY_COLLECTIONS_CHECKS
        throw new System.ArgumentException("RemoveReference didn't find element in hashtable");
#endif
    }

    do
    {
        if (itemIndex == index)
        {
            m_HashLookup.Remove(iter);
            break;
        }
    } while (m_HashLookup.TryGetNextValue(out itemIndex, ref iter));
}
/// <summary>
/// Used when the target bone needs to be removed. Still incomplete; do not use yet.
/// </summary>
/// <param name="target">The target bone to remove</param>
public void RemoveBone(DynamicBone target)
{
    int index = boneList.IndexOf(target);
    if (index == -1)
    {
        return;
    }

    // TODO: bone-removal logic
    boneList.RemoveAt(index);
    int curHeadIndex = target.HeadInfo.Index;

    // Remove the collider relationships associated with this bone
    boneColliderMatchMap.Remove(curHeadIndex);

    // Is this the last object in the list?
    bool isEndTarget = curHeadIndex == headInfoList.Length - 1;
    if (isEndTarget)
    {
        headInfoList.RemoveAtSwapBack(curHeadIndex);
        headTransformAccessArray.RemoveAtSwapBack(curHeadIndex);
        for (int i = MaxParticleLimit - 1; i >= 0; i--)
        {
            int dataOffset = curHeadIndex * MaxParticleLimit + i;
            particleInfoList.RemoveAtSwapBack(dataOffset);
            particleTransformAccessArray.RemoveAtSwapBack(dataOffset);
        }
    }
    else
    {
        // Point the last HeadInfo's index and data offset at the HeadInfo being removed
        DynamicBone lastTarget = boneList[boneList.Count - 1];
        HeadInfo lastHeadInfo = lastTarget.ResetHeadIndexAndDataOffset(curHeadIndex);
        headInfoList.RemoveAtSwapBack(curHeadIndex);
        headInfoList[curHeadIndex] = lastHeadInfo;
        headTransformAccessArray.RemoveAtSwapBack(curHeadIndex);
        for (int i = MaxParticleLimit - 1; i >= 0; i--)
        {
            int dataOffset = curHeadIndex * MaxParticleLimit + i;
            particleInfoList.RemoveAtSwapBack(dataOffset);
            particleTransformAccessArray.RemoveAtSwapBack(dataOffset);
        }
    }

    target.ClearJobData();
}
public void RemoveReference(int index, int numRefs = 1)
{
    if (index == 0)
    {
        return;
    }

    var newCount = m_SharedComponentRefCount[index] -= numRefs;
    Assert.IsTrue(newCount >= 0);
    if (newCount != 0)
    {
        return;
    }

    var typeIndex = m_SharedComponentType[index];
    var hashCode = TypeManager.GetHashCode(m_SharedComponentData[index], typeIndex);

    object sharedComponent = m_SharedComponentData[index];
    (sharedComponent as IRefCounted)?.Release();

    m_SharedComponentData[index] = null;
    m_SharedComponentType[index] = -1;
    m_SharedComponentVersion[index] = m_FreeListIndex;
    m_FreeListIndex = index;

    int itemIndex;
    NativeMultiHashMapIterator<int> iter;
    if (m_HashLookup.TryGetFirstValue(hashCode, out itemIndex, out iter))
    {
        do
        {
            if (itemIndex == index)
            {
                m_HashLookup.Remove(iter);
                return;
            }
        } while (m_HashLookup.TryGetNextValue(out itemIndex, ref iter));
    }

    ThrowIndeterministicHash(sharedComponent);
}
public void RemoveReference(int index)
{
    if (index == 0)
    {
        return;
    }

    var newCount = --m_SharedComponentRefCount[index];
    Assert.IsTrue(newCount >= 0);
    if (newCount != 0)
    {
        return;
    }

    var typeIndex = m_SharedComponentType[index];
    var fastLayout = TypeManager.GetComponentType(typeIndex).FastEqualityLayout;
    var hashCode = GetHashCodeFast(m_SharedComponentData[index], fastLayout);

    m_SharedComponentData[index] = null;
    m_SharedComponentType[index] = -1;
    m_SharedComponentVersion[index] = m_FreeListIndex;
    m_FreeListIndex = index;

    int itemIndex;
    NativeMultiHashMapIterator<int> iter;
    if (!m_HashLookup.TryGetFirstValue(hashCode, out itemIndex, out iter))
    {
#if ENABLE_UNITY_COLLECTIONS_CHECKS
        throw new System.ArgumentException("RemoveReference didn't find element in hashtable");
#endif
    }

    do
    {
        if (itemIndex == index)
        {
            m_HashLookup.Remove(iter);
            break;
        }
    } while (m_HashLookup.TryGetNextValue(out itemIndex, ref iter));
}
private void UpdateFriends(int currentMobIndex)
{
    // _centerMassJobHandle.Complete();
    mobFriends.Remove(currentMobIndex);
    // _friendDictionary[currentMobIndex].Clear();
    for (int otherMobIndex = 0; otherMobIndex < mobCount; otherMobIndex++)
    {
        float dist = Vector2.Distance(allMobData[currentMobIndex].Position, allMobData[otherMobIndex].Position);
        if (dist < findFriendRadius)
        {
            mobFriends.Add(currentMobIndex, allMobData[otherMobIndex]);
            // _friendDictionary[currentMobIndex].Add(allMobData[otherMobIndex]);
        }
    }
    // print($"Updated mob friends for: {currentMobIndex}");
}
public void Execute()
{
    var space = Completed.Count * 20; // 20 is an approximate queue length
    while (Completed.Count > 0)
    {
        var idx = Completed.Dequeue();
        if (AllQueues.TryGetFirstValue(idx, out _, out _))
        {
            AllQueues.Remove(idx);
        }
    }

    while (AllQueues.Capacity - AllQueues.Length < space)
    {
        AllQueues.Capacity *= 2;
    }
}
public void NativeMultiHashMap_IsEmpty()
{
    var container = new NativeMultiHashMap<int, int>(0, Allocator.Persistent);
    Assert.IsTrue(container.IsEmpty);

    container.Add(0, 0);
    Assert.IsFalse(container.IsEmpty);
    Assert.AreEqual(1, container.Capacity);
    ExpectedCount(ref container, 1);

    container.Remove(0, 0);
    Assert.IsTrue(container.IsEmpty);

    container.Add(0, 0);
    container.Clear();
    Assert.IsTrue(container.IsEmpty);

    container.Dispose();
}
public void RemoveReference(int index)
{
    if (index == 0)
    {
        return;
    }

    var newCount = --m_SharedComponentRefCount[index];
    if (newCount != 0)
    {
        return;
    }

    var typeIndex = m_SharedComponentType[index];
    var fastLayout = TypeManager.GetComponentType(typeIndex).FastEqualityLayout;
    var hashCode = GetHashCodeFast(m_SharedComponentData[index], fastLayout);

    m_SharedComponentData[index] = null;
    m_SharedComponentType[index] = -1;

    int itemIndex;
    NativeMultiHashMapIterator<int> iter;
    if (!m_HashLookup.TryGetFirstValue(hashCode, out itemIndex, out iter))
    {
        return;
    }

    do
    {
        if (itemIndex != index)
        {
            continue;
        }
        m_HashLookup.Remove(iter);
        break;
    } while (m_HashLookup.TryGetNextValue(out itemIndex, ref iter));
}
public unsafe bool HandleClientDisconnect(Room room, int connectionId, bool serverDisconnect = false)
{
    room.ValidityCheck();
    if (room.Server.ConnectionId == connectionId)
    {
        // The server just disconnected; tell all the clients in the room
        foreach (Client client in connectedClients.GetValuesForKey(room.RoomId))
        {
            // Disconnect the client
            connections[client.ConnectionId].Disconnect(driver);
            connections[client.ConnectionId] = default(NetworkConnection);
        }

        NativeArray<FixedString32> ServerAddressToRoomKeys = serverAddressToRoomID.GetKeyArray(Allocator.Temp);
        foreach (FixedString32 key in ServerAddressToRoomKeys)
        {
            if (serverAddressToRoomID[key].Equals(room))
            {
                // Remove ourselves from the reverse lookup table
                serverAddressToRoomID.Remove(key);
                break;
            }
        }
        ServerAddressToRoomKeys.Dispose();

        // Delete the room
        for (int i = 0; i < rooms.Length; i++)
        {
            if (rooms[i].RoomId == room.RoomId)
            {
                rooms.RemoveAtSwapBack(i);
                break;
            }
        }
        connectedClients.Remove(room.RoomId);

        // Release the roomId since the room should be considered useless
        ReleasedRooms.AddNoResize(room);
        room.IsNotValid = true;
        return true;
    }
    else if (ContainsConnectedID(room.RoomId, connectionId))
    {
        // A client is attempting to disconnect
        Client client = default(Client);
        foreach (Client c in connectedClients.GetValuesForKey(room.RoomId))
        {
            if (c.ConnectionId == connectionId)
            {
                client = c;
            }
        }

        if (serverDisconnect)
        {
            // The server requested this disconnect. Just throw them out.
            connections[connectionId].Disconnect(driver);
        }
        else
        {
            // The client was the one that disconnected. Notify the server!
            DataStreamWriter writer = driver.BeginSend(pipeline, connections[connectionId]);

            // Write the message type
            writer.WriteByte((byte)MessageType.ClientDisconnect);

            // Write the connectionId of the client that disconnected
            writer.WriteInt(connectionId);

            // Send the message to the server
            driver.EndSend(writer);
        }

        // Mark the disconnected client as removed from the list
        client.IsDisconnect = true;
        return true;
    }

    return false;
}
internal void UpdateBlobAssetForGameObject<TB>(int ownerId, NativeArray<Hash128> newBlobHashes) where TB : struct
{
    var leftLength = newBlobHashes.Length;
    var toInc = new NativeArray<Hash128>(leftLength, Allocator.Temp);
    var toDec = new NativeArray<Hash128>(m_HashByOwner.CountValuesForKey(ownerId), Allocator.Temp);

    var curLeftIndex = 0;
    var curIncIndex = 0;
    var curDecIndex = 0;

    var leftRes = curLeftIndex < leftLength;
    var rightRes = m_HashByOwner.TryGetFirstValue(ownerId, out var rightHash, out var it);
    var maxHash = new Hash128(UInt32.MaxValue, UInt32.MaxValue, UInt32.MaxValue, UInt32.MaxValue);

    // We will parse newBlobHashes, considered the left part, and the stored hashes for this ownerId, considered the right part,
    // in order to build a list of BlobAssets to increment (the ones only present in the left part) and the ones to decrement
    // (only present in the right part). If a hash is present on both sides, we do not change its RefCounter.
    do
    {
        var leftHash = leftRes ? newBlobHashes[curLeftIndex] : maxHash;
        rightHash = rightRes ? rightHash : maxHash;

        // Both sides equal? We are synchronized; step to the next item on both sides
        if (rightHash == leftHash)
        {
            leftRes = ++curLeftIndex < leftLength;
            rightRes = m_HashByOwner.TryGetNextValue(out rightHash, ref it);
            continue;
        }

        // More items on the left: add them to the "toInc" list
        if (leftHash < rightHash)
        {
            do
            {
                // Get the left hash
                leftHash = newBlobHashes[curLeftIndex++];

                // Add to "toInc"
                toInc[curIncIndex++] = leftHash;

                // Check if there are more left items
                leftRes = curLeftIndex < leftLength;
            } while (leftRes && (leftHash < rightHash));
        }
        // More items on the right: add them to the "toDec" list
        else
        {
            do
            {
                // Add to "toDec"
                toDec[curDecIndex++] = rightHash;

                // Get the next right item
                rightRes = m_HashByOwner.TryGetNextValue(out rightHash, ref it);
            } while (rightRes && leftHash > rightHash);
        }
    } while (leftRes || rightRes);

    // Increment each hash in "toInc" if it already exists; add it to the RefCounter map if it is new
    for (int i = 0; i < curIncIndex; i++)
    {
        var hash = toInc[i];
        if (m_RefCounterPerBlobHash.TryGetValue(hash, out var counter))
        {
            m_RefCounterPerBlobHash[hash] = counter + 1;
        }
        else
        {
            m_RefCounterPerBlobHash.Add(hash, 1);
        }
    }

    // Decrement each hash in "toDec"; remove the BlobAsset if its count reaches 0
    for (int i = 0; i < curDecIndex; i++)
    {
        // Decrement the count of the previously assigned Blob Asset
        var hash = toDec[i];
        var oldHashCount = --m_RefCounterPerBlobHash[hash];

        // If it reaches 0, we dispose the Blob Asset and remove the counter
        if (oldHashCount == 0)
        {
            Remove<TB>(hash, true);
            m_RefCounterPerBlobHash.Remove(hash);
        }
    }

    // Clear the former list of BlobAsset hashes and replace it with the new one
    m_HashByOwner.Remove(ownerId);
    for (int i = 0; i < leftLength; i++)
    {
        m_HashByOwner.Add(ownerId, newBlobHashes[i]);
    }
}
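The core of UpdateBlobAssetForGameObject is a two-pointer diff: walk the new hash list ("left") and the stored hash list ("right") in parallel, collecting hashes to increment and hashes to decrement while skipping hashes present on both sides. As a hedged cross-language sketch (not the Unity code itself), the same merge over two plain sorted lists in Python:

```python
def diff_sorted(old, new):
    """Return (to_inc, to_dec): items only in `new` and items only in `old`.
    Both inputs must be sorted; duplicates are matched pairwise."""
    to_inc, to_dec = [], []
    i = j = 0
    while i < len(old) or j < len(new):
        if i < len(old) and j < len(new) and old[i] == new[j]:
            # Present on both sides: the refcount is unchanged
            i += 1
            j += 1
        elif j < len(new) and (i == len(old) or new[j] < old[i]):
            # Only in the new set: increment its refcount
            to_inc.append(new[j])
            j += 1
        else:
            # Only in the old set: decrement its refcount
            to_dec.append(old[i])
            i += 1
    return to_inc, to_dec
```

The original replaces exhausted sides with a sentinel maxHash instead of the explicit bounds checks used here; both formulations rely on the two sequences being visited in the same sorted order.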