public inbox for devel@edk2.groups.io
From: "Steven Shi" <steven.shi@intel.com>
To: "Lin, Derek (HPS SW)" <derek.lin2@hpe.com>,
	"devel@edk2.groups.io" <devel@edk2.groups.io>
Cc: "Feng, Bob C" <bob.c.feng@intel.com>,
	"Gao, Liming" <liming.gao@intel.com>
Subject: Re: BaseTools --hash malfunction after migrate from stable201905 to stable201908
Date: Tue, 24 Sep 2019 08:12:29 +0000	[thread overview]
Message-ID: <06C8AB66E78EE34A949939824ABE2B3140183293@shsmsx102.ccr.corp.intel.com> (raw)
In-Reply-To: TU4PR8401MB12453B59C967AF0DDAE16DD8C2880@TU4PR8401MB1245.NAMPRD84.PROD.OUTLOOK.COM


[-- Attachment #1.1: Type: text/plain, Size: 3177 bytes --]

Hi Derek,
I took a look at this issue and added back the --hash functionality in this branch: https://github.com/shijunjing/edk2/tree/hashcache_v1. The attachment is the patch, based on the latest edk2.
The current --hash fix does not perform as well as edk2-stable201905, because the edk2-stable201905 --hash did not parse the source header files or include them in the module's hash dependencies, which is not sound. After adding the header-file dependencies, --hash becomes slower.
Let me know whether the attached fix works for you. I may continue to tune its performance.
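The header-file dependency described above can be sketched roughly as follows. This is a minimal illustration of the idea, not the actual BaseTools code; the helper names and the in-memory "workspace" are hypothetical stand-ins for reading files from disk:

```python
import hashlib
import re

# Why adding header files to the hash dependency slows --hash down:
# every #include must now be discovered and its content read and hashed.
INCLUDE_RE = re.compile(r'#include\s+[<"]([^>"]+)[>"]')

def find_headers(source_text):
    """Return header names referenced by a source file's #include lines."""
    return INCLUDE_RE.findall(source_text)

def module_hash(source_files, workspace):
    """Hash a module's listed sources plus the headers they reference.

    `workspace` maps file name -> content; headers referenced but not
    present are skipped (stand-in for headers outside the workspace).
    """
    m = hashlib.md5()
    seen = set()
    for name in sorted(source_files):
        content = workspace[name]
        m.update(content.encode('utf-8'))
        for hdr in find_headers(content):
            if hdr in workspace and hdr not in seen:
                seen.add(hdr)
                m.update(workspace[hdr].encode('utf-8'))
    return m.hexdigest()

workspace = {
    'Driver.c': '#include "Driver.h"\nint main(void){return 0;}\n',
    'Driver.h': '#define REV 1\n',
}
h1 = module_hash(['Driver.c'], workspace)
workspace['Driver.h'] = '#define REV 2\n'   # edit only the header
h2 = module_hash(['Driver.c'], workspace)
print(h1 != h2)   # True: a header-only change now invalidates the hash
```

Without the include scan the two hashes would be equal, which is the unsound behavior the stable201905 --hash had.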


Thanks

Steven Shi
Intel\SSG\FID\Firmware Infrastructure

From: Shi, Steven
Sent: Monday, September 23, 2019 3:52 PM
To: 'Lin, Derek (HPS SW)' <derek.lin2@hpe.com>; devel@edk2.groups.io
Cc: Feng, Bob C <bob.c.feng@intel.com>; Gao, Liming (liming.gao@intel.com) <liming.gao@intel.com>
Subject: RE: BaseTools --hash malfunction after migrate from stable201905 to stable201908

Hi Derek,
Thank you for raising this issue. We will fix it and add back the original --hash functionality.


Thanks
Steven

From: Lin, Derek (HPS SW) [mailto:derek.lin2@hpe.com]
Sent: Friday, September 20, 2019 2:54 PM
To: devel@edk2.groups.io<mailto:devel@edk2.groups.io>
Cc: Feng, Bob C <bob.c.feng@intel.com<mailto:bob.c.feng@intel.com>>; Shi, Steven <steven.shi@intel.com<mailto:steven.shi@intel.com>>
Subject: BaseTools --hash malfunction after migrate from stable201905 to stable201908

Hi BaseTools experts,

We saw a clean-build performance improvement after updating from edk2-stable201905 to edk2-stable201908, which is promising.
However, we found that incremental build time increased by 25%~35%, which is not good.

We are building a server platform, and we use --hash to speed up incremental builds as described in https://github.com/BobCF/edk2/wiki/Incremental-Build
After upgrading to edk2-stable201908, no AutoGen step is skipped when I run an incremental build without any code change.

Digging into the issue, it is caused by commit https://github.com/tianocore/edk2/commit/0e7e7a264cd80ab71ea0f9e9da2d0617d4b539c4
From the code change logic, it seems to require both --hash and --binary-source to perform a cache-based incremental build. However, when we only have --hash in the build flags, it is not functional.
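The gating Derek describes can be sketched like this. The flag names follow BaseTools' GlobalData, but the functions are simplified illustrations of the observed behavior, not the real code:

```python
# stable201908 behavior as observed: the cache-skip path is reached only
# when a binary cache source is configured, so --hash alone never skips.
def can_skip_autogen_201908(use_hash_cache, bin_cache_source):
    return bool(bin_cache_source)

# Intended behavior: --hash alone should also allow skipping AutoGen.
def can_skip_autogen_intended(use_hash_cache, bin_cache_source):
    return bool(bin_cache_source) or bool(use_hash_cache)

# With only --hash set (gUseHashCache=True, gBinCacheSource unset):
print(can_skip_autogen_201908(True, None))    # False -- nothing is skipped
print(can_skip_autogen_intended(True, None))  # True
```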


Here's a build time log retrieved from -y report.log.

                  201905 incremental | 201908 incremental | 201905 clean | 201908 clean
Build Duration:   00:03:44           | 00:04:40           | 00:07:12     | 00:06:19
AutoGen Duration: 00:02:40           | 00:03:15           | 00:04:28     | 00:03:09
Make Duration:    00:00:36           | 00:00:39           | 00:01:21     | 00:01:34
GenFds Duration:  00:00:27           | 00:00:28           | 00:01:22     | 00:01:18

We can see that the AutoGen duration is almost unchanged between clean and incremental builds in 201908.
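The reported regression can be checked against the Build Duration row of the table above:

```python
# Convert the h:mm:ss durations and compute the incremental-build slowdown.
def to_seconds(hms):
    h, m, s = (int(x) for x in hms.split(':'))
    return h * 3600 + m * 60 + s

old = to_seconds('00:03:44')   # 201905 incremental: 224 s
new = to_seconds('00:04:40')   # 201908 incremental: 280 s
increase = (new - old) / old * 100
print(round(increase))         # 25 -- the low end of the reported 25%~35% range
```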

Could we fix this?

Thanks,
Derek


[-- Attachment #1.2: Type: text/html, Size: 10923 bytes --]

[-- Attachment #2: 0001-Fix-the-hash-functionality-for-increamental-build.patch --]
[-- Type: application/octet-stream, Size: 23859 bytes --]

From 24f30a6bfa2cc31eab58f8699e475a0329fd4659 Mon Sep 17 00:00:00 2001
From: Steven <steven.shi@intel.com>
Date: Tue, 24 Sep 2019 15:55:56 +0800
Subject: [PATCH] Fix the --hash functionality for incremental build

The current --hash option has no effect and cannot skip module builds
in an incremental build. This patch adds back the original
functionality.

Signed-off-by: Steven Shi <steven.shi@intel.com>
---
 .../Source/Python/AutoGen/AutoGenWorker.py    |   5 +
 BaseTools/Source/Python/AutoGen/CacheIR.py    |   3 +
 BaseTools/Source/Python/AutoGen/DataPipe.py   |   2 +
 .../Source/Python/AutoGen/ModuleAutoGen.py    | 114 ++++++++++--------
 BaseTools/Source/Python/Common/GlobalData.py  |   5 -
 BaseTools/Source/Python/build/build.py        |  81 ++++---------
 6 files changed, 98 insertions(+), 112 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index 94ea61a487..aec843a444 100755
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -184,6 +184,7 @@ class AutoGenWorkerInProcess(mp.Process):
             GlobalData.gDisableIncludePathCheck = False
             GlobalData.gFdfParser = self.data_pipe.Get("FdfParser")
             GlobalData.gDatabasePath = self.data_pipe.Get("DatabasePath")
+            GlobalData.gUseHashCache = self.data_pipe.Get("UseHashCache")
             GlobalData.gBinCacheSource = self.data_pipe.Get("BinCacheSource")
             GlobalData.gBinCacheDest = self.data_pipe.Get("BinCacheDest")
             GlobalData.gCacheIR = self.share_data
@@ -240,6 +241,10 @@ class AutoGenWorkerInProcess(mp.Process):
                     Ma.GenPreMakefileHash(GlobalData.gCacheIR)
                     if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
                         continue
+                elif GlobalData.gUseHashCache and CommandTarget in [None, "", "all"]:
+                    Ma.GenModuleIncrmtlHash(GlobalData.gCacheIR)
+                    if Ma.CanSkipbyIncrmtlCache(GlobalData.gCacheIR):
+                        continue
 
                 Ma.CreateCodeFile(False)
                 Ma.CreateMakeFile(False,GenFfsList=FfsCmd.get((Ma.MetaFile.Path, Ma.Arch),[]))
diff --git a/BaseTools/Source/Python/AutoGen/CacheIR.py b/BaseTools/Source/Python/AutoGen/CacheIR.py
index 715be5273c..71012aafbb 100755
--- a/BaseTools/Source/Python/AutoGen/CacheIR.py
+++ b/BaseTools/Source/Python/AutoGen/CacheIR.py
@@ -27,3 +27,6 @@ class ModuleBuildCacheIR():
         self.CacheCrash = False
         self.PreMakeCacheHit = False
         self.MakeCacheHit = False
+        # Hash Cache used for incremental build
+        self.IncrmtlCacheHit = False
+
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
index 8b8cfd1c51..4baeeb1d85 100755
--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -159,6 +159,8 @@ class MemoryDataPipe(DataPipe):
 
         self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
 
+        self.DataContainer = {"UseHashCache":GlobalData.gUseHashCache}
+
         self.DataContainer = {"BinCacheSource":GlobalData.gBinCacheSource}
 
         self.DataContainer = {"BinCacheDest":GlobalData.gBinCacheDest}
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index fad5bab0f2..bb650b61b4 100755
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -1731,6 +1731,12 @@ class ModuleAutoGen(AutoGen):
             for File in gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
                 CopyFileOnChange(str(File), CacheDebugDir)
 
+        for Root, Dirs, Files in os.walk(self.DebugDir):
+            for File in Files:
+                if File.lower().endswith('.dll'):
+                    NewFile = path.join(self.DebugDir, File)
+                    CopyFileOnChange(NewFile, CacheDebugDir)
+
         return True
 
     ## Create makefile for the module and its dependent libraries
@@ -1914,50 +1920,15 @@ class ModuleAutoGen(AutoGen):
                     self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE)
         return RetVal
 
-    def GenModuleHash(self):
-        # Initialize a dictionary for each arch type
-        if self.Arch not in GlobalData.gModuleHash:
-            GlobalData.gModuleHash[self.Arch] = {}
-
-        # Early exit if module or library has been hashed and is in memory
-        if self.Name in GlobalData.gModuleHash[self.Arch]:
-            return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
-
-        # Initialze hash object
-        m = hashlib.md5()
-
-        # Add Platform level hash
-        m.update(GlobalData.gPlatformHash.encode('utf-8'))
-
-        # Add Package level hash
-        if self.DependentPackageList:
-            for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
-                if Pkg.PackageName in GlobalData.gPackageHash:
-                    m.update(GlobalData.gPackageHash[Pkg.PackageName].encode('utf-8'))
-
-        # Add Library hash
-        if self.LibraryAutoGenList:
-            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
-                if Lib.Name not in GlobalData.gModuleHash[self.Arch]:
-                    Lib.GenModuleHash()
-                m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].encode('utf-8'))
-
-        # Add Module self
-        with open(str(self.MetaFile), 'rb') as f:
-            Content = f.read()
-        m.update(Content)
-
-        # Add Module's source files
-        if self.SourceFileList:
-            for File in sorted(self.SourceFileList, key=lambda x: str(x)):
-                f = open(str(File), 'rb')
-                Content = f.read()
-                f.close()
-                m.update(Content)
-
-        GlobalData.gModuleHash[self.Arch][self.Name] = m.hexdigest()
-
-        return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
+    def GenModuleIncrmtlHash(self, gDict):
+        self.GenModuleFilesHash(gDict)
+        self.GenPreMakefileHash(gDict)
+        if not (self.MetaFile.Path, self.Arch) in gDict or \
+           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHashHexDigest for module %s[%s]" %(self.MetaFile.Path, self.Arch))
+            return None
+        else:
+            return gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest.encode('utf-8')
 
     def GenModuleFilesHash(self, gDict):
         # Early exit if module or library has been hashed and is in memory
@@ -2077,8 +2048,6 @@ class ModuleAutoGen(AutoGen):
             for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
                 if (Pkg.PackageName, 'PackageHash') in gDict:
                     m.update(gDict[(Pkg.PackageName, 'PackageHash')].encode('utf-8'))
-                else:
-                    EdkLogger.quiet("[cache warning]: %s PackageHash needed by %s[%s] is missing" %(Pkg.PackageName, self.MetaFile.Name, self.Arch))
 
         # Add Library hash
         if self.LibraryAutoGenList:
@@ -2241,6 +2210,53 @@ class ModuleAutoGen(AutoGen):
 
         return gDict[(self.MetaFile.Path, self.Arch)]
 
+
+    ## Decide whether we can skip the left autogen and make process
+    def CanSkipbyIncrmtlCache(self, gDict):
+        if not GlobalData.gUseHashCache:
+            return False
+
+        # Disable incremental cache if binary cache is enabled
+        if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
+            return False
+
+        if not (self.MetaFile.Path, self.Arch) in gDict:
+            return False
+
+        if gDict[(self.MetaFile.Path, self.Arch)].IncrmtlCacheHit:
+            return True
+
+        if gDict[(self.MetaFile.Path, self.Arch)].CacheCrash:
+            return False
+
+        # If Module is binary, do not skip by cache
+        if self.IsBinaryModule:
+            return False
+
+        # .inc files contain binary information, so do not skip by hash either
+        for f_ext in self.SourceFileList:
+            if '.inc' in str(f_ext):
+                return False
+
+        # Early exit for libraries that haven't yet finished building
+        HashFile = path.join(self.BuildDir, self.Name + ".hash")
+        if not os.path.exists(HashFile):
+            return False
+
+        with open(HashFile, "rb") as f:
+            if self.GenModuleIncrmtlHash(gDict) == f.read():
+                IncrmtlCacheHit = True
+            else:
+                IncrmtlCacheHit = False
+
+        if IncrmtlCacheHit:
+            with GlobalData.cache_lock:
+                IR = gDict[(self.MetaFile.Path, self.Arch)]
+                IR.IncrmtlCacheHit = IncrmtlCacheHit
+                gDict[(self.MetaFile.Path, self.Arch)] = IR
+            print("[hash hit]:", self.MetaFile.Path, self.Arch)
+        return IncrmtlCacheHit
+
     ## Decide whether we can skip the left autogen and make process
     def CanSkipbyPreMakefileCache(self, gDict):
         if not GlobalData.gBinCacheSource:
@@ -2500,7 +2516,7 @@ class ModuleAutoGen(AutoGen):
     ## Decide whether we can skip the ModuleAutoGen process
     def CanSkipbyCache(self, gDict):
         # Hashing feature is off
-        if not GlobalData.gBinCacheSource:
+        if not GlobalData.gUseHashCache:
             return False
 
         if self in GlobalData.gBuildHashSkipTracking:
@@ -2520,6 +2536,10 @@ class ModuleAutoGen(AutoGen):
         if not (self.MetaFile.Path, self.Arch) in gDict:
             return False
 
+        if gDict[(self.MetaFile.Path, self.Arch)].IncrmtlCacheHit:
+            GlobalData.gBuildHashSkipTracking[self] = True
+            return True
+
         if gDict[(self.MetaFile.Path, self.Arch)].PreMakeCacheHit:
             GlobalData.gBuildHashSkipTracking[self] = True
             return True
diff --git a/BaseTools/Source/Python/Common/GlobalData.py b/BaseTools/Source/Python/Common/GlobalData.py
index 8eb72aa1d6..34f76e595b 100755
--- a/BaseTools/Source/Python/Common/GlobalData.py
+++ b/BaseTools/Source/Python/Common/GlobalData.py
@@ -109,11 +109,6 @@ gModuleHash = {}
 gEnableGenfdsMultiThread = True
 gSikpAutoGenCache = set()
 
-# Dictionary for tracking Module build status as success or failure
-# Top Dict:     Key: Arch Type              Value: Dictionary
-# Second Dict:  Key: AutoGen Obj    Value: 'SUCCESS'\'FAIL'\'FAIL_METAFILE'
-gModuleBuildTracking = dict()
-
 # Dictionary of booleans that dictate whether a module or
 # library can be skiped
 # Top Dict:     Key: Arch Type              Value: Dictionary
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index bcd832c525..ecd2dfa3b5 100755
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -612,9 +612,9 @@ class BuildTask:
             self.CompleteFlag = True
 
             # Run hash operation post dependency, to account for libs
-            if GlobalData.gUseHashCache and self.BuildItem.BuildObject.IsLibrary:
+            if GlobalData.gUseHashCache:
                 HashFile = path.join(self.BuildItem.BuildObject.BuildDir, self.BuildItem.BuildObject.Name + ".hash")
-                SaveFileOnChange(HashFile, self.BuildItem.BuildObject.GenModuleHash(), True)
+                SaveFileOnChange(HashFile, self.BuildItem.BuildObject.GenModuleIncrmtlHash(GlobalData.gCacheIR), True)
         except:
             #
             # TRICK: hide the output of threads left running, so that the user can
@@ -631,14 +631,6 @@ class BuildTask:
             BuildTask._ErrorMessage = "%s broken\n    %s [%s]" % \
                                       (threading.currentThread().getName(), Command, WorkingDir)
 
-        # Set the value used by hash invalidation flow in GlobalData.gModuleBuildTracking to 'SUCCESS'
-        # If Module or Lib is being tracked, it did not fail header check test, and built successfully
-        if (self.BuildItem.BuildObject in GlobalData.gModuleBuildTracking and
-           GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject] != 'FAIL_METAFILE' and
-           not BuildTask._ErrorFlag.isSet()
-           ):
-            GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject] = 'SUCCESS'
-
         # indicate there's a thread is available for another build task
         BuildTask._RunningQueueLock.acquire()
         BuildTask._RunningQueue.pop(self.BuildItem)
@@ -841,6 +833,10 @@ class Build():
                         PcdMa.GenPreMakefileHash(share_data)
                         if PcdMa.CanSkipbyPreMakefileCache(share_data):
                             continue
+                    elif GlobalData.gUseHashCache and self.Target in [None, "", "all"]:
+                        PcdMa.GenModuleIncrmtlHash(share_data)
+                        if PcdMa.CanSkipbyIncrmtlCache(share_data):
+                            continue
 
                     PcdMa.CreateCodeFile(False)
                     PcdMa.CreateMakeFile(False,GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.Path, PcdMa.Arch),[]))
@@ -1160,38 +1156,6 @@ class Build():
                 EdkLogger.error("Postbuild", POSTBUILD_ERROR, 'Postbuild process is not success!')
             EdkLogger.info("\n- Postbuild Done -\n")
 
-    ## Error handling for hash feature
-    #
-    # On BuildTask error, iterate through the Module Build tracking
-    # dictionary to determine wheather a module failed to build. Invalidate
-    # the hash associated with that module by removing it from storage.
-    #
-    #
-    def invalidateHash(self):
-        # Only for hashing feature
-        if not GlobalData.gUseHashCache:
-            return
-
-        # GlobalData.gModuleBuildTracking contains only modules or libs that cannot be skipped by hash
-        for Ma in GlobalData.gModuleBuildTracking:
-            # Skip invalidating for Successful Module/Lib builds
-            if GlobalData.gModuleBuildTracking[Ma] == 'SUCCESS':
-                continue
-
-            # The module failed to build, failed to start building, or failed the header check test from this point on
-
-            # Remove .hash from build
-            ModuleHashFile = os.path.join(Ma.BuildDir, Ma.Name + ".hash")
-            if os.path.exists(ModuleHashFile):
-                os.remove(ModuleHashFile)
-
-            # Remove .hash file from cache
-            if GlobalData.gBinCacheDest:
-                FileDir = os.path.join(GlobalData.gBinCacheDest, Ma.PlatformInfo.OutputDir, Ma.BuildTarget + "_" + Ma.ToolChain, Ma.Arch, Ma.SourceDir, Ma.MetaFile.BaseName)
-                HashFile = os.path.join(FileDir, Ma.Name + '.hash')
-                if os.path.exists(HashFile):
-                    os.remove(HashFile)
-
     ## Build a module or platform
     #
     # Create autogen code and makefile for a module or platform, and the launch
@@ -1836,7 +1800,7 @@ class Build():
                     CmdListDict = self._GenFfsCmd(Wa.ArchList)
 
                 # Add Platform and Package level hash in share_data for module hash calculation later
-                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
+                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest or GlobalData.gUseHashCache:
                     GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
                     for PkgName in GlobalData.gPackageHash.keys():
                         GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
@@ -1872,6 +1836,10 @@ class Build():
                                     self.HashSkipModules.append(Ma)
                                     EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
                                     continue
+                            elif GlobalData.gUseHashCache and self.Target in [None, "", "all"]:
+                                Ma.GenModuleIncrmtlHash(GlobalData.gCacheIR)
+                                if Ma.CanSkipbyIncrmtlCache(GlobalData.gCacheIR):
+                                    continue
 
                             # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
                             if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
@@ -1905,8 +1873,6 @@ class Build():
                                         Ma.PrintFirstMakeCacheMissFile(GlobalData.gCacheIR)
 
                             self.BuildModules.append(Ma)
-                            # Initialize all modules in tracking to 'FAIL'
-                            GlobalData.gModuleBuildTracking[Ma] = 'FAIL'
                     self.AutoGenTime += int(round((time.time() - AutoGenStart)))
                     MakeStart = time.time()
                     for Ma in self.BuildModules:
@@ -1917,7 +1883,6 @@ class Build():
                             # we need a full version of makefile for platform
                             ExitFlag.set()
                             BuildTask.WaitForComplete()
-                            self.invalidateHash()
                             Pa.CreateMakeFile(False)
                             EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                         # Start task scheduler
@@ -1927,7 +1892,6 @@ class Build():
                     # in case there's an interruption. we need a full version of makefile for platform
                     Pa.CreateMakeFile(False)
                     if BuildTask.HasError():
-                        self.invalidateHash()
                         EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                     self.MakeTime += int(round((time.time() - MakeStart)))
 
@@ -1940,7 +1904,6 @@ class Build():
                 self.BuildModules = []
                 self.MakeTime += int(round((time.time() - MakeContiue)))
                 if BuildTask.HasError():
-                    self.invalidateHash()
                     EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
 
                 self.BuildReport.AddPlatformReport(Wa, MaList)
@@ -1993,7 +1956,6 @@ class Build():
                     # Save MAP buffer into MAP file.
                     #
                     self._SaveMapFile (MapBuffer, Wa)
-        self.invalidateHash()
 
     def _GenFfsCmd(self,ArchList):
         # convert dictionary of Cmd:(Inf,Arch)
@@ -2105,7 +2067,7 @@ class Build():
             CmdListDict = self._GenFfsCmd(Wa.ArchList)
 
         # Add Platform and Package level hash in share_data for module hash calculation later
-        if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
+        if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest or GlobalData.gUseHashCache:
             GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
             for PkgName in GlobalData.gPackageHash.keys():
                 GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
@@ -2142,7 +2104,7 @@ class Build():
                 ModuleCodaFile[(ma.MetaFile.File,ma.MetaFile.Root,ma.Arch,ma.MetaFile.Path)] = [item.Target for item in ma.CodaTargetList]
             Pa.DataPipe.DataContainer = {"ModuleCodaFile":ModuleCodaFile}
             for Module in ModuleList:
-                        # Get ModuleAutoGen object to generate C code file and makefile
+                # Get ModuleAutoGen object to generate C code file and makefile
                 Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
 
                 if Ma is None:
@@ -2152,9 +2114,6 @@ class Build():
                     Ma.Workspace = Wa
                     PcdMaList.append(Ma)
                 TotalModules.append(Ma)
-                # Initialize all modules in tracking to 'FAIL'
-                GlobalData.gModuleBuildTracking[Ma] = 'FAIL'
-
 
             mqueue = mp.Queue()
             for m in Pa.GetAllModuleInfo:
@@ -2165,8 +2124,12 @@ class Build():
             autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,GlobalData.gCacheIR)
 
             # Skip cache hit modules
-            if GlobalData.gBinCacheSource:
+            if GlobalData.gUseHashCache:
                 for Ma in TotalModules:
+                    if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
+                        GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].IncrmtlCacheHit:
+                            self.HashSkipModules.append(Ma)
+                            continue
                     if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
                         GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].PreMakeCacheHit:
                             self.HashSkipModules.append(Ma)
@@ -2224,7 +2187,9 @@ class Build():
                     for IR in GlobalData.gCacheIR.keys():
                         if 'PlatformHash' in IR or 'PackageHash' in IR:
                             continue
-                        if GlobalData.gCacheIR[IR].PreMakeCacheHit or GlobalData.gCacheIR[IR].MakeCacheHit:
+                        if GlobalData.gCacheIR[IR].PreMakeCacheHit or \
+                           GlobalData.gCacheIR[IR].MakeCacheHit or \
+                           GlobalData.gCacheIR[IR].IncrmtlCacheHit:
                             CacheHitMa.add(IR)
                         else:
                             # There might be binary module or module which has .inc files, not count for cache miss
@@ -2242,7 +2207,6 @@ class Build():
                             # we need a full version of makefile for platform
                             ExitFlag.set()
                             BuildTask.WaitForComplete()
-                            self.invalidateHash()
                             Pa.CreateMakeFile(False)
                             EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                         # Start task scheduler
@@ -2252,7 +2216,6 @@ class Build():
                     # in case there's an interruption. we need a full version of makefile for platform
 
                     if BuildTask.HasError():
-                        self.invalidateHash()
                         EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                     self.MakeTime += int(round((time.time() - MakeStart)))
 
@@ -2274,7 +2237,6 @@ class Build():
                 # has been signaled.
                 #
                 if BuildTask.HasError():
-                    self.invalidateHash()
                     EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
 
                 # Create MAP file when Load Fix Address is enabled.
@@ -2315,7 +2277,6 @@ class Build():
                     #
                     self._SaveMapFile(MapBuffer, Wa)
                 self.CreateGuidedSectionToolsFile(Wa)
-        self.invalidateHash()
     ## Generate GuidedSectionTools.txt in the FV directories.
     #
     def CreateGuidedSectionToolsFile(self,Wa):
-- 
2.22.0.windows.1


  parent reply	other threads:[~2019-09-24  8:12 UTC|newest]

Thread overview: 6+ messages
2019-09-20  6:54 BaseTools --hash malfunction after migrate from stable201905 to stable201908 Lin, Derek (HPS SW)
2019-09-23  7:51 ` Steven Shi
2019-09-24  8:12 ` Steven Shi [this message]
2019-09-27  7:57   ` [edk2-devel] " Lin, Derek (HPS SW)
2019-09-29  2:31     ` Steven Shi
2019-10-21  5:27       ` Steven Shi
