* [Patch 00/11 V4] Enable multiple process AutoGen
@ 2019-07-29 8:44 Bob Feng
2019-07-29 8:44 ` [Patch 01/11] BaseTools: Singleton the object to handle build conf file Bob Feng
` (11 more replies)
0 siblings, 12 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
To improve build performance, we implemented multiple-process
AutoGen. This change reduces the time of the AutoGen phase by
about 20%.
The design document is available at:
https://edk2.groups.io/g/devel/files/Designs/2019/0627/Multiple-thread-AutoGen.pdf
This patch series passes the build of Ovmf, MinKabylake, MinPurley,
the packages under the Edk2 repository, and Intel client and server
platforms.
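As a rough illustration of the worker-pool approach (the real implementation lives in AutoGenWorker.py and DataPipe.py; the function names and the per-module work below are hypothetical stand-ins, not the actual BaseTools API):

```python
import multiprocessing

def autogen_worker(task_queue, done_queue):
    # Each worker drains module tasks until it sees the None sentinel.
    for module in iter(task_queue.get, None):
        # Stand-in for per-module AutoGen work (header/makefile generation).
        done_queue.put(module.upper())

def run_autogen(modules, worker_count=4):
    task_queue = multiprocessing.Queue()
    done_queue = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=autogen_worker,
                                       args=(task_queue, done_queue))
               for _ in range(worker_count)]
    for w in workers:
        w.start()
    for module in modules:
        task_queue.put(module)
    for _ in workers:
        task_queue.put(None)   # one stop sentinel per worker
    results = [done_queue.get() for _ in modules]
    for w in workers:
        w.join()
    return sorted(results)
```

The key property is that modules are generated concurrently instead of in one serial loop, which is where the AutoGen-phase speedup comes from.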
V4:
Add one more patch (11/11) to enhance this feature; patches 1-10 are the same as in V3.
1. Set the log queue maxsize to thread number * 10.
2. Enhance the ModuleUniqueBaseName function.
3. Fix bugs with build-option PCDs in the sub-process.
4. Enhance error handling: handle KeyboardInterrupt and exceptions
raised in the subprocess.
5. Fix the issue of fixed PCDs shared between a module and a library.
6. Fix a bug in the duplicate-module handling function.
V3:
1. Fixed an incremental build issue.
2. Set the AutoGen worker number to align with "-n THREADNUMBER".
3. Enable the blocking log queue.
V2:
1. The first version missed the AutoGen-related commits from
e812a812c1a0800c49e11507cb46222351520cc7; V2 adds those commits
back.
2. Move CreateAsBuildInf into the AutoGenWorker process.
3. Save GlobalVar_<platform guid>_<arch>.bin to the build folder.
4. Regenerate patches based on master bb824f685d.
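The bounded log queue noted in the V4 changes (maxsize = thread number * 10) can be sketched as follows; `start_log_agent`, the sink list, and the None-sentinel protocol are illustrative assumptions, not the actual EdkLogger/LogAgent API:

```python
import queue
import threading

def start_log_agent(thread_count, sink):
    # Bounded queue: sized at worker count * 10 so producers block
    # briefly under heavy logging instead of growing memory unboundedly.
    log_queue = queue.Queue(maxsize=thread_count * 10)

    def drain():
        # The agent consumes records until it sees the None sentinel.
        for record in iter(log_queue.get, None):
            sink.append(record)

    agent = threading.Thread(target=drain, daemon=True)
    agent.start()
    return log_queue, agent
```

In the real series the producers are the AutoGen worker processes and the consumer is the LogAgent, but the backpressure idea is the same.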
Feng, Bob C (11):
BaseTools: Singleton the object to handle build conf file
BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
BaseTools: Add functions to get platform scope build options
BaseTools: Decouple AutoGen Objects
BaseTools: Enable Multiple Process AutoGen
BaseTools: Add shared data for processes
BaseTools: Add LogAgent to support multiple process Autogen
BaseTools: Move BuildOption parser out of build.py
BaseTools: Add the support for python 2
BaseTools: Enable block queue log agent.
BaseTools: Enhance Multiple-Process AutoGen
BaseTools/Source/Python/AutoGen/AutoGen.py | 4227 +----------------
.../Source/Python/AutoGen/AutoGenWorker.py | 257 +
.../Source/Python/AutoGen/BuildEngine.py | 22 +
BaseTools/Source/Python/AutoGen/DataPipe.py | 160 +
BaseTools/Source/Python/AutoGen/GenC.py | 6 +-
.../Source/Python/AutoGen/ModuleAutoGen.py | 1903 ++++++++
.../Python/AutoGen/ModuleAutoGenHelper.py | 619 +++
.../Source/Python/AutoGen/PlatformAutoGen.py | 1512 ++++++
.../Source/Python/AutoGen/WorkspaceAutoGen.py | 907 ++++
BaseTools/Source/Python/Common/EdkLogger.py | 119 +-
BaseTools/Source/Python/Common/Misc.py | 1 -
.../Python/Common/TargetTxtClassObject.py | 28 +-
.../Python/Common/ToolDefClassObject.py | 6 +-
BaseTools/Source/Python/GenFds/GenFds.py | 4 +-
.../Python/GenFds/GenFdsGlobalVariable.py | 54 +-
.../Python/PatchPcdValue/PatchPcdValue.py | 1 -
.../Source/Python/Workspace/DscBuildData.py | 38 +-
.../Source/Python/Workspace/InfBuildData.py | 39 +
.../Python/Workspace/WorkspaceCommon.py | 4 +
.../Python/Workspace/WorkspaceDatabase.py | 3 +
BaseTools/Source/Python/build/BuildReport.py | 4 +-
BaseTools/Source/Python/build/build.py | 370 +-
BaseTools/Source/Python/build/buildoptions.py | 92 +
23 files changed, 5921 insertions(+), 4455 deletions(-)
create mode 100644 BaseTools/Source/Python/AutoGen/AutoGenWorker.py
create mode 100644 BaseTools/Source/Python/AutoGen/DataPipe.py
create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
create mode 100644 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
create mode 100644 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
create mode 100644 BaseTools/Source/Python/build/buildoptions.py
--
2.20.1.windows.1
* [Patch 01/11] BaseTools: Singleton the object to handle build conf file
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 8:44 ` [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions Bob Feng
` (10 subsequent siblings)
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
The build config files are target.txt, build_rule.txt, and
tools_def.txt. During a build, this configuration does not change,
so the objects that handle these files should be singletons.
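A minimal sketch of the pattern this patch applies: one shared, lazily created parser object instead of re-parsing the Conf files at every call site (the patch itself uses a module-level instance, `TargetTxt = TargetTxtDict(...)`; the simplified class and `get_target_txt` helper below are hypothetical):

```python
class TargetTxtClassObject:
    """Simplified stand-in for the real target.txt parser."""
    def __init__(self, conf_dir):
        # The real class parses Conf/target.txt; here we only record the path.
        self.conf_dir = conf_dir
        self.TargetTxtDictionary = {}

_SHARED = {}

def get_target_txt(conf_dir="Conf"):
    # Parse once; every later caller gets the same object back.
    if "obj" not in _SHARED:
        _SHARED["obj"] = TargetTxtClassObject(conf_dir)
    return _SHARED["obj"]
```

Because every importer shares one instance, the file is parsed once per build rather than once per consumer (ToolDef, GenFds, build.py, ...).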
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 33 ++----------
.../Source/Python/AutoGen/BuildEngine.py | 22 ++++++++
.../Python/Common/TargetTxtClassObject.py | 2 +
.../Python/Common/ToolDefClassObject.py | 6 ++-
BaseTools/Source/Python/GenFds/GenFds.py | 4 +-
.../Python/GenFds/GenFdsGlobalVariable.py | 54 ++++++++-----------
.../Source/Python/Workspace/DscBuildData.py | 8 +--
BaseTools/Source/Python/build/build.py | 29 +++-------
8 files changed, 62 insertions(+), 96 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 2df055a109f7..c5b3fbb0a87f 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -22,11 +22,12 @@ from . import GenC
from . import GenMake
from . import GenDepex
from io import BytesIO
from .StrGather import *
-from .BuildEngine import BuildRule
+from .BuildEngine import BuildRuleObj as BuildRule
+from .BuildEngine import gDefaultBuildRuleFile,AutoGenReqBuildRuleVerNum
import shutil
from Common.LongFilePathSupport import CopyLongFilePath
from Common.BuildToolError import *
from Common.DataType import *
from Common.Misc import *
@@ -76,16 +77,10 @@ gEfiVarStoreGuidPattern = re.compile("\s*guid\s*=\s*({.*?{.*?}\s*})")
## Mapping Makefile type
gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
-## Build rule configuration file
-gDefaultBuildRuleFile = 'build_rule.txt'
-
-## Build rule default version
-AutoGenReqBuildRuleVerNum = "0.1"
-
## default file name for AutoGen
gAutoGenCodeFileName = "AutoGen.c"
gAutoGenHeaderFileName = "AutoGen.h"
gAutoGenStringFileName = "%(module_name)sStrDefs.h"
gAutoGenStringFormFileName = "%(module_name)sStrDefs.hpk"
@@ -1970,32 +1965,10 @@ class PlatformAutoGen(AutoGen):
## Return the build options specific for EDKII modules in this platform
@cached_property
def EdkIIBuildOption(self):
return self._ExpandBuildOption(self.Platform.BuildOptions, EDKII_NAME)
- ## Parse build_rule.txt in Conf Directory.
- #
- # @retval BuildRule object
- #
- @cached_property
- def BuildRule(self):
- BuildRuleFile = None
- if TAB_TAT_DEFINES_BUILD_RULE_CONF in self.Workspace.TargetTxt.TargetTxtDictionary:
- BuildRuleFile = self.Workspace.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_BUILD_RULE_CONF]
- if not BuildRuleFile:
- BuildRuleFile = gDefaultBuildRuleFile
- RetVal = BuildRule(BuildRuleFile)
- if RetVal._FileVersion == "":
- RetVal._FileVersion = AutoGenReqBuildRuleVerNum
- else:
- if RetVal._FileVersion < AutoGenReqBuildRuleVerNum :
- # If Build Rule's version is less than the version number required by the tools, halting the build.
- EdkLogger.error("build", AUTOGEN_ERROR,
- ExtraData="The version number [%s] of build_rule.txt is less than the version number required by the AutoGen.(the minimum required version number is [%s])"\
- % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
- return RetVal
-
## Summarize the packages used by modules in this platform
@cached_property
def PackageList(self):
RetVal = set()
for La in self.LibraryAutoGenList:
@@ -3149,11 +3122,11 @@ class ModuleAutoGen(AutoGen):
return RetVal
@cached_property
def BuildRules(self):
RetVal = {}
- BuildRuleDatabase = self.PlatformInfo.BuildRule
+ BuildRuleDatabase = BuildRule
for Type in BuildRuleDatabase.FileTypeList:
#first try getting build rule by BuildRuleFamily
RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.BuildRuleFamily]
if not RuleObject:
# build type is always module type, but ...
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 14e61140e7ba..bb9153447793 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -18,10 +18,13 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.GlobalData import *
from Common.BuildToolError import *
from Common.Misc import tdict, PathClass
from Common.StringUtils import NormPath
from Common.DataType import *
+from Common.TargetTxtClassObject import TargetTxt
+gDefaultBuildRuleFile = 'build_rule.txt'
+AutoGenReqBuildRuleVerNum = '0.1'
import Common.EdkLogger as EdkLogger
## Convert file type to file list macro name
#
@@ -581,10 +584,29 @@ class BuildRule:
_ExtraDependency : ParseCommonSubSection,
_Command : ParseCommonSubSection,
_UnknownSection : SkipSection,
}
+def GetBuildRule():
+ BuildRuleFile = None
+ if TAB_TAT_DEFINES_BUILD_RULE_CONF in TargetTxt.TargetTxtDictionary:
+ BuildRuleFile = TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_BUILD_RULE_CONF]
+ if not BuildRuleFile:
+ BuildRuleFile = gDefaultBuildRuleFile
+ RetVal = BuildRule(BuildRuleFile)
+ if RetVal._FileVersion == "":
+ RetVal._FileVersion = AutoGenReqBuildRuleVerNum
+ else:
+ if RetVal._FileVersion < AutoGenReqBuildRuleVerNum :
+ # If Build Rule's version is less than the version number required by the tools, halting the build.
+ EdkLogger.error("build", AUTOGEN_ERROR,
+ ExtraData="The version number [%s] of build_rule.txt is less than the version number required by the AutoGen.(the minimum required version number is [%s])"\
+ % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
+ return RetVal
+
+BuildRuleObj = GetBuildRule()
+
# This acts like the main() function for the script, unless it is 'import'ed into another
# script.
if __name__ == '__main__':
import sys
EdkLogger.Initialize()
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 9d7673b41bb5..79a5acc01074 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -144,10 +144,12 @@ class TargetTxtClassObject(object):
def TargetTxtDict(ConfDir):
Target = TargetTxtClassObject()
Target.LoadTargetTxtFile(os.path.normpath(os.path.join(ConfDir, gDefaultTargetTxtFile)))
return Target
+TargetTxt = TargetTxtDict(os.path.join(os.getenv("WORKSPACE"),"Conf"))
+
##
#
# This acts like the main() function for the script, unless it is 'import'ed into another
# script.
#
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 4fa364942cad..063fa005840a 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -12,11 +12,11 @@ from __future__ import absolute_import
import Common.LongFilePathOs as os
import re
from . import EdkLogger
from .BuildToolError import *
-from Common.TargetTxtClassObject import TargetTxtDict
+from Common.TargetTxtClassObject import TargetTxt
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.Misc import PathClass
from Common.StringUtils import NormPath
import Common.GlobalData as GlobalData
from Common import GlobalData
@@ -261,11 +261,11 @@ class ToolDefClassObject(object):
# @param ConfDir: Conf dir
#
# @retval ToolDef An instance of ToolDefClassObject() with loaded tools_def.txt
#
def ToolDefDict(ConfDir):
- Target = TargetTxtDict(ConfDir)
+ Target = TargetTxt
ToolDef = ToolDefClassObject()
if TAB_TAT_DEFINES_TOOL_CHAIN_CONF in Target.TargetTxtDictionary:
ToolsDefFile = Target.TargetTxtDictionary[TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
if ToolsDefFile:
ToolDef.LoadToolDefFile(os.path.normpath(ToolsDefFile))
@@ -273,10 +273,12 @@ def ToolDefDict(ConfDir):
ToolDef.LoadToolDefFile(os.path.normpath(os.path.join(ConfDir, gDefaultToolsDefFile)))
else:
ToolDef.LoadToolDefFile(os.path.normpath(os.path.join(ConfDir, gDefaultToolsDefFile)))
return ToolDef
+ToolDef = ToolDefDict((os.path.join(os.getenv("WORKSPACE"),"Conf")))
+
##
#
# This acts like the main() function for the script, unless it is 'import'ed into another
# script.
#
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 5888997761bb..51943411ad1f 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -18,11 +18,11 @@ from glob import glob
from struct import unpack
from linecache import getlines
from io import BytesIO
import Common.LongFilePathOs as os
-from Common.TargetTxtClassObject import TargetTxtClassObject
+from Common.TargetTxtClassObject import TargetTxt
from Common.DataType import *
import Common.GlobalData as GlobalData
from Common import EdkLogger
from Common.StringUtils import NormPath
from Common.Misc import DirCache, PathClass, GuidStructureStringToGuidString
@@ -205,12 +205,10 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
GenFdsGlobalVariable.ConfDir = ConfDirectoryPath
if not GlobalData.gConfDirectory:
GlobalData.gConfDirectory = GenFdsGlobalVariable.ConfDir
BuildConfigurationFile = os.path.normpath(os.path.join(ConfDirectoryPath, "target.txt"))
if os.path.isfile(BuildConfigurationFile) == True:
- TargetTxt = TargetTxtClassObject()
- TargetTxt.LoadTargetTxtFile(BuildConfigurationFile)
# if no build target given in command line, get it from target.txt
if not GenFdsGlobalVariable.TargetName:
BuildTargetList = TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_TARGET]
if len(BuildTargetList) != 1:
EdkLogger.error("GenFds", OPTION_VALUE_INVALID, ExtraData="Only allows one instance for Target.")
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index f43743dff4d1..037828ea1cca 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -20,13 +20,13 @@ from array import array
from Common.BuildToolError import COMMAND_FAILURE,GENFDS_ERROR
from Common import EdkLogger
from Common.Misc import SaveFileOnChange
-from Common.TargetTxtClassObject import TargetTxtClassObject
-from Common.ToolDefClassObject import ToolDefClassObject, ToolDefDict
-from AutoGen.BuildEngine import BuildRule
+from Common.TargetTxtClassObject import TargetTxt
+from Common.ToolDefClassObject import ToolDef
+from AutoGen.BuildEngine import BuildRuleObj
import Common.DataType as DataType
from Common.Misc import PathClass
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
import Common.GlobalData as GlobalData
@@ -93,35 +93,25 @@ class GenFdsGlobalVariable:
#
@staticmethod
def _LoadBuildRule():
if GenFdsGlobalVariable.__BuildRuleDatabase:
return GenFdsGlobalVariable.__BuildRuleDatabase
- BuildConfigurationFile = os.path.normpath(os.path.join(GenFdsGlobalVariable.ConfDir, "target.txt"))
- TargetTxt = TargetTxtClassObject()
- if os.path.isfile(BuildConfigurationFile) == True:
- TargetTxt.LoadTargetTxtFile(BuildConfigurationFile)
- if DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF in TargetTxt.TargetTxtDictionary:
- BuildRuleFile = TargetTxt.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF]
- if not BuildRuleFile:
- BuildRuleFile = 'Conf/build_rule.txt'
- GenFdsGlobalVariable.__BuildRuleDatabase = BuildRule(BuildRuleFile)
- ToolDefinitionFile = TargetTxt.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
- if ToolDefinitionFile == '':
- ToolDefinitionFile = "Conf/tools_def.txt"
- if os.path.isfile(ToolDefinitionFile):
- ToolDef = ToolDefClassObject()
- ToolDef.LoadToolDefFile(ToolDefinitionFile)
- ToolDefinition = ToolDef.ToolsDefTxtDatabase
- if DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY in ToolDefinition \
- and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY] \
- and ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]:
- GenFdsGlobalVariable.BuildRuleFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]
+ GenFdsGlobalVariable.__BuildRuleDatabase = BuildRuleObj
+ ToolDefinitionFile = TargetTxt.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
+ if ToolDefinitionFile == '':
+ ToolDefinitionFile = "Conf/tools_def.txt"
+ if os.path.isfile(ToolDefinitionFile):
+ ToolDefinition = ToolDef.ToolsDefTxtDatabase
+ if DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY in ToolDefinition \
+ and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY] \
+ and ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]:
+ GenFdsGlobalVariable.BuildRuleFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]
- if DataType.TAB_TOD_DEFINES_FAMILY in ToolDefinition \
- and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY] \
- and ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]:
- GenFdsGlobalVariable.ToolChainFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]
+ if DataType.TAB_TOD_DEFINES_FAMILY in ToolDefinition \
+ and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY] \
+ and ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]:
+ GenFdsGlobalVariable.ToolChainFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]
return GenFdsGlobalVariable.__BuildRuleDatabase
## GetBuildRules
# @param Inf: object of InfBuildData
# @param Arch: current arch
@@ -837,11 +827,11 @@ class GenFdsGlobalVariable:
# @param KeyStringList Filter for inputs of section generation
# @param CurrentArchList Arch list
# @param NameGuid The Guid name
#
def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
- ToolDb = ToolDefDict(GenFdsGlobalVariable.ConfDir).ToolsDefTxtDatabase
+ ToolDb = ToolDef.ToolsDefTxtDatabase
# if user not specify filter, try to deduce it from global data.
if KeyStringList is None or KeyStringList == []:
Target = GenFdsGlobalVariable.TargetName
ToolChain = GenFdsGlobalVariable.ToolChainTag
if ToolChain not in ToolDb['TOOL_CHAIN_TAG']:
@@ -853,19 +843,19 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
if GenFdsGlobalVariable.GuidToolDefinition:
if NameGuid in GenFdsGlobalVariable.GuidToolDefinition:
return GenFdsGlobalVariable.GuidToolDefinition[NameGuid]
- ToolDefinition = ToolDefDict(GenFdsGlobalVariable.ConfDir).ToolsDefTxtDictionary
+ ToolDefinition = ToolDef.ToolsDefTxtDictionary
ToolPathTmp = None
ToolOption = None
ToolPathKey = None
ToolOptionKey = None
KeyList = None
- for ToolDef in ToolDefinition.items():
- if NameGuid.lower() == ToolDef[1].lower():
- KeyList = ToolDef[0].split('_')
+ for tool_def in ToolDefinition.items():
+ if NameGuid.lower() == tool_def[1].lower():
+ KeyList = tool_def[0].split('_')
Key = KeyList[0] + \
'_' + \
KeyList[1] + \
'_' + \
KeyList[2]
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 985f8775259d..e7ec2aba57d2 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -17,12 +17,12 @@ from Common.StringUtils import *
from Common.DataType import *
from Common.Misc import *
from types import *
from Common.Expression import *
from CommonDataClass.CommonClass import SkuInfoClass
-from Common.TargetTxtClassObject import TargetTxtClassObject
-from Common.ToolDefClassObject import ToolDefClassObject
+from Common.TargetTxtClassObject import TargetTxt
+from Common.ToolDefClassObject import ToolDef
from .MetaDataTable import *
from .MetaFileTable import *
from .MetaFileParser import *
from .WorkspaceCommon import GetDeclaredPcd
@@ -3260,19 +3260,15 @@ class DscBuildData(PlatformBuildClassObject):
@property
def ToolChainFamily(self):
self._ToolChainFamily = TAB_COMPILER_MSFT
BuildConfigurationFile = os.path.normpath(os.path.join(GlobalData.gConfDirectory, "target.txt"))
if os.path.isfile(BuildConfigurationFile) == True:
- TargetTxt = TargetTxtClassObject()
- TargetTxt.LoadTargetTxtFile(BuildConfigurationFile)
ToolDefinitionFile = TargetTxt.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
if ToolDefinitionFile == '':
ToolDefinitionFile = "tools_def.txt"
ToolDefinitionFile = os.path.normpath(mws.join(self.WorkspaceDir, 'Conf', ToolDefinitionFile))
if os.path.isfile(ToolDefinitionFile) == True:
- ToolDef = ToolDefClassObject()
- ToolDef.LoadToolDefFile(ToolDefinitionFile)
ToolDefinition = ToolDef.ToolsDefTxtDatabase
if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \
or self._Toolchain not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self._Toolchain]:
self._ToolChainFamily = TAB_COMPILER_MSFT
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 6bc528974db1..07693b97359e 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -28,12 +28,12 @@ import threading
from optparse import OptionParser
from subprocess import *
from Common import Misc as Utils
from Common.LongFilePathSupport import OpenLongFilePath as open
-from Common.TargetTxtClassObject import TargetTxtClassObject
-from Common.ToolDefClassObject import ToolDefClassObject
+from Common.TargetTxtClassObject import TargetTxt
+from Common.ToolDefClassObject import ToolDef
from Common.DataType import *
from Common.BuildVersion import gBUILD_VERSION
from AutoGen.AutoGen import *
from Common.BuildToolError import *
from Workspace.WorkspaceDatabase import WorkspaceDatabase
@@ -714,12 +714,12 @@ class Build():
if self.SkuId:
GlobalData.gSKUID_CMD = self.SkuId
self.ConfDirectory = BuildOptions.ConfDirectory
self.SpawnMode = True
self.BuildReport = BuildReport(BuildOptions.ReportFile, BuildOptions.ReportType)
- self.TargetTxt = TargetTxtClassObject()
- self.ToolDef = ToolDefClassObject()
+ self.TargetTxt = TargetTxt
+ self.ToolDef = ToolDef
self.AutoGenTime = 0
self.MakeTime = 0
self.GenFdsTime = 0
GlobalData.BuildOptionPcd = BuildOptions.OptionPcd if BuildOptions.OptionPcd else []
#Set global flag for build mode
@@ -814,12 +814,12 @@ class Build():
EdkLogger.quiet("%-16s = %s" % ("PREBUILD", self.Prebuild))
if self.Postbuild:
EdkLogger.quiet("%-16s = %s" % ("POSTBUILD", self.Postbuild))
if self.Prebuild:
self.LaunchPrebuild()
- self.TargetTxt = TargetTxtClassObject()
- self.ToolDef = ToolDefClassObject()
+ self.TargetTxt = TargetTxt
+ self.ToolDef = ToolDef
if not (self.LaunchPrebuildFlag and os.path.exists(self.PlatformBuildPath)):
self.InitBuild()
EdkLogger.info("")
os.chdir(self.WorkspaceDir)
@@ -827,27 +827,10 @@ class Build():
## Load configuration
#
# This method will parse target.txt and get the build configurations.
#
def LoadConfiguration(self):
- #
- # Check target.txt and tools_def.txt and Init them
- #
- BuildConfigurationFile = os.path.normpath(os.path.join(GlobalData.gConfDirectory, gBuildConfiguration))
- if os.path.isfile(BuildConfigurationFile) == True:
- StatusCode = self.TargetTxt.LoadTargetTxtFile(BuildConfigurationFile)
-
- ToolDefinitionFile = self.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
- if ToolDefinitionFile == '':
- ToolDefinitionFile = gToolsDefinition
- ToolDefinitionFile = os.path.normpath(mws.join(self.WorkspaceDir, 'Conf', ToolDefinitionFile))
- if os.path.isfile(ToolDefinitionFile) == True:
- StatusCode = self.ToolDef.LoadToolDefFile(ToolDefinitionFile)
- else:
- EdkLogger.error("build", FILE_NOT_FOUND, ExtraData=ToolDefinitionFile)
- else:
- EdkLogger.error("build", FILE_NOT_FOUND, ExtraData=BuildConfigurationFile)
# if no ARCH given in command line, get it from target.txt
if not self.ArchList:
self.ArchList = self.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_TARGET_ARCH]
self.ArchList = tuple(self.ArchList)
--
2.20.1.windows.1
* [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
2019-07-29 8:44 ` [Patch 01/11] BaseTools: Singleton the object to handle build conf file Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 15:03 ` [edk2-devel] " Philippe Mathieu-Daudé
2019-07-29 8:44 ` [Patch 03/11] BaseTools: Add functions to get platform scope build options Bob Feng
` (9 subsequent siblings)
11 siblings, 1 reply; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
The WorkspaceAutoGen._InitWorker function is too long, which makes
it hard to read and understand.
This patch splits _InitWorker into multiple smaller functions.
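The shape of the refactor can be sketched as follows; the helper method names (`MergeArch`, `ValidateBuildTarget`) come from the patch, while the constructor arguments and method bodies are illustrative stand-ins:

```python
class WorkspaceAutoGen:
    """_InitWorker becomes a thin driver over focused helper methods."""

    def __init__(self, requested_archs, supported_archs, target, valid_targets):
        self.requested_archs = requested_archs
        self.supported_archs = supported_archs
        self.target = target
        self.valid_targets = valid_targets
        self._InitWorker()

    def _InitWorker(self):
        # Each step now has one job and a name that states it.
        self.MergeArch()
        self.ValidateBuildTarget()

    def MergeArch(self):
        # Keep only the architectures the platform actually supports.
        merged = set(self.requested_archs) & set(self.supported_archs)
        self.arch_list = tuple(sorted(merged))

    def ValidateBuildTarget(self):
        if self.target not in self.valid_targets:
            raise ValueError("Build target [%s] is not supported" % self.target)
```

Beyond readability, small named steps make it possible to move or cache individual pieces (as later patches in this series do when decoupling the AutoGen objects).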
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 247 +++++++++++++--------
1 file changed, 152 insertions(+), 95 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index c5b3fbb0a87f..9e06bb942126 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -333,13 +333,58 @@ class WorkspaceAutoGen(AutoGen):
self._GuidDict = {}
# there's many relative directory operations, so ...
os.chdir(self.WorkspaceDir)
+ self.MergeArch()
+ self.ValidateBuildTarget()
+
+ EdkLogger.info("")
+ if self.ArchList:
+ EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
+ EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
+ EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
+
+ EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
+ if BuildModule:
+ EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
+
+ if self.FdfFile:
+ EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
+
+ EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
+
+ if Progress:
+ Progress.Start("\nProcessing meta-data")
#
- # Merge Arch
+ # Mark now build in AutoGen Phase
#
+ GlobalData.gAutoGenPhase = True
+ self.ProcessModuleFromPdf()
+ self.ProcessPcdType()
+ self.ProcessMixedPcd()
+ self.GetPcdsFromFDF()
+ self.CollectAllPcds()
+ self.GeneratePkgLevelHash()
+ #
+ # Check PCDs token value conflict in each DEC file.
+ #
+ self._CheckAllPcdsTokenValueConflict()
+ #
+ # Check PCD type and definition between DSC and DEC
+ #
+ self._CheckPcdDefineAndType()
+
+ self.CreateBuildOptionsFile()
+ self.CreatePcdTokenNumberFile()
+ self.CreateModuleHashInfo()
+ GlobalData.gAutoGenPhase = False
+
+ #
+ # Merge Arch
+ #
+ def MergeArch(self):
if not self.ArchList:
ArchList = set(self.Platform.SupArchList)
else:
ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
if not ArchList:
@@ -349,57 +394,49 @@ class WorkspaceAutoGen(AutoGen):
SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
% (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
self.ArchList = tuple(ArchList)
- # Validate build target
+ # Validate build target
+ def ValidateBuildTarget(self):
if self.BuildTarget not in self.Platform.BuildTargets:
EdkLogger.error("build", PARAMETER_INVALID,
ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
% (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
-
-
- # parse FDF file to get PCDs in it, if any
+ @cached_property
+ def FdfProfile(self):
if not self.FdfFile:
self.FdfFile = self.Platform.FlashDefinition
- EdkLogger.info("")
- if self.ArchList:
- EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
- EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
- EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
-
- EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
- if BuildModule:
- EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
-
+ FdfProfile = None
if self.FdfFile:
- EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
-
- EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
-
- if Progress:
- Progress.Start("\nProcessing meta-data")
-
- if self.FdfFile:
- #
- # Mark now build in AutoGen Phase
- #
- GlobalData.gAutoGenPhase = True
Fdf = FdfParser(self.FdfFile.Path)
Fdf.ParseFile()
GlobalData.gFdfParser = Fdf
- GlobalData.gAutoGenPhase = False
- PcdSet = Fdf.Profile.PcdDict
if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
for FdRegion in FdDict.RegionList:
if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
if int(FdRegion.Offset) % 8 != 0:
EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
- ModuleList = Fdf.Profile.InfList
- self.FdfProfile = Fdf.Profile
+ FdfProfile = Fdf.Profile
+ else:
+ if self.FdTargetList:
+ EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
+ self.FdTargetList = []
+ if self.FvTargetList:
+ EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
+ self.FvTargetList = []
+ if self.CapTargetList:
+ EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
+ self.CapTargetList = []
+
+ return FdfProfile
+
+ def ProcessModuleFromPdf(self):
+
+ if self.FdfProfile:
for fvname in self.FvTargetList:
if fvname.upper() not in self.FdfProfile.FvDict:
EdkLogger.error("build", OPTION_VALUE_INVALID,
"No such an FV in FDF file: %s" % fvname)
@@ -407,64 +444,60 @@ class WorkspaceAutoGen(AutoGen):
# but the path (self.MetaFile.Path) is the real path
for key in self.FdfProfile.InfDict:
if key == 'ArchTBD':
MetaFile_cache = defaultdict(set)
for Arch in self.ArchList:
- Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
+ Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
for Pkey in Current_Platform_cache.Modules:
MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
for Inf in self.FdfProfile.InfDict[key]:
ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
for Arch in self.ArchList:
if ModuleFile in MetaFile_cache[Arch]:
break
else:
- ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
+ ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
if not ModuleData.IsBinaryModule:
EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
else:
for Arch in self.ArchList:
if Arch == key:
- Platform = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
MetaFileList = set()
for Pkey in Platform.Modules:
MetaFileList.add(Platform.Modules[Pkey].MetaFile)
for Inf in self.FdfProfile.InfDict[key]:
ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
if ModuleFile in MetaFileList:
continue
- ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
+ ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
if not ModuleData.IsBinaryModule:
EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
- else:
- PcdSet = {}
- ModuleList = []
- self.FdfProfile = None
- if self.FdTargetList:
- EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
- self.FdTargetList = []
- if self.FvTargetList:
- EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
- self.FvTargetList = []
- if self.CapTargetList:
- EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
- self.CapTargetList = []
-
- # apply SKU and inject PCDs from Flash Definition file
+
+
+ # parse FDF file to get PCDs in it, if any
+ def GetPcdsFromFDF(self):
+
+ if self.FdfProfile:
+ PcdSet = self.FdfProfile.PcdDict
+ # handle the mixed pcd in FDF file
+ for key in PcdSet:
+ if key in GlobalData.MixedPcd:
+ Value = PcdSet[key]
+ del PcdSet[key]
+ for item in GlobalData.MixedPcd[key]:
+ PcdSet[item] = Value
+ self.VerifyPcdDeclearation(PcdSet)
+
+ def ProcessPcdType(self):
for Arch in self.ArchList:
- Platform = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
- PlatformPcds = Platform.Pcds
- self._GuidDict = Platform._GuidDict
- SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
- BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
- SourcePcdDict_Keys = SourcePcdDict.keys()
- BinaryPcdDict_Keys = BinaryPcdDict.keys()
-
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ Platform.Pcds
# generate the SourcePcdDict and BinaryPcdDict
- PGen = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
+ PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
if BuildData.Arch != Arch:
continue
if BuildData.MetaFile.Ext == '.inf':
for key in BuildData.Pcds:
@@ -483,11 +516,11 @@ class WorkspaceAutoGen(AutoGen):
BuildData.Pcds[key].Type = PcdInPlatform.Type
BuildData.Pcds[key].Pending = False
else:
#Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
if BuildData.Pcds[key].Pending:
- MGen = ModuleAutoGen(self, BuildData.MetaFile, Target, Toolchain, Arch, self.MetaFile)
+ MGen = ModuleAutoGen(self, BuildData.MetaFile, self.BuildTarget, self.ToolChain, Arch, self.MetaFile)
if MGen and MGen.IsLibrary:
if MGen in PGen.LibraryAutoGenList:
ReferenceModules = MGen.ReferenceModules
for ReferenceModule in ReferenceModules:
if ReferenceModule.MetaFile in Platform.Modules:
@@ -497,10 +530,24 @@ class WorkspaceAutoGen(AutoGen):
if PcdInReferenceModule.Type:
BuildData.Pcds[key].Type = PcdInReferenceModule.Type
BuildData.Pcds[key].Pending = False
break
+ def ProcessMixedPcd(self):
+ for Arch in self.ArchList:
+ SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
+ BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
+ SourcePcdDict_Keys = SourcePcdDict.keys()
+ BinaryPcdDict_Keys = BinaryPcdDict.keys()
+
+ # generate the SourcePcdDict and BinaryPcdDict
+ PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
+ for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
+ if BuildData.Arch != Arch:
+ continue
+ if BuildData.MetaFile.Ext == '.inf':
+ for key in BuildData.Pcds:
if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
if BuildData.IsBinaryModule:
BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
else:
SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
@@ -514,12 +561,11 @@ class WorkspaceAutoGen(AutoGen):
elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
- else:
- pass
+
#
# A PCD can only use one type for all source modules
#
for i in SourcePcdDict_Keys:
for j in SourcePcdDict_Keys:
@@ -588,27 +634,38 @@ class WorkspaceAutoGen(AutoGen):
del BuildData.Pcds[key]
BuildData.Pcds[newkey] = Value
break
break
- # handle the mixed pcd in FDF file
- for key in PcdSet:
- if key in GlobalData.MixedPcd:
- Value = PcdSet[key]
- del PcdSet[key]
- for item in GlobalData.MixedPcd[key]:
- PcdSet[item] = Value
+ #Collect package set information from INF of FDF
+ @cached_property
+ def PkgSet(self):
+ if not self.FdfFile:
+ self.FdfFile = self.Platform.FlashDefinition
- #Collect package set information from INF of FDF
+ if self.FdfFile:
+ ModuleList = self.FdfProfile.InfList
+ else:
+ ModuleList = []
+ Pkgs = {}
+ for Arch in self.ArchList:
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
PkgSet = set()
for Inf in ModuleList:
ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
if ModuleFile in Platform.Modules:
continue
- ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
+ ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
PkgSet.update(ModuleData.Packages)
- Pkgs = list(PkgSet) + list(PGen.PackageList)
+ Pkgs[Arch] = list(PkgSet) + list(PGen.PackageList)
+ return Pkgs
+
+ def VerifyPcdDeclearation(self,PcdSet):
+ for Arch in self.ArchList:
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ Pkgs = self.PkgSet[Arch]
DecPcds = set()
DecPcdsKey = set()
for Pkg in Pkgs:
for Pcd in Pkg.Pcds:
DecPcds.add((Pcd[0], Pcd[1]))
@@ -636,37 +693,33 @@ class WorkspaceAutoGen(AutoGen):
PARSER_ERROR,
"Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
)
+ def CollectAllPcds(self):
- Pa = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
+ for Arch in self.ArchList:
+ Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
#
# Explicitly collect platform's dynamic PCDs
#
Pa.CollectPlatformDynamicPcds()
Pa.CollectFixedAtBuildPcds()
self.AutoGenObjectList.append(Pa)
- #
- # Generate Package level hash value
- #
+ #
+ # Generate Package level hash value
+ #
+ def GeneratePkgLevelHash(self):
+ for Arch in self.ArchList:
GlobalData.gPackageHash = {}
if GlobalData.gUseHashCache:
- for Pkg in Pkgs:
+ for Pkg in self.PkgSet[Arch]:
self._GenPkgLevelHash(Pkg)
- #
- # Check PCDs token value conflict in each DEC file.
- #
- self._CheckAllPcdsTokenValueConflict()
-
- #
- # Check PCD type and definition between DSC and DEC
- #
- self._CheckPcdDefineAndType()
+ def CreateBuildOptionsFile(self):
#
# Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
#
content = 'gCommandLineDefines: '
content += str(GlobalData.gCommandLineDefines)
@@ -681,27 +734,31 @@ class WorkspaceAutoGen(AutoGen):
content += 'Flash Image Definition: '
content += str(self.FdfFile)
content += TAB_LINE_BREAK
SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
+ def CreatePcdTokenNumberFile(self):
#
# Create PcdToken Number file for Dynamic/DynamicEx Pcd.
#
PcdTokenNumber = 'PcdTokenNumber: '
- if Pa.PcdTokenNumber:
- if Pa.DynamicPcdList:
- for Pcd in Pa.DynamicPcdList:
- PcdTokenNumber += TAB_LINE_BREAK
- PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
- PcdTokenNumber += ' : '
- PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
+ for Arch in self.ArchList:
+ Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
+ if Pa.PcdTokenNumber:
+ if Pa.DynamicPcdList:
+ for Pcd in Pa.DynamicPcdList:
+ PcdTokenNumber += TAB_LINE_BREAK
+ PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
+ PcdTokenNumber += ' : '
+ PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
+ def CreateModuleHashInfo(self):
#
# Get set of workspace metafiles
#
- AllWorkSpaceMetaFiles = self._GetMetaFiles(Target, Toolchain, Arch)
+ AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
#
# Retrieve latest modified time of all metafiles
#
SrcTimeStamp = 0
@@ -759,11 +816,11 @@ class WorkspaceAutoGen(AutoGen):
f.close()
m.update(Content)
SaveFileOnChange(HashFile, m.hexdigest(), False)
GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
- def _GetMetaFiles(self, Target, Toolchain, Arch):
+ def _GetMetaFiles(self, Target, Toolchain):
AllWorkSpaceMetaFiles = set()
#
# add fdf
#
if self.FdfFile:
--
2.20.1.windows.1
* [Patch 03/11] BaseTools: Add functions to get platform scope build options
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
2019-07-29 8:44 ` [Patch 01/11] BaseTools: Singleton the object to handle build conf file Bob Feng
2019-07-29 8:44 ` [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 8:44 ` [Patch 04/11] BaseTools: Decouple AutoGen Objects Bob Feng
` (8 subsequent siblings)
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
These functions are used to get platform-scope
build options. They will be used in later patches.
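The merge behavior these functions rely on (module-type options first, then per-module overrides from the DSC [Components] section) can be sketched as follows. This is an illustrative helper, not BaseTools code: `merge_build_options` is a hypothetical name, but it mirrors the append-vs-replace rule the patch applies to '='-prefixed options:

```python
# Illustrative sketch (not BaseTools code): platform-scope build options
# come from two layers -- options keyed by module type, and options set
# for a specific module in the DSC [Components] section. An option
# starting with '=' replaces the earlier value; otherwise it appends,
# skipping exact duplicates.

def merge_build_options(module_type_options, platform_module_options):
    """Merge two option dicts keyed by (ToolChainFamily, ToolChain)."""
    merged = dict(module_type_options)
    for key, option in platform_module_options.items():
        if key not in merged or option.startswith('='):
            merged[key] = option
        elif ' ' + option not in merged[key]:
            merged[key] += ' ' + option
    return merged

if __name__ == '__main__':
    base = {('GCC', '*_*_*_CC_FLAGS'): '-Os'}
    override = {('GCC', '*_*_*_CC_FLAGS'): '-DMDEPKG_NDEBUG'}
    # -> {('GCC', '*_*_*_CC_FLAGS'): '-Os -DMDEPKG_NDEBUG'}
    print(merge_build_options(base, override))
```

The real code returns the two dicts separately from GetGlobalBuildOptions and merges them later with other scopes; the sketch only shows the pairwise merge rule.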
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 10 +++++++++-
.../Source/Python/Workspace/DscBuildData.py | 20 +++++++++++++++++++
.../Source/Python/Workspace/InfBuildData.py | 10 ++++++++++
3 files changed, 39 insertions(+), 1 deletion(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 9e06bb942126..792beed65e6b 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -2485,11 +2485,19 @@ class PlatformAutoGen(AutoGen):
if Attr != 'PATH':
BuildOptions[Tool][Attr] += " " + Options[Key]
else:
BuildOptions[Tool][Attr] = Options[Key]
return BuildOptions
-
+ def GetGlobalBuildOptions(self,Module):
+ ModuleTypeOptions = self.Platform.GetBuildOptionsByPkg(Module, Module.ModuleType)
+ ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
+ if Module in self.Platform.Modules:
+ PlatformModule = self.Platform.Modules[str(Module)]
+ PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
+ else:
+ PlatformModuleOptions = {}
+ return ModuleTypeOptions, PlatformModuleOptions
## Append build options in platform to a module
#
# @param Module The module to which the build options will be appended
#
# @retval options The options appended with build options in platform
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index e7ec2aba57d2..dd5c3c2bd1f2 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1222,11 +1222,31 @@ class DscBuildData(PlatformBuildClassObject):
self._BuildOptions[CurKey] = Option
else:
if ' ' + Option not in self._BuildOptions[CurKey]:
self._BuildOptions[CurKey] += ' ' + Option
return self._BuildOptions
+ def GetBuildOptionsByPkg(self, Module, ModuleType):
+ local_pkg = os.path.split(Module.LocalPkg())[0]
+ if self._ModuleTypeOptions is None:
+ self._ModuleTypeOptions = OrderedDict()
+ if ModuleType not in self._ModuleTypeOptions:
+ options = OrderedDict()
+ self._ModuleTypeOptions[ModuleType] = options
+ RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch]
+ for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
+ if Dummy2 not in (TAB_COMMON,local_pkg.upper(),"EDKII"):
+ continue
+ Type = Dummy3
+ if Type.upper() == ModuleType.upper():
+ Key = (ToolChainFamily, ToolChain)
+ if Key not in options or not ToolChain.endswith('_FLAGS') or Option.startswith('='):
+ options[Key] = Option
+ else:
+ if ' ' + Option not in options[Key]:
+ options[Key] += ' ' + Option
+ return self._ModuleTypeOptions[ModuleType]
def GetBuildOptionsByModuleType(self, Edk, ModuleType):
if self._ModuleTypeOptions is None:
self._ModuleTypeOptions = OrderedDict()
if (Edk, ModuleType) not in self._ModuleTypeOptions:
options = OrderedDict()
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 60970cd92836..da35391d3aff 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -817,11 +817,21 @@ class InfBuildData(ModuleBuildClassObject):
for Token in TokenList:
TemporaryDictionary[Arch, ModuleType] = TemporaryDictionary[Arch, ModuleType] + Token.strip() + ' '
for Arch, ModuleType in TemporaryDictionary:
RetVal[Arch, ModuleType] = TemporaryDictionary[Arch, ModuleType]
return RetVal
+ def LocalPkg(self):
+ module_path = self.MetaFile.File
+ subdir = os.path.split(module_path)[0]
+ TopDir = ""
+ while subdir:
+ subdir,TopDir = os.path.split(subdir)
+ for file_name in os.listdir(os.path.join(self.MetaFile.Root,TopDir)):
+ if file_name.upper().endswith("DEC"):
+ pkg = os.path.join(TopDir,file_name)
+ return pkg
@cached_class_function
def GetGuidsUsedByPcd(self):
self.Pcds
return self._GuidsUsedByPcd
--
2.20.1.windows.1
* [Patch 04/11] BaseTools: Decouple AutoGen Objects
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
` (2 preceding siblings ...)
2019-07-29 8:44 ` [Patch 03/11] BaseTools: Add functions to get platform scope build options Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 8:44 ` [Patch 05/11] BaseTools: Enable Multiple Process AutoGen Bob Feng
` (7 subsequent siblings)
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Steven Shi, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
1. Split AutoGen.py into three smaller py files:
one for the AutoGen base class, one for the WorkspaceAutoGen
and PlatformAutoGen classes, and one for the ModuleAutoGen class.
2. Create a new class DataPipe to store the platform-scope settings.
Create a new class PlatformInfo that provides the same interface
as PlatformAutoGen; a PlatformInfo instance is initialized from a
DataPipe instance.
Create a new class WorkspaceInfo that provides the same interface
as WorkspaceAutoGen; a WorkspaceInfo instance is initialized from a
DataPipe instance.
3. Change ModuleAutoGen to depend on DataPipe, PlatformInfo and
WorkspaceInfo, removing ModuleAutoGen's dependency on PlatformAutoGen.
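The idea behind DataPipe can be sketched as a pickleable plain-data container: the main process fills it with platform-scope settings and serializes it, and AutoGen worker processes load it instead of holding a live PlatformAutoGen object. This is a hypothetical minimal sketch (class and file names are illustrative, not the actual DataPipe API):

```python
import pickle

class DataPipeSketch:
    """Plain-data container for platform-scope settings.

    Filled by the platform AutoGen pass in the main process; worker
    processes unpickle it and build a PlatformInfo-style view from
    the raw data, so no live AutoGen objects cross process boundaries.
    """
    def __init__(self):
        self.Data = {}

    def dump(self, file_path):
        # Serialize to a file the worker processes can load.
        with open(file_path, 'wb') as fd:
            pickle.dump(self.Data, fd, pickle.HIGHEST_PROTOCOL)

    @classmethod
    def load(cls, file_path):
        pipe = cls()
        with open(file_path, 'rb') as fd:
            pipe.Data = pickle.load(fd)
        return pipe

if __name__ == '__main__':
    import os, tempfile
    pipe = DataPipeSketch()
    pipe.Data['P_Flag'] = {'Target': 'DEBUG', 'ToolChain': 'GCC5'}
    path = os.path.join(tempfile.mkdtemp(), 'GlobalVar_sketch.bin')
    pipe.dump(path)
    assert DataPipeSketch.load(path).Data['P_Flag']['Target'] == 'DEBUG'
```

The key design point is that only picklable plain data goes through the pipe, which is what allows the V2 change of saving GlobalVar_&lt;platform guid&gt;_&lt;arch&gt;.bin into the build folder.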
Cc: Liming Gao <liming.gao@intel.com>
Cc: Steven Shi <steven.shi@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 4265 +----------------
BaseTools/Source/Python/AutoGen/DataPipe.py | 147 +
BaseTools/Source/Python/AutoGen/GenC.py | 2 +-
.../Source/Python/AutoGen/ModuleAutoGen.py | 1908 ++++++++
.../Python/AutoGen/ModuleAutoGenHelper.py | 616 +++
.../Source/Python/AutoGen/PlatformAutoGen.py | 1493 ++++++
.../Source/Python/AutoGen/WorkspaceAutoGen.py | 905 ++++
BaseTools/Source/Python/Common/Misc.py | 1 -
.../Python/PatchPcdValue/PatchPcdValue.py | 1 -
.../Source/Python/Workspace/DscBuildData.py | 10 +-
.../Source/Python/Workspace/InfBuildData.py | 29 +
.../Python/Workspace/WorkspaceCommon.py | 4 +
.../Python/Workspace/WorkspaceDatabase.py | 3 +
BaseTools/Source/Python/build/BuildReport.py | 4 +-
BaseTools/Source/Python/build/build.py | 51 +-
15 files changed, 5190 insertions(+), 4249 deletions(-)
create mode 100644 BaseTools/Source/Python/AutoGen/DataPipe.py
create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
create mode 100644 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
create mode 100644 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 792beed65e6b..d9ee699d8f30 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -10,226 +10,11 @@
## Import Modules
#
from __future__ import print_function
from __future__ import absolute_import
-import Common.LongFilePathOs as os
-import re
-import os.path as path
-import copy
-import uuid
-
-from . import GenC
-from . import GenMake
-from . import GenDepex
-from io import BytesIO
-
-from .StrGather import *
-from .BuildEngine import BuildRuleObj as BuildRule
-from .BuildEngine import gDefaultBuildRuleFile,AutoGenReqBuildRuleVerNum
-import shutil
-from Common.LongFilePathSupport import CopyLongFilePath
-from Common.BuildToolError import *
-from Common.DataType import *
-from Common.Misc import *
-from Common.StringUtils import *
-import Common.GlobalData as GlobalData
-from GenFds.FdfParser import *
-from CommonDataClass.CommonClass import SkuInfoClass
-from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile
-import Common.VpdInfoFile as VpdInfoFile
-from .GenPcdDb import CreatePcdDatabaseCode
-from Workspace.MetaFileCommentParser import UsageList
-from Workspace.WorkspaceCommon import GetModuleLibInstances
-from Common.MultipleWorkspace import MultipleWorkspace as mws
-from . import InfSectionParser
-import datetime
-import hashlib
-from .GenVar import VariableMgr, var_info
-from collections import OrderedDict
-from collections import defaultdict
-from Workspace.WorkspaceCommon import OrderedListDict
-from Common.ToolDefClassObject import gDefaultToolsDefFile
-
-from Common.caching import cached_property, cached_class_function
-
-## Regular expression for splitting Dependency Expression string into tokens
-gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
-
-## Regular expression for match: PCD(xxxx.yyy)
-gPCDAsGuidPattern = re.compile(r"^PCD\(.+\..+\)$")
-
-#
-# Regular expression for finding Include Directories, the difference between MSFT and INTEL/GCC/RVCT
-# is the former use /I , the Latter used -I to specify include directories
-#
-gBuildOptIncludePatternMsft = re.compile(r"(?:.*?)/I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
-gBuildOptIncludePatternOther = re.compile(r"(?:.*?)-I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
-
-#
-# Match name = variable
-#
-gEfiVarStoreNamePattern = re.compile("\s*name\s*=\s*(\w+)")
-#
-# The format of guid in efivarstore statement likes following and must be correct:
-# guid = {0xA04A27f4, 0xDF00, 0x4D42, {0xB5, 0x52, 0x39, 0x51, 0x13, 0x02, 0x11, 0x3D}}
-#
-gEfiVarStoreGuidPattern = re.compile("\s*guid\s*=\s*({.*?{.*?}\s*})")
-
-## Mapping Makefile type
-gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
-
-
-## default file name for AutoGen
-gAutoGenCodeFileName = "AutoGen.c"
-gAutoGenHeaderFileName = "AutoGen.h"
-gAutoGenStringFileName = "%(module_name)sStrDefs.h"
-gAutoGenStringFormFileName = "%(module_name)sStrDefs.hpk"
-gAutoGenDepexFileName = "%(module_name)s.depex"
-gAutoGenImageDefFileName = "%(module_name)sImgDefs.h"
-gAutoGenIdfFileName = "%(module_name)sIdf.hpk"
-gInfSpecVersion = "0x00010017"
-
-#
-# Template string to generic AsBuilt INF
-#
-gAsBuiltInfHeaderString = TemplateString("""${header_comments}
-
-# DO NOT EDIT
-# FILE auto-generated
-
-[Defines]
- INF_VERSION = ${module_inf_version}
- BASE_NAME = ${module_name}
- FILE_GUID = ${module_guid}
- MODULE_TYPE = ${module_module_type}${BEGIN}
- VERSION_STRING = ${module_version_string}${END}${BEGIN}
- PCD_IS_DRIVER = ${pcd_is_driver_string}${END}${BEGIN}
- UEFI_SPECIFICATION_VERSION = ${module_uefi_specification_version}${END}${BEGIN}
- PI_SPECIFICATION_VERSION = ${module_pi_specification_version}${END}${BEGIN}
- ENTRY_POINT = ${module_entry_point}${END}${BEGIN}
- UNLOAD_IMAGE = ${module_unload_image}${END}${BEGIN}
- CONSTRUCTOR = ${module_constructor}${END}${BEGIN}
- DESTRUCTOR = ${module_destructor}${END}${BEGIN}
- SHADOW = ${module_shadow}${END}${BEGIN}
- PCI_VENDOR_ID = ${module_pci_vendor_id}${END}${BEGIN}
- PCI_DEVICE_ID = ${module_pci_device_id}${END}${BEGIN}
- PCI_CLASS_CODE = ${module_pci_class_code}${END}${BEGIN}
- PCI_REVISION = ${module_pci_revision}${END}${BEGIN}
- BUILD_NUMBER = ${module_build_number}${END}${BEGIN}
- SPEC = ${module_spec}${END}${BEGIN}
- UEFI_HII_RESOURCE_SECTION = ${module_uefi_hii_resource_section}${END}${BEGIN}
- MODULE_UNI_FILE = ${module_uni_file}${END}
-
-[Packages.${module_arch}]${BEGIN}
- ${package_item}${END}
-
-[Binaries.${module_arch}]${BEGIN}
- ${binary_item}${END}
-
-[PatchPcd.${module_arch}]${BEGIN}
- ${patchablepcd_item}
-${END}
-
-[Protocols.${module_arch}]${BEGIN}
- ${protocol_item}
-${END}
-
-[Ppis.${module_arch}]${BEGIN}
- ${ppi_item}
-${END}
-
-[Guids.${module_arch}]${BEGIN}
- ${guid_item}
-${END}
-
-[PcdEx.${module_arch}]${BEGIN}
- ${pcd_item}
-${END}
-
-[LibraryClasses.${module_arch}]
-## @LIB_INSTANCES${BEGIN}
-# ${libraryclasses_item}${END}
-
-${depexsection_item}
-
-${userextension_tianocore_item}
-
-${tail_comments}
-
-[BuildOptions.${module_arch}]
-## @AsBuilt${BEGIN}
-## ${flags_item}${END}
-""")
-## Split command line option string to list
-#
-# subprocess.Popen needs the args to be a sequence. Otherwise there's problem
-# in non-windows platform to launch command
-#
-def _SplitOption(OptionString):
- OptionList = []
- LastChar = " "
- OptionStart = 0
- QuotationMark = ""
- for Index in range(0, len(OptionString)):
- CurrentChar = OptionString[Index]
- if CurrentChar in ['"', "'"]:
- if QuotationMark == CurrentChar:
- QuotationMark = ""
- elif QuotationMark == "":
- QuotationMark = CurrentChar
- continue
- elif QuotationMark:
- continue
-
- if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r", "\n"]:
- if Index > OptionStart:
- OptionList.append(OptionString[OptionStart:Index - 1])
- OptionStart = Index
- LastChar = CurrentChar
- OptionList.append(OptionString[OptionStart:])
- return OptionList
-
-#
-# Convert string to C format array
-#
-def _ConvertStringToByteArray(Value):
- Value = Value.strip()
- if not Value:
- return None
- if Value[0] == '{':
- if not Value.endswith('}'):
- return None
- Value = Value.replace(' ', '').replace('{', '').replace('}', '')
- ValFields = Value.split(',')
- try:
- for Index in range(len(ValFields)):
- ValFields[Index] = str(int(ValFields[Index], 0))
- except ValueError:
- return None
- Value = '{' + ','.join(ValFields) + '}'
- return Value
-
- Unicode = False
- if Value.startswith('L"'):
- if not Value.endswith('"'):
- return None
- Value = Value[1:]
- Unicode = True
- elif not Value.startswith('"') or not Value.endswith('"'):
- return None
-
- Value = eval(Value) # translate escape character
- NewValue = '{'
- for Index in range(0, len(Value)):
- if Unicode:
- NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ','
- else:
- NewValue = NewValue + str(ord(Value[Index]) % 0x100) + ','
- Value = NewValue + '0}'
- return Value
-
+from Common.DataType import TAB_STAR
## Base class for AutoGen
#
# This class just implements the cache mechanism of AutoGen objects.
#
class AutoGen(object):
@@ -246,10 +31,11 @@ class AutoGen(object):
# @param Toolchain Tool chain name
# @param Arch Target arch
# @param *args The specific class related parameters
# @param **kwargs The specific class related dict parameters
#
+
def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
# check if the object has been created
Key = (Target, Toolchain, Arch, MetaFile)
if Key in cls.__ObjectCache:
# if it exists, just return it directly
@@ -279,4008 +65,49 @@ class AutoGen(object):
## "==" operator
def __eq__(self, Other):
return Other and self.MetaFile == Other
-## Workspace AutoGen class
-#
-# This class is used mainly to control the whole platform build for different
-# architecture. This class will generate top level makefile.
-#
-class WorkspaceAutoGen(AutoGen):
- # call super().__init__ then call the worker function with different parameter count
- def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
- if not hasattr(self, "_Init"):
- self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
- self._Init = True
-
- ## Initialize WorkspaceAutoGen
- #
- # @param WorkspaceDir Root directory of workspace
- # @param ActivePlatform Meta-file of active platform
- # @param Target Build target
- # @param Toolchain Tool chain name
- # @param ArchList List of architecture of current build
- # @param MetaFileDb Database containing meta-files
- # @param BuildConfig Configuration of build
- # @param ToolDefinition Tool chain definitions
- # @param FlashDefinitionFile File of flash definition
- # @param Fds FD list to be generated
- # @param Fvs FV list to be generated
- # @param Caps Capsule list to be generated
- # @param SkuId SKU id from command line
- #
- def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolchain, ArchList, MetaFileDb,
- BuildConfig, ToolDefinition, FlashDefinitionFile='', Fds=None, Fvs=None, Caps=None, SkuId='', UniFlag=None,
- Progress=None, BuildModule=None):
- self.BuildDatabase = MetaFileDb
- self.MetaFile = ActivePlatform
- self.WorkspaceDir = WorkspaceDir
- self.Platform = self.BuildDatabase[self.MetaFile, TAB_ARCH_COMMON, Target, Toolchain]
- GlobalData.gActivePlatform = self.Platform
- self.BuildTarget = Target
- self.ToolChain = Toolchain
- self.ArchList = ArchList
- self.SkuId = SkuId
- self.UniFlag = UniFlag
-
- self.TargetTxt = BuildConfig
- self.ToolDef = ToolDefinition
- self.FdfFile = FlashDefinitionFile
- self.FdTargetList = Fds if Fds else []
- self.FvTargetList = Fvs if Fvs else []
- self.CapTargetList = Caps if Caps else []
- self.AutoGenObjectList = []
- self._GuidDict = {}
-
- # there's many relative directory operations, so ...
- os.chdir(self.WorkspaceDir)
-
- self.MergeArch()
- self.ValidateBuildTarget()
-
- EdkLogger.info("")
- if self.ArchList:
- EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
- EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
- EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
-
- EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
- if BuildModule:
- EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
-
- if self.FdfFile:
- EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
-
- EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
-
- if Progress:
- Progress.Start("\nProcessing meta-data")
- #
- # Mark now build in AutoGen Phase
- #
- GlobalData.gAutoGenPhase = True
- self.ProcessModuleFromPdf()
- self.ProcessPcdType()
- self.ProcessMixedPcd()
- self.GetPcdsFromFDF()
- self.CollectAllPcds()
- self.GeneratePkgLevelHash()
- #
- # Check PCDs token value conflict in each DEC file.
- #
- self._CheckAllPcdsTokenValueConflict()
- #
- # Check PCD type and definition between DSC and DEC
- #
- self._CheckPcdDefineAndType()
-
- self.CreateBuildOptionsFile()
- self.CreatePcdTokenNumberFile()
- self.CreateModuleHashInfo()
- GlobalData.gAutoGenPhase = False
-
- #
- # Merge Arch
- #
- def MergeArch(self):
- if not self.ArchList:
- ArchList = set(self.Platform.SupArchList)
- else:
- ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
- if not ArchList:
- EdkLogger.error("build", PARAMETER_INVALID,
- ExtraData = "Invalid ARCH specified. [Valid ARCH: %s]" % (" ".join(self.Platform.SupArchList)))
- elif self.ArchList and len(ArchList) != len(self.ArchList):
- SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
- EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
- % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
- self.ArchList = tuple(ArchList)
-
- # Validate build target
- def ValidateBuildTarget(self):
- if self.BuildTarget not in self.Platform.BuildTargets:
- EdkLogger.error("build", PARAMETER_INVALID,
- ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
- % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
- @cached_property
- def FdfProfile(self):
- if not self.FdfFile:
- self.FdfFile = self.Platform.FlashDefinition
-
- FdfProfile = None
- if self.FdfFile:
- Fdf = FdfParser(self.FdfFile.Path)
- Fdf.ParseFile()
- GlobalData.gFdfParser = Fdf
- if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
- FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
- for FdRegion in FdDict.RegionList:
- if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
- if int(FdRegion.Offset) % 8 != 0:
- EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
- FdfProfile = Fdf.Profile
- else:
- if self.FdTargetList:
- EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
- self.FdTargetList = []
- if self.FvTargetList:
- EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
- self.FvTargetList = []
- if self.CapTargetList:
- EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
- self.CapTargetList = []
-
- return FdfProfile
-
- def ProcessModuleFromPdf(self):
-
- if self.FdfProfile:
- for fvname in self.FvTargetList:
- if fvname.upper() not in self.FdfProfile.FvDict:
- EdkLogger.error("build", OPTION_VALUE_INVALID,
- "No such an FV in FDF file: %s" % fvname)
-
- # In DSC file may use FILE_GUID to override the module, then in the Platform.Modules use FILE_GUIDmodule.inf as key,
- # but the path (self.MetaFile.Path) is the real path
- for key in self.FdfProfile.InfDict:
- if key == 'ArchTBD':
- MetaFile_cache = defaultdict(set)
- for Arch in self.ArchList:
- Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
- for Pkey in Current_Platform_cache.Modules:
- MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
- for Inf in self.FdfProfile.InfDict[key]:
- ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
- for Arch in self.ArchList:
- if ModuleFile in MetaFile_cache[Arch]:
- break
- else:
- ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
- if not ModuleData.IsBinaryModule:
- EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
-
- else:
- for Arch in self.ArchList:
- if Arch == key:
- Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
- MetaFileList = set()
- for Pkey in Platform.Modules:
- MetaFileList.add(Platform.Modules[Pkey].MetaFile)
- for Inf in self.FdfProfile.InfDict[key]:
- ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
- if ModuleFile in MetaFileList:
- continue
- ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
- if not ModuleData.IsBinaryModule:
- EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
-
-
-
- # parse FDF file to get PCDs in it, if any
- def GetPcdsFromFDF(self):
-
- if self.FdfProfile:
- PcdSet = self.FdfProfile.PcdDict
- # handle the mixed pcd in FDF file
- for key in PcdSet:
- if key in GlobalData.MixedPcd:
- Value = PcdSet[key]
- del PcdSet[key]
- for item in GlobalData.MixedPcd[key]:
- PcdSet[item] = Value
- self.VerifyPcdDeclearation(PcdSet)
-
- def ProcessPcdType(self):
- for Arch in self.ArchList:
- Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
-            Platform.Pcds    # touch the property to force PCD evaluation and caching
- # generate the SourcePcdDict and BinaryPcdDict
- PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
- for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
- if BuildData.Arch != Arch:
- continue
- if BuildData.MetaFile.Ext == '.inf':
- for key in BuildData.Pcds:
- if BuildData.Pcds[key].Pending:
- if key in Platform.Pcds:
- PcdInPlatform = Platform.Pcds[key]
- if PcdInPlatform.Type:
- BuildData.Pcds[key].Type = PcdInPlatform.Type
- BuildData.Pcds[key].Pending = False
-
- if BuildData.MetaFile in Platform.Modules:
- PlatformModule = Platform.Modules[str(BuildData.MetaFile)]
- if key in PlatformModule.Pcds:
- PcdInPlatform = PlatformModule.Pcds[key]
- if PcdInPlatform.Type:
- BuildData.Pcds[key].Type = PcdInPlatform.Type
- BuildData.Pcds[key].Pending = False
- else:
-                        # PCD used in a library: if its type is pending, take the type from a referencing module
- if BuildData.Pcds[key].Pending:
- MGen = ModuleAutoGen(self, BuildData.MetaFile, self.BuildTarget, self.ToolChain, Arch, self.MetaFile)
- if MGen and MGen.IsLibrary:
- if MGen in PGen.LibraryAutoGenList:
- ReferenceModules = MGen.ReferenceModules
- for ReferenceModule in ReferenceModules:
- if ReferenceModule.MetaFile in Platform.Modules:
- RefPlatformModule = Platform.Modules[str(ReferenceModule.MetaFile)]
- if key in RefPlatformModule.Pcds:
- PcdInReferenceModule = RefPlatformModule.Pcds[key]
- if PcdInReferenceModule.Type:
- BuildData.Pcds[key].Type = PcdInReferenceModule.Type
- BuildData.Pcds[key].Pending = False
- break
-
- def ProcessMixedPcd(self):
- for Arch in self.ArchList:
- SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
- BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
- SourcePcdDict_Keys = SourcePcdDict.keys()
- BinaryPcdDict_Keys = BinaryPcdDict.keys()
-
- # generate the SourcePcdDict and BinaryPcdDict
- PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
- for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
- if BuildData.Arch != Arch:
- continue
- if BuildData.MetaFile.Ext == '.inf':
- for key in BuildData.Pcds:
- if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
- if BuildData.IsBinaryModule:
- BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
- else:
- SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
-
- elif TAB_PCDS_PATCHABLE_IN_MODULE in BuildData.Pcds[key].Type:
- if BuildData.MetaFile.Ext == '.inf':
- if BuildData.IsBinaryModule:
- BinaryPcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
- else:
- SourcePcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
-
- elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
- SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
- elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
- SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
-
- #
- # A PCD can only use one type for all source modules
- #
- for i in SourcePcdDict_Keys:
- for j in SourcePcdDict_Keys:
- if i != j:
- Intersections = SourcePcdDict[i].intersection(SourcePcdDict[j])
- if len(Intersections) > 0:
- EdkLogger.error(
- 'build',
- FORMAT_INVALID,
-                                "Building modules from source INFs, the following PCDs use both the %s and %s access methods. Each PCD must use only one access method." % (i, j),
- ExtraData='\n\t'.join(str(P[1]+'.'+P[0]) for P in Intersections)
- )
-
- #
-            # Intersect the binary PCDs of different access methods to find mixed PCDs
- #
- for i in BinaryPcdDict_Keys:
- for j in BinaryPcdDict_Keys:
- if i != j:
- Intersections = BinaryPcdDict[i].intersection(BinaryPcdDict[j])
- for item in Intersections:
- NewPcd1 = (item[0] + '_' + i, item[1])
- NewPcd2 = (item[0] + '_' + j, item[1])
- if item not in GlobalData.MixedPcd:
- GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
- else:
- if NewPcd1 not in GlobalData.MixedPcd[item]:
- GlobalData.MixedPcd[item].append(NewPcd1)
- if NewPcd2 not in GlobalData.MixedPcd[item]:
- GlobalData.MixedPcd[item].append(NewPcd2)
-
- #
-            # Intersect the source PCDs with the binary PCDs to find mixed PCDs
- #
- for i in SourcePcdDict_Keys:
- for j in BinaryPcdDict_Keys:
- if i != j:
- Intersections = SourcePcdDict[i].intersection(BinaryPcdDict[j])
- for item in Intersections:
- NewPcd1 = (item[0] + '_' + i, item[1])
- NewPcd2 = (item[0] + '_' + j, item[1])
- if item not in GlobalData.MixedPcd:
- GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
- else:
- if NewPcd1 not in GlobalData.MixedPcd[item]:
- GlobalData.MixedPcd[item].append(NewPcd1)
- if NewPcd2 not in GlobalData.MixedPcd[item]:
- GlobalData.MixedPcd[item].append(NewPcd2)
-
- for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
- if BuildData.Arch != Arch:
- continue
- for key in BuildData.Pcds:
- for SinglePcd in GlobalData.MixedPcd:
- if (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
- for item in GlobalData.MixedPcd[SinglePcd]:
- Pcd_Type = item[0].split('_')[-1]
- if (Pcd_Type == BuildData.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
- (Pcd_Type == TAB_PCDS_DYNAMIC and BuildData.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET):
- Value = BuildData.Pcds[key]
- Value.TokenCName = BuildData.Pcds[key].TokenCName + '_' + Pcd_Type
- if len(key) == 2:
- newkey = (Value.TokenCName, key[1])
- elif len(key) == 3:
- newkey = (Value.TokenCName, key[1], key[2])
- del BuildData.Pcds[key]
- BuildData.Pcds[newkey] = Value
- break
- break
-
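The intersection logic in ProcessMixedPcd can be condensed into a small sketch: a PCD that appears under two different access methods is "mixed" and gets one renamed alias per method, mirroring what the code stores in GlobalData.MixedPcd. The helper name `find_mixed_pcds` and the set-of-tuples input shape are illustrative assumptions.

```python
def find_mixed_pcds(pcds_by_type):
    """pcds_by_type: {access_method: {(TokenCName, TokenSpaceGuid), ...}}.
    Any (name, guid) present under two methods is mixed; record one
    '<name>_<method>' alias per access method it appears under."""
    mixed = {}
    types = list(pcds_by_type)
    for i in types:
        for j in types:
            if i == j:
                continue
            for name, guid in pcds_by_type[i] & pcds_by_type[j]:
                aliases = mixed.setdefault((name, guid), [])
                for t in (i, j):
                    alias = (name + '_' + t, guid)
                    if alias not in aliases:
                        aliases.append(alias)
    return mixed
```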
- #Collect package set information from INF of FDF
- @cached_property
- def PkgSet(self):
- if not self.FdfFile:
- self.FdfFile = self.Platform.FlashDefinition
-
- if self.FdfFile:
- ModuleList = self.FdfProfile.InfList
- else:
- ModuleList = []
- Pkgs = {}
- for Arch in self.ArchList:
- Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
- PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
- PkgSet = set()
- for Inf in ModuleList:
- ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
- if ModuleFile in Platform.Modules:
- continue
- ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
- PkgSet.update(ModuleData.Packages)
- Pkgs[Arch] = list(PkgSet) + list(PGen.PackageList)
- return Pkgs
-
-    def VerifyPcdDeclearation(self, PcdSet):
- for Arch in self.ArchList:
- Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
- Pkgs = self.PkgSet[Arch]
- DecPcds = set()
- DecPcdsKey = set()
- for Pkg in Pkgs:
- for Pcd in Pkg.Pcds:
- DecPcds.add((Pcd[0], Pcd[1]))
- DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2]))
-
- Platform.SkuName = self.SkuId
- for Name, Guid,Fileds in PcdSet:
- if (Name, Guid) not in DecPcds:
- EdkLogger.error(
- 'build',
- PARSER_ERROR,
- "PCD (%s.%s) used in FDF is not declared in DEC files." % (Guid, Name),
- File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
- Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
- )
- else:
-                # Check whether a Dynamic or DynamicEx PCD is used in the FDF file. If so, break the build with an error.
- if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPcdsKey \
- or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE) in DecPcdsKey \
- or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in DecPcdsKey:
- continue
- elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey:
- EdkLogger.error(
- 'build',
- PARSER_ERROR,
- "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
- File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
- Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
- )
- def CollectAllPcds(self):
-
- for Arch in self.ArchList:
- Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
- #
- # Explicitly collect platform's dynamic PCDs
- #
- Pa.CollectPlatformDynamicPcds()
- Pa.CollectFixedAtBuildPcds()
- self.AutoGenObjectList.append(Pa)
-
- #
- # Generate Package level hash value
- #
- def GeneratePkgLevelHash(self):
- for Arch in self.ArchList:
- GlobalData.gPackageHash = {}
- if GlobalData.gUseHashCache:
- for Pkg in self.PkgSet[Arch]:
- self._GenPkgLevelHash(Pkg)
-
-
- def CreateBuildOptionsFile(self):
- #
- # Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
- #
- content = 'gCommandLineDefines: '
- content += str(GlobalData.gCommandLineDefines)
- content += TAB_LINE_BREAK
- content += 'BuildOptionPcd: '
- content += str(GlobalData.BuildOptionPcd)
- content += TAB_LINE_BREAK
- content += 'Active Platform: '
- content += str(self.Platform)
- content += TAB_LINE_BREAK
- if self.FdfFile:
- content += 'Flash Image Definition: '
- content += str(self.FdfFile)
- content += TAB_LINE_BREAK
- SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
-
- def CreatePcdTokenNumberFile(self):
- #
- # Create PcdToken Number file for Dynamic/DynamicEx Pcd.
- #
- PcdTokenNumber = 'PcdTokenNumber: '
- for Arch in self.ArchList:
- Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
- if Pa.PcdTokenNumber:
- if Pa.DynamicPcdList:
- for Pcd in Pa.DynamicPcdList:
- PcdTokenNumber += TAB_LINE_BREAK
- PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
- PcdTokenNumber += ' : '
- PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
- SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
-
- def CreateModuleHashInfo(self):
- #
- # Get set of workspace metafiles
- #
- AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
-
- #
- # Retrieve latest modified time of all metafiles
- #
- SrcTimeStamp = 0
- for f in AllWorkSpaceMetaFiles:
- if os.stat(f)[8] > SrcTimeStamp:
- SrcTimeStamp = os.stat(f)[8]
- self._SrcTimeStamp = SrcTimeStamp
-
- if GlobalData.gUseHashCache:
- m = hashlib.md5()
- for files in AllWorkSpaceMetaFiles:
- if files.endswith('.dec'):
- continue
- f = open(files, 'rb')
- Content = f.read()
- f.close()
- m.update(Content)
- SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), False)
- GlobalData.gPlatformHash = m.hexdigest()
-
- #
- # Write metafile list to build directory
- #
- AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
- if os.path.exists (AutoGenFilePath):
- os.remove(AutoGenFilePath)
- if not os.path.exists(self.BuildDir):
- os.makedirs(self.BuildDir)
- with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
- for f in AllWorkSpaceMetaFiles:
- print(f, file=file)
- return True
-
- def _GenPkgLevelHash(self, Pkg):
- if Pkg.PackageName in GlobalData.gPackageHash:
- return
-
- PkgDir = os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageName)
- CreateDirectory(PkgDir)
- HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
- m = hashlib.md5()
- # Get .dec file's hash value
- f = open(Pkg.MetaFile.Path, 'rb')
- Content = f.read()
- f.close()
- m.update(Content)
- # Get include files hash value
- if Pkg.Includes:
- for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
- for Root, Dirs, Files in os.walk(str(inc)):
- for File in sorted(Files):
- File_Path = os.path.join(Root, File)
- f = open(File_Path, 'rb')
- Content = f.read()
- f.close()
- m.update(Content)
- SaveFileOnChange(HashFile, m.hexdigest(), False)
- GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
-
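The package-level hash in _GenPkgLevelHash can be summarized as an md5 over the .dec file followed by every header under the package's include directories, walked in sorted order so the digest is deterministic across runs. This standalone sketch (the function name `package_hash` is hypothetical) captures that ordering contract:

```python
import hashlib
import os

def package_hash(dec_path, include_dirs):
    """md5 of the .dec file content, then every file under each include
    directory, visiting directories and files in sorted order so the
    resulting digest is stable between builds."""
    m = hashlib.md5()
    with open(dec_path, 'rb') as f:
        m.update(f.read())
    for inc in sorted(include_dirs, key=str):
        for root, _dirs, files in os.walk(str(inc)):
            for name in sorted(files):
                with open(os.path.join(root, name), 'rb') as f:
                    m.update(f.read())
    return m.hexdigest()
```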
- def _GetMetaFiles(self, Target, Toolchain):
- AllWorkSpaceMetaFiles = set()
- #
- # add fdf
- #
- if self.FdfFile:
- AllWorkSpaceMetaFiles.add (self.FdfFile.Path)
- for f in GlobalData.gFdfParser.GetAllIncludedFile():
- AllWorkSpaceMetaFiles.add (f.FileName)
- #
- # add dsc
- #
- AllWorkSpaceMetaFiles.add(self.MetaFile.Path)
-
- #
- # add build_rule.txt & tools_def.txt
- #
- AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultBuildRuleFile))
- AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultToolsDefFile))
-
-        #
-        # add BuildOption metafile
-        #
- AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'BuildOptions'))
-
-        #
-        # add PcdToken Number file for Dynamic/DynamicEx Pcd
-        #
- AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTokenNumber'))
-
- for Pa in self.AutoGenObjectList:
- AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile)
-
- for Arch in self.ArchList:
- #
- # add dec
- #
- for Package in PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch).PackageList:
- AllWorkSpaceMetaFiles.add(Package.MetaFile.Path)
-
- #
- # add included dsc
- #
- for filePath in self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]._RawData.IncludedFiles:
- AllWorkSpaceMetaFiles.add(filePath.Path)
-
- return AllWorkSpaceMetaFiles
-
- def _CheckPcdDefineAndType(self):
- PcdTypeSet = {TAB_PCDS_FIXED_AT_BUILD,
- TAB_PCDS_PATCHABLE_IN_MODULE,
- TAB_PCDS_FEATURE_FLAG,
- TAB_PCDS_DYNAMIC,
- TAB_PCDS_DYNAMIC_EX}
-
-        # This dict stores PCDs that are not used by any module for the specified arches
- UnusedPcd = OrderedDict()
- for Pa in self.AutoGenObjectList:
- # Key of DSC's Pcds dictionary is PcdCName, TokenSpaceGuid
- for Pcd in Pa.Platform.Pcds:
- PcdType = Pa.Platform.Pcds[Pcd].Type
-
- # If no PCD type, this PCD comes from FDF
- if not PcdType:
- continue
-
- # Try to remove Hii and Vpd suffix
- if PcdType.startswith(TAB_PCDS_DYNAMIC_EX):
- PcdType = TAB_PCDS_DYNAMIC_EX
- elif PcdType.startswith(TAB_PCDS_DYNAMIC):
- PcdType = TAB_PCDS_DYNAMIC
-
- for Package in Pa.PackageList:
- # Key of DEC's Pcds dictionary is PcdCName, TokenSpaceGuid, PcdType
- if (Pcd[0], Pcd[1], PcdType) in Package.Pcds:
- break
- for Type in PcdTypeSet:
- if (Pcd[0], Pcd[1], Type) in Package.Pcds:
- EdkLogger.error(
- 'build',
- FORMAT_INVALID,
- "Type [%s] of PCD [%s.%s] in DSC file doesn't match the type [%s] defined in DEC file." \
- % (Pa.Platform.Pcds[Pcd].Type, Pcd[1], Pcd[0], Type),
- ExtraData=None
- )
- return
- else:
- UnusedPcd.setdefault(Pcd, []).append(Pa.Arch)
-
- for Pcd in UnusedPcd:
- EdkLogger.warn(
- 'build',
- "The PCD was not specified by any INF module in the platform for the given architecture.\n"
- "\tPCD: [%s.%s]\n\tPlatform: [%s]\n\tArch: %s"
- % (Pcd[1], Pcd[0], os.path.basename(str(self.MetaFile)), str(UnusedPcd[Pcd])),
- ExtraData=None
- )
-
- def __repr__(self):
- return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList))
-
- ## Return the directory to store FV files
- @cached_property
- def FvDir(self):
- return path.join(self.BuildDir, TAB_FV_DIRECTORY)
-
- ## Return the directory to store all intermediate and final files built
- @cached_property
- def BuildDir(self):
- return self.AutoGenObjectList[0].BuildDir
-
- ## Return the build output directory platform specifies
- @cached_property
- def OutputDir(self):
- return self.Platform.OutputDirectory
-
- ## Return platform name
- @cached_property
- def Name(self):
- return self.Platform.PlatformName
-
- ## Return meta-file GUID
- @cached_property
- def Guid(self):
- return self.Platform.Guid
-
- ## Return platform version
- @cached_property
- def Version(self):
- return self.Platform.Version
-
- ## Return paths of tools
- @cached_property
- def ToolDefinition(self):
- return self.AutoGenObjectList[0].ToolDefinition
-
- ## Return directory of platform makefile
- #
- # @retval string Makefile directory
- #
- @cached_property
- def MakeFileDir(self):
- return self.BuildDir
-
- ## Return build command string
- #
- # @retval string Build command string
- #
- @cached_property
- def BuildCommand(self):
- # BuildCommand should be all the same. So just get one from platform AutoGen
- return self.AutoGenObjectList[0].BuildCommand
-
- ## Check the PCDs token value conflict in each DEC file.
- #
-    # Breaks the build with an error message when two PCDs conflict.
- #
- # @return None
- #
- def _CheckAllPcdsTokenValueConflict(self):
- for Pa in self.AutoGenObjectList:
- for Package in Pa.PackageList:
- PcdList = list(Package.Pcds.values())
- PcdList.sort(key=lambda x: int(x.TokenValue, 0))
- Count = 0
- while (Count < len(PcdList) - 1) :
- Item = PcdList[Count]
- ItemNext = PcdList[Count + 1]
- #
-                    # Make sure the TokenValue is unique within the same token space
- #
- if (int(Item.TokenValue, 0) == int(ItemNext.TokenValue, 0)):
- SameTokenValuePcdList = []
- SameTokenValuePcdList.append(Item)
- SameTokenValuePcdList.append(ItemNext)
- RemainPcdListLength = len(PcdList) - Count - 2
- for ValueSameCount in range(RemainPcdListLength):
- if int(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount].TokenValue, 0) == int(Item.TokenValue, 0):
- SameTokenValuePcdList.append(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount])
- else:
-                                break
- #
- # Sort same token value PCD list with TokenGuid and TokenCName
- #
- SameTokenValuePcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
- SameTokenValuePcdListCount = 0
- while (SameTokenValuePcdListCount < len(SameTokenValuePcdList) - 1):
- Flag = False
- TemListItem = SameTokenValuePcdList[SameTokenValuePcdListCount]
- TemListItemNext = SameTokenValuePcdList[SameTokenValuePcdListCount + 1]
-
- if (TemListItem.TokenSpaceGuidCName == TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName != TemListItemNext.TokenCName):
- for PcdItem in GlobalData.MixedPcd:
- if (TemListItem.TokenCName, TemListItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \
- (TemListItemNext.TokenCName, TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
- Flag = True
- if not Flag:
- EdkLogger.error(
- 'build',
- FORMAT_INVALID,
-                                        "The TokenValue [%s] of PCD [%s.%s] conflicts with [%s.%s] in %s"\
- % (TemListItem.TokenValue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemNext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package),
- ExtraData=None
- )
- SameTokenValuePcdListCount += 1
- Count += SameTokenValuePcdListCount
- Count += 1
-
- PcdList = list(Package.Pcds.values())
- PcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
- Count = 0
- while (Count < len(PcdList) - 1) :
- Item = PcdList[Count]
- ItemNext = PcdList[Count + 1]
- #
-                # Check that PCDs with the same TokenSpaceGuidCName.TokenCName also have the same token value.
- #
- if (Item.TokenSpaceGuidCName == ItemNext.TokenSpaceGuidCName) and (Item.TokenCName == ItemNext.TokenCName) and (int(Item.TokenValue, 0) != int(ItemNext.TokenValue, 0)):
- EdkLogger.error(
- 'build',
- FORMAT_INVALID,
-                        "The TokenValue [%s] of PCD [%s.%s] in %s is defined in two places and must be the same."\
- % (Item.TokenValue, Item.TokenSpaceGuidCName, Item.TokenCName, Package),
- ExtraData=None
- )
- Count += 1
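The essence of _CheckAllPcdsTokenValueConflict above is: within one token space, two differently named PCDs must not share a token value, while the same PCD declared twice must keep one value. A minimal sketch of the first check (the helper name `check_token_value_conflicts` and the tuple-list input are illustrative assumptions, not the BaseTools data model):

```python
def check_token_value_conflicts(pcds):
    """pcds: list of (token_space_guid, token_cname, token_value_str).
    Return (guid, name1, name2) triples where two different names share a
    token value in the same token space. int(x, 0) accepts '0x..' and
    decimal, matching how the build tool normalizes TokenValue."""
    seen = {}
    conflicts = []
    for guid, name, value in pcds:
        key = (guid, int(value, 0))
        if key in seen and seen[key] != name:
            conflicts.append((guid, seen[key], name))
        else:
            seen.setdefault(key, name)
    return conflicts
```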
- ## Generate fds command
- @property
- def GenFdsCommand(self):
-        Makefile = GenMake.TopLevelMakefile(self)
-        return Makefile._TEMPLATE_.Replace(Makefile._TemplateDict).strip()
-
- @property
- def GenFdsCommandDict(self):
- FdsCommandDict = {}
- LogLevel = EdkLogger.GetLevel()
- if LogLevel == EdkLogger.VERBOSE:
- FdsCommandDict["verbose"] = True
- elif LogLevel <= EdkLogger.DEBUG_9:
- FdsCommandDict["debug"] = LogLevel - 1
- elif LogLevel == EdkLogger.QUIET:
- FdsCommandDict["quiet"] = True
-
- if GlobalData.gEnableGenfdsMultiThread:
- FdsCommandDict["GenfdsMultiThread"] = True
- if GlobalData.gIgnoreSource:
- FdsCommandDict["IgnoreSources"] = True
-
- FdsCommandDict["OptionPcd"] = []
- for pcd in GlobalData.BuildOptionPcd:
- if pcd[2]:
- pcdname = '.'.join(pcd[0:3])
- else:
- pcdname = '.'.join(pcd[0:2])
- if pcd[3].startswith('{'):
- FdsCommandDict["OptionPcd"].append(pcdname + '=' + 'H' + '"' + pcd[3] + '"')
- else:
- FdsCommandDict["OptionPcd"].append(pcdname + '=' + pcd[3])
-
- MacroList = []
- # macros passed to GenFds
- MacroDict = {}
- MacroDict.update(GlobalData.gGlobalDefines)
- MacroDict.update(GlobalData.gCommandLineDefines)
- for MacroName in MacroDict:
- if MacroDict[MacroName] != "":
- MacroList.append('"%s=%s"' % (MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
- else:
- MacroList.append('"%s"' % MacroName)
- FdsCommandDict["macro"] = MacroList
-
- FdsCommandDict["fdf_file"] = [self.FdfFile]
- FdsCommandDict["build_target"] = self.BuildTarget
- FdsCommandDict["toolchain_tag"] = self.ToolChain
- FdsCommandDict["active_platform"] = str(self)
-
- FdsCommandDict["conf_directory"] = GlobalData.gConfDirectory
- FdsCommandDict["build_architecture_list"] = ','.join(self.ArchList)
- FdsCommandDict["platform_build_directory"] = self.BuildDir
-
- FdsCommandDict["fd"] = self.FdTargetList
- FdsCommandDict["fv"] = self.FvTargetList
- FdsCommandDict["cap"] = self.CapTargetList
- return FdsCommandDict
-
- ## Create makefile for the platform and modules in it
- #
- # @param CreateDepsMakeFile Flag indicating if the makefile for
- # modules will be created as well
- #
- def CreateMakeFile(self, CreateDepsMakeFile=False):
- if not CreateDepsMakeFile:
- return
- for Pa in self.AutoGenObjectList:
- Pa.CreateMakeFile(True)
-
- ## Create autogen code for platform and modules
- #
- # Since there's no autogen code for platform, this method will do nothing
-    # if CreateDepsCodeFile is set to False.
- #
- # @param CreateDepsCodeFile Flag indicating if creating module's
- # autogen code file or not
- #
- def CreateCodeFile(self, CreateDepsCodeFile=False):
- if not CreateDepsCodeFile:
- return
- for Pa in self.AutoGenObjectList:
- Pa.CreateCodeFile(True)
-
- ## Create AsBuilt INF file the platform
- #
- def CreateAsBuiltInf(self):
- return
-
-
-## AutoGen class for platform
-#
-# The PlatformAutoGen class processes the original information in the platform
-# file in order to generate the makefile for the platform.
-#
-class PlatformAutoGen(AutoGen):
- # call super().__init__ then call the worker function with different parameter count
- def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
- if not hasattr(self, "_Init"):
- self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch)
- self._Init = True
- #
- # Used to store all PCDs for both PEI and DXE phase, in order to generate
- # correct PCD database
- #
- _DynaPcdList_ = []
- _NonDynaPcdList_ = []
- _PlatformPcds = {}
-
- #
- # The priority list while override build option
- #
- PrioList = {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
- "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
- "0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
- "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
- "0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
- "0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
- "0x10011" : 10, # TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
- "0x00011" : 9, # ******_*********_****_COMMANDTYPE_ATTRIBUTE
- "0x11101" : 8, # TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
- "0x01101" : 7, # ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
- "0x10101" : 6, # TARGET_*********_ARCH_***********_ATTRIBUTE
- "0x00101" : 5, # ******_*********_ARCH_***********_ATTRIBUTE
- "0x11001" : 4, # TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
- "0x01001" : 3, # ******_TOOLCHAIN_****_***********_ATTRIBUTE
- "0x10001" : 2, # TARGET_*********_****_***********_ATTRIBUTE
- "0x00001" : 1} # ******_*********_****_***********_ATTRIBUTE (Lowest)
-
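The PrioList table above encodes which fields of a `TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE` build-option key are concrete (`1`) versus wildcard `*` (`0`); the resulting five-bit mask indexes PrioList, and a larger mapped value wins the override. A hypothetical helper (not part of BaseTools) that builds such a mask could look like:

```python
def priority_key(target, toolchain, arch, cmdtype, attribute):
    """Build the '0xTTACА'-style mask used as a PrioList key: each of the
    five fields contributes '1' if it is a concrete value and '0' if it
    is the '*' wildcard."""
    bits = ''.join('0' if field == '*' else '1'
                   for field in (target, toolchain, arch, cmdtype, attribute))
    return '0x' + bits
```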
- ## Initialize PlatformAutoGen
- #
- #
- # @param Workspace WorkspaceAutoGen object
- # @param PlatformFile Platform file (DSC file)
- # @param Target Build target (DEBUG, RELEASE)
- # @param Toolchain Name of tool chain
-    # @param Arch Arch that the platform supports
- #
- def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain, Arch):
- EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s]" % (PlatformFile, Arch))
- GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (PlatformFile, Arch, Toolchain, Target)
-
- self.MetaFile = PlatformFile
- self.Workspace = Workspace
- self.WorkspaceDir = Workspace.WorkspaceDir
- self.ToolChain = Toolchain
- self.BuildTarget = Target
- self.Arch = Arch
- self.SourceDir = PlatformFile.SubDir
- self.FdTargetList = self.Workspace.FdTargetList
- self.FvTargetList = self.Workspace.FvTargetList
- # get the original module/package/platform objects
- self.BuildDatabase = Workspace.BuildDatabase
- self.DscBuildDataObj = Workspace.Platform
-
- # flag indicating if the makefile/C-code file has been created or not
- self.IsMakeFileCreated = False
-
- self._DynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
- self._NonDynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
-
- self._AsBuildInfList = []
- self._AsBuildModuleList = []
-
- self.VariableInfo = None
-
- if GlobalData.gFdfParser is not None:
- self._AsBuildInfList = GlobalData.gFdfParser.Profile.InfList
- for Inf in self._AsBuildInfList:
- InfClass = PathClass(NormPath(Inf), GlobalData.gWorkspace, self.Arch)
- M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
- if not M.IsBinaryModule:
- continue
- self._AsBuildModuleList.append(InfClass)
- # get library/modules for build
- self.LibraryBuildDirectoryList = []
- self.ModuleBuildDirectoryList = []
-
- return True
-
- ## hash() operator of PlatformAutoGen
- #
- # The platform file path and arch string will be used to represent
- # hash value of this object
- #
- # @retval int Hash value of the platform file path and arch
- #
- @cached_class_function
- def __hash__(self):
- return hash((self.MetaFile, self.Arch))
-
- @cached_class_function
- def __repr__(self):
- return "%s [%s]" % (self.MetaFile, self.Arch)
-
- ## Create autogen code for platform and modules
- #
- # Since there's no autogen code for platform, this method will do nothing
- # if CreateModuleCodeFile is set to False.
- #
- # @param CreateModuleCodeFile Flag indicating if creating module's
- # autogen code file or not
- #
- @cached_class_function
- def CreateCodeFile(self, CreateModuleCodeFile=False):
- # only module has code to be created, so do nothing if CreateModuleCodeFile is False
- if not CreateModuleCodeFile:
- return
-
- for Ma in self.ModuleAutoGenList:
- Ma.CreateCodeFile(True)
-
- ## Generate Fds Command
- @cached_property
- def GenFdsCommand(self):
- return self.Workspace.GenFdsCommand
-
- ## Create makefile for the platform and modules in it
- #
- # @param CreateModuleMakeFile Flag indicating if the makefile for
- # modules will be created as well
- #
- def CreateMakeFile(self, CreateModuleMakeFile=False, FfsCommand = {}):
- if CreateModuleMakeFile:
- for Ma in self._MaList:
- key = (Ma.MetaFile.File, self.Arch)
- if key in FfsCommand:
- Ma.CreateMakeFile(True, FfsCommand[key])
- else:
- Ma.CreateMakeFile(True)
-
- # no need to create makefile for the platform more than once
- if self.IsMakeFileCreated:
- return
-
- # create library/module build dirs for platform
- Makefile = GenMake.PlatformMakefile(self)
- self.LibraryBuildDirectoryList = Makefile.GetLibraryBuildDirectoryList()
- self.ModuleBuildDirectoryList = Makefile.GetModuleBuildDirectoryList()
-
- self.IsMakeFileCreated = True
-
- @property
- def AllPcdList(self):
- return self.DynamicPcdList + self.NonDynamicPcdList
- ## Deal with Shared FixedAtBuild Pcds
- #
- def CollectFixedAtBuildPcds(self):
- for LibAuto in self.LibraryAutoGenList:
- FixedAtBuildPcds = {}
- ShareFixedAtBuildPcdsSameValue = {}
- for Module in LibAuto.ReferenceModules:
- for Pcd in set(Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds):
- DefaultValue = Pcd.DefaultValue
-                    # Cover the case where a DSC component overrides the PCD value and the PCD is only used in one library
- if Pcd in Module.LibraryPcdList:
- Index = Module.LibraryPcdList.index(Pcd)
- DefaultValue = Module.LibraryPcdList[Index].DefaultValue
- key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
- if key not in FixedAtBuildPcds:
- ShareFixedAtBuildPcdsSameValue[key] = True
- FixedAtBuildPcds[key] = DefaultValue
- else:
- if FixedAtBuildPcds[key] != DefaultValue:
- ShareFixedAtBuildPcdsSameValue[key] = False
- for Pcd in LibAuto.FixedAtBuildPcds:
- key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
- if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
- continue
- else:
- DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
- if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
- continue
- if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
- LibAuto.ConstPcd[key] = FixedAtBuildPcds[key]
-
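CollectFixedAtBuildPcds above promotes a library's FixedAtBuild PCD to a constant only when every referencing module resolves it to the same value (the "shared fixed pcd between module and lib" fix called out in V4). A condensed sketch of that agreement check, with hypothetical names and plain-dict shapes:

```python
def collect_const_pcds(lib_defaults, ref_module_overrides):
    """lib_defaults: {pcd_name: default_value} for a library's FixedAtBuild PCDs.
    ref_module_overrides: {module_name: {pcd_name: overridden_value}} for each
    referencing module. A PCD becomes a library-level constant only when all
    referencing modules see one and the same value."""
    const = {}
    for name, default in lib_defaults.items():
        seen = {overrides.get(name, default)
                for overrides in ref_module_overrides.values()}
        if len(seen) <= 1:
            const[name] = seen.pop() if seen else default
    return const
```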
- def CollectVariables(self, DynamicPcdSet):
- VpdRegionSize = 0
- VpdRegionBase = 0
- if self.Workspace.FdfFile:
- FdDict = self.Workspace.FdfProfile.FdDict[GlobalData.gFdfParser.CurrentFdName]
- for FdRegion in FdDict.RegionList:
- for item in FdRegion.RegionDataList:
- if self.Platform.VpdToolGuid.strip() and self.Platform.VpdToolGuid in item:
- VpdRegionSize = FdRegion.Size
- VpdRegionBase = FdRegion.Offset
- break
-
- VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj.SkuIds)
- VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
- VariableInfo.SetVpdRegionOffset(VpdRegionBase)
- Index = 0
- for Pcd in DynamicPcdSet:
- pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
- for SkuName in Pcd.SkuInfoList:
- Sku = Pcd.SkuInfoList[SkuName]
- SkuId = Sku.SkuId
- if SkuId is None or SkuId == '':
- continue
- if len(Sku.VariableName) > 0:
- if Sku.VariableAttribute and 'NV' not in Sku.VariableAttribute:
- continue
- VariableGuidStructure = Sku.VariableGuidValue
- VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
- for StorageName in Sku.DefaultStoreDict:
- VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else StringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAttribute['DscPosition'], Pcd.CustomAttribute.get('IsStru',False)))
- Index += 1
- return VariableInfo
-
- def UpdateNVStoreMaxSize(self, OrgVpdFile):
- if self.VariableInfo:
- VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
- PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
-
- if PcdNvStoreDfBuffer:
- if os.path.exists(VpdMapFilePath):
- OrgVpdFile.Read(VpdMapFilePath)
- PcdItems = OrgVpdFile.GetOffset(PcdNvStoreDfBuffer[0])
- NvStoreOffset = list(PcdItems.values())[0].strip() if PcdItems else '0'
- else:
- EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
-
- NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
- default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
- maxsize = self.VariableInfo.VpdRegionSize - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
- var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
-
- if var_data and default_skuobj:
- default_skuobj.DefaultValue = var_data
- PcdNvStoreDfBuffer[0].DefaultValue = var_data
- PcdNvStoreDfBuffer[0].SkuInfoList.clear()
- PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = default_skuobj
- PcdNvStoreDfBuffer[0].MaxDatumSize = str(len(default_skuobj.DefaultValue.split(",")))
-
- return OrgVpdFile
-
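UpdateNVStoreMaxSize above parses the VPD offset read from the map file, accepting either a `0x`/`0X` hex literal or a plain decimal string. Isolated as a tiny helper (the name `parse_offset` is illustrative):

```python
def parse_offset(text):
    """Parse an offset string the way UpdateNVStoreMaxSize does:
    hex when prefixed with 0x/0X, decimal otherwise."""
    text = text.strip()
    return int(text, 16) if text.upper().startswith("0X") else int(text)
```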
- ## Collect dynamic PCDs
- #
- # Gather dynamic PCDs list from each module and their settings from platform
- # This interface should be invoked explicitly when platform action is created.
- #
- def CollectPlatformDynamicPcds(self):
- for key in self.Platform.Pcds:
- for SinglePcd in GlobalData.MixedPcd:
- if (self.Platform.Pcds[key].TokenCName, self.Platform.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
- for item in GlobalData.MixedPcd[SinglePcd]:
- Pcd_Type = item[0].split('_')[-1]
- if (Pcd_Type == self.Platform.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and self.Platform.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
- (Pcd_Type == TAB_PCDS_DYNAMIC and self.Platform.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET):
- Value = self.Platform.Pcds[key]
- Value.TokenCName = self.Platform.Pcds[key].TokenCName + '_' + Pcd_Type
- if len(key) == 2:
- newkey = (Value.TokenCName, key[1])
- elif len(key) == 3:
- newkey = (Value.TokenCName, key[1], key[2])
- del self.Platform.Pcds[key]
- self.Platform.Pcds[newkey] = Value
- break
- break
-
- # for gathering error information
- NoDatumTypePcdList = set()
- FdfModuleList = []
- for InfName in self._AsBuildInfList:
- InfName = mws.join(self.WorkspaceDir, InfName)
- FdfModuleList.append(os.path.normpath(InfName))
- for M in self._MaList:
# M is the ModuleAutoGen object for the module being processed
- for PcdFromModule in M.ModulePcdList + M.LibraryPcdList:
- # make sure that the "VOID*" kind of datum has MaxDatumSize set
- if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
- NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile))
-
- # Check whether the PCD comes from a binary INF or a source INF
- if M.IsBinaryModule == True:
- PcdFromModule.IsFromBinaryInf = True
-
- # Check whether the PCD comes from the DSC or not
- PcdFromModule.IsFromDsc = (PcdFromModule.TokenCName, PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds
-
- if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
- if M.MetaFile.Path not in FdfModuleList:
- # If one of the source-built modules listed in the DSC is not listed
- # in the FDF modules, and its INF lists a PCD that can only use the
- # PcdsDynamic access method (it is only listed in the DEC file that
- # declares the PCD as PcdsDynamic), then the build tool reports a
- # warning to notify the developer that they are attempting to build a
- # module that must be included in a flash image in order to be
- # functional. Such Dynamic PCDs are not added into the database unless
- # they are used by other modules that are included in the FDF file.
- if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET and \
- PcdFromModule.IsFromBinaryInf == False:
- # Print a warning message so the developer can make a determination.
- continue
- # If one of the source-built modules listed in the DSC is not listed in
- # the FDF modules, and its INF lists a PCD that can only use the
- # PcdsDynamicEx access method (it is only listed in the DEC file that
- # declares the PCD as PcdsDynamicEx), then DO NOT break the build; DO
- # NOT add the PCD to the platform's PCD database.
- if PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
- continue
- #
- # If a dynamic PCD is used by both a PEIM/PEI module and a DXE module,
- # it should be stored in the PEI PCD database; if a dynamic PCD is
- # only used by DXE modules, it should be stored in the DXE PCD database.
- # The default Phase is DXE.
- #
- if M.ModuleType in SUP_MODULE_SET_PEI:
- PcdFromModule.Phase = "PEI"
- if PcdFromModule not in self._DynaPcdList_:
- self._DynaPcdList_.append(PcdFromModule)
- elif PcdFromModule.Phase == 'PEI':
- # overwrite any existing identical PCD, if the Phase is PEI
- Index = self._DynaPcdList_.index(PcdFromModule)
- self._DynaPcdList_[Index] = PcdFromModule
- elif PcdFromModule not in self._NonDynaPcdList_:
- self._NonDynaPcdList_.append(PcdFromModule)
- elif PcdFromModule in self._NonDynaPcdList_ and PcdFromModule.IsFromBinaryInf == True:
- Index = self._NonDynaPcdList_.index(PcdFromModule)
- if self._NonDynaPcdList_[Index].IsFromBinaryInf == False:
- #The PCD from the binary INF overrides the same one from the source INF
- self._NonDynaPcdList_.remove (self._NonDynaPcdList_[Index])
- PcdFromModule.Pending = False
- self._NonDynaPcdList_.append (PcdFromModule)
- DscModuleSet = {os.path.normpath(ModuleInf.Path) for ModuleInf in self.Platform.Modules}
- # add the PCDs from modules that are listed in the FDF but not in the DSC to the database
- for InfName in FdfModuleList:
- if InfName not in DscModuleSet:
- InfClass = PathClass(InfName)
- M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
- # If a module INF is in the FDF but not in the current arch's DSC module list, it must be a module
- # (either binary or source) for a different arch. PCDs in source modules for a different arch were
- # already added above, so skip source modules here. For binary modules in the current arch, their PCDs need to be listed into the database.
- if not M.IsBinaryModule:
- continue
- # Override the module PCD setting by platform setting
- ModulePcdList = self.ApplyPcdSetting(M, M.Pcds)
- for PcdFromModule in ModulePcdList:
- PcdFromModule.IsFromBinaryInf = True
- PcdFromModule.IsFromDsc = False
- # Only allow the DynamicEx and Patchable PCD in AsBuild INF
- if PcdFromModule.Type not in PCD_DYNAMIC_EX_TYPE_SET and PcdFromModule.Type not in TAB_PCDS_PATCHABLE_IN_MODULE:
- EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
- File=self.MetaFile,
- ExtraData="\n\tFound %s PCD %s in:\n\t\t%s\n"
- % (PcdFromModule.Type, PcdFromModule.TokenCName, InfName))
- # make sure that the "VOID*" kind of datum has MaxDatumSize set
- if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
- NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName))
- if M.ModuleType in SUP_MODULE_SET_PEI:
- PcdFromModule.Phase = "PEI"
- if PcdFromModule not in self._DynaPcdList_ and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
- self._DynaPcdList_.append(PcdFromModule)
- elif PcdFromModule not in self._NonDynaPcdList_ and PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE:
- self._NonDynaPcdList_.append(PcdFromModule)
- if PcdFromModule in self._DynaPcdList_ and PcdFromModule.Phase == 'PEI' and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
- # Overwrite the phase of any existing identical PCD, if its Phase is PEI.
- # This solves the case where a dynamic PCD is used by a PEIM/PEI module
- # and a DXE module at the same time.
- # Also overwrite the type of the PCDs in the source INF with the type
- # from the AsBuild INF file (DynamicEx).
- Index = self._DynaPcdList_.index(PcdFromModule)
- self._DynaPcdList_[Index].Phase = PcdFromModule.Phase
- self._DynaPcdList_[Index].Type = PcdFromModule.Type
- for PcdFromModule in self._NonDynaPcdList_:
- # If a PCD is not listed in the DSC file, but all binary INF files used
- # by this platform (that use this PCD) list the PCD in a [PatchPcds]
- # section, AND all source INF files used by this platform that use the
- # PCD list it in either a [Pcds] or [PatchPcds] section, then the tools
- # must NOT add the PCD to the platform's PCD database; the build must
- # assign the access method for this PCD as PcdsPatchableInModule.
- if PcdFromModule not in self._DynaPcdList_:
- continue
- Index = self._DynaPcdList_.index(PcdFromModule)
- if PcdFromModule.IsFromDsc == False and \
- PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE and \
- PcdFromModule.IsFromBinaryInf == True and \
- self._DynaPcdList_[Index].IsFromBinaryInf == False:
- Index = self._DynaPcdList_.index(PcdFromModule)
- self._DynaPcdList_.remove (self._DynaPcdList_[Index])
-
- # print out error information and break the build, if error found
- if len(NoDatumTypePcdList) > 0:
- NoDatumTypePcdListString = "\n\t\t".join(NoDatumTypePcdList)
- EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
- File=self.MetaFile,
- ExtraData="\n\tPCD(s) without MaxDatumSize:\n\t\t%s\n"
- % NoDatumTypePcdListString)
- self._NonDynamicPcdList = self._NonDynaPcdList_
- self._DynamicPcdList = self._DynaPcdList_
- #
- # Sort the dynamic PCD list so that:
- # 1) if a PCD's datum type is VOID* and its value is a unicode string starting
- #    with L, the PCD item is put at the head of the dynamic list;
- # 2) HII type PCD items are put after the unicode type PCDs.
- #
- # The reason for sorting is to make sure the unicode strings are double-byte aligned in the string table.
- #
- UnicodePcdArray = set()
- HiiPcdArray = set()
- OtherPcdArray = set()
- VpdPcdDict = {}
- VpdFile = VpdInfoFile.VpdInfoFile()
- NeedProcessVpdMapFile = False
-
- for pcd in self.Platform.Pcds:
- if pcd not in self._PlatformPcds:
- self._PlatformPcds[pcd] = self.Platform.Pcds[pcd]
-
- for item in self._PlatformPcds:
- if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
- self._PlatformPcds[item].DatumType = TAB_VOID
-
- if (self.Workspace.ArchList[-1] == self.Arch):
- for Pcd in self._DynamicPcdList:
- # just pick a value to determine whether this is a unicode string type
- Sku = Pcd.SkuInfoList.get(TAB_DEFAULT)
- Sku.VpdOffset = Sku.VpdOffset.strip()
-
- if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
- Pcd.DatumType = TAB_VOID
-
- # if a PCD whose datum value is a unicode string is found, insert it to the left of UnicodeIndex
- # if an HII type PCD is found, insert it to the right of UnicodeIndex
- if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
- VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
-
- #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
- PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
- if PcdNvStoreDfBuffer:
- self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
- vardump = self.VariableInfo.dump()
- if vardump:
- #
- #According to PCD_DATABASE_INIT in edk2\MdeModulePkg\Include\Guid\PcdDataBaseSignatureGuid.h,
- #the max size for a string PCD should not exceed USHRT_MAX 65535 (0xffff).
- #typedef UINT16 SIZE_INFO;
- #//SIZE_INFO SizeTable[];
- if len(vardump.split(",")) > 0xffff:
- EdkLogger.error("build", RESOURCE_OVERFLOW, 'The current length of PCD %s value is %d; it exceeds the max size of a String PCD.' % (".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName, PcdNvStoreDfBuffer.TokenCName]), len(vardump.split(","))))
- PcdNvStoreDfBuffer.DefaultValue = vardump
- for skuname in PcdNvStoreDfBuffer.SkuInfoList:
- PcdNvStoreDfBuffer.SkuInfoList[skuname].DefaultValue = vardump
- PcdNvStoreDfBuffer.MaxDatumSize = str(len(vardump.split(",")))
- else:
- #If the end user defines [DefaultStores] and [XXX.Manufacturing] in the DSC, but forgets to configure PcdNvStoreDefaultValueBuffer as PcdsDynamicVpd
- if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDefinedDefaultStoresFlag]:
- EdkLogger.warn("build", "PcdNvStoreDefaultValueBuffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultStores is enabled for this platform.\n%s" %self.Platform.MetaFile.Path)
- PlatformPcds = sorted(self._PlatformPcds.keys())
- #
- # Add VPD type PCDs into VpdFile and determine whether a VPD PCD needs to be fixed up.
- #
- VpdSkuMap = {}
- for PcdKey in PlatformPcds:
- Pcd = self._PlatformPcds[PcdKey]
- if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD] and \
- PcdKey in VpdPcdDict:
- Pcd = VpdPcdDict[PcdKey]
- SkuValueMap = {}
- DefaultSku = Pcd.SkuInfoList.get(TAB_DEFAULT)
- if DefaultSku:
- PcdValue = DefaultSku.DefaultValue
- if PcdValue not in SkuValueMap:
- SkuValueMap[PcdValue] = []
- VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.VpdOffset)
- SkuValueMap[PcdValue].append(DefaultSku)
-
- for (SkuName, Sku) in Pcd.SkuInfoList.items():
- Sku.VpdOffset = Sku.VpdOffset.strip()
- PcdValue = Sku.DefaultValue
- if PcdValue == "":
- PcdValue = Pcd.DefaultValue
- if Sku.VpdOffset != TAB_STAR:
- if PcdValue.startswith("{"):
- Alignment = 8
- elif PcdValue.startswith("L"):
- Alignment = 2
- else:
- Alignment = 1
- try:
- VpdOffset = int(Sku.VpdOffset)
- except:
- try:
- VpdOffset = int(Sku.VpdOffset, 16)
- except:
- EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
- if VpdOffset % Alignment != 0:
- if PcdValue.startswith("{"):
- EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName), File=self.MetaFile)
- else:
- EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
- if PcdValue not in SkuValueMap:
- SkuValueMap[PcdValue] = []
- VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
- SkuValueMap[PcdValue].append(Sku)
- # if the offset of a VPD PCD is *, it needs to be fixed up by a third-party tool.
- if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
- NeedProcessVpdMapFile = True
- if self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == '':
- EdkLogger.error("Build", FILE_NOT_FOUND, \
- "Failed to find the third-party BPDG tool to process VPD PCDs. The BPDG GUID tool needs to be defined in tools_def.txt and VPD_TOOL_GUID needs to be provided in the DSC file.")
-
- VpdSkuMap[PcdKey] = SkuValueMap
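The offset parsing and alignment rules used in the loop above can be sketched as standalone helpers. These are illustrative names, not BaseTools functions, assuming the same decimal-then-hex fallback and the 8/2/1-byte alignment convention:

```python
def parse_vpd_offset(offset_str):
    """Parse a VPD offset that may be decimal ("48") or hex ("0x30" or "30")."""
    s = offset_str.strip()
    try:
        return int(s)        # plain decimal first, as in the code above
    except ValueError:
        return int(s, 16)    # then hex, with or without a "0x" prefix

def required_alignment(pcd_value):
    """Byte arrays ("{...}") need 8-byte alignment, unicode strings
    ("L...") 2-byte, and everything else 1-byte."""
    if pcd_value.startswith("{"):
        return 8
    if pcd_value.startswith("L"):
        return 2
    return 1
```

Note that `int(s, 16)` accepts hex with or without the `0x` prefix, which is why the code above needs the two-step try/except rather than a single `int(s, 0)` call (the latter would reject bare hex digits such as "FF").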
- #
- # Fix the PCDs defined in the VPD PCD section that are never referenced by any module.
- # An example is a PCD used for signature purposes.
- #
- for DscPcd in PlatformPcds:
- DscPcdEntry = self._PlatformPcds[DscPcd]
- if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
- if not (self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == ''):
- FoundFlag = False
- for VpdPcd in VpdFile._VpdArray:
- # This PCD has been referenced by a module
- if (VpdPcd.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
- (VpdPcd.TokenCName == DscPcdEntry.TokenCName):
- FoundFlag = True
-
- # Not found; it should be a signature PCD
- if not FoundFlag :
- # just pick a value to determine whether this is a unicode string type
- SkuValueMap = {}
- SkuObjList = list(DscPcdEntry.SkuInfoList.items())
- DefaultSku = DscPcdEntry.SkuInfoList.get(TAB_DEFAULT)
- if DefaultSku:
- defaultindex = SkuObjList.index((TAB_DEFAULT, DefaultSku))
- SkuObjList[0], SkuObjList[defaultindex] = SkuObjList[defaultindex], SkuObjList[0]
- for (SkuName, Sku) in SkuObjList:
- Sku.VpdOffset = Sku.VpdOffset.strip()
-
- # Need to iterate the DEC PCD information to get the value & datum type
- for eachDec in self.PackageList:
- for DecPcd in eachDec.Pcds:
- DecPcdEntry = eachDec.Pcds[DecPcd]
- if (DecPcdEntry.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
- (DecPcdEntry.TokenCName == DscPcdEntry.TokenCName):
- # Print a warning message so the developer can make a determination.
- EdkLogger.warn("build", "Unreferenced VPD PCD used!",
- File=self.MetaFile, \
- ExtraData = "PCD: %s.%s used in the DSC file %s is unreferenced." \
- %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
-
- DscPcdEntry.DatumType = DecPcdEntry.DatumType
- DscPcdEntry.DefaultValue = DecPcdEntry.DefaultValue
- DscPcdEntry.TokenValue = DecPcdEntry.TokenValue
- DscPcdEntry.TokenSpaceGuidValue = eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName]
- # Only fix the value when no value is provided in the DSC file.
- if not Sku.DefaultValue:
- DscPcdEntry.SkuInfoList[list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue = DecPcdEntry.DefaultValue
-
- if DscPcdEntry not in self._DynamicPcdList:
- self._DynamicPcdList.append(DscPcdEntry)
- Sku.VpdOffset = Sku.VpdOffset.strip()
- PcdValue = Sku.DefaultValue
- if PcdValue == "":
- PcdValue = DscPcdEntry.DefaultValue
- if Sku.VpdOffset != TAB_STAR:
- if PcdValue.startswith("{"):
- Alignment = 8
- elif PcdValue.startswith("L"):
- Alignment = 2
- else:
- Alignment = 1
- try:
- VpdOffset = int(Sku.VpdOffset)
- except:
- try:
- VpdOffset = int(Sku.VpdOffset, 16)
- except:
- EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName))
- if VpdOffset % Alignment != 0:
- if PcdValue.startswith("{"):
- EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName), File=self.MetaFile)
- else:
- EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
- if PcdValue not in SkuValueMap:
- SkuValueMap[PcdValue] = []
- VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
- SkuValueMap[PcdValue].append(Sku)
- if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
- NeedProcessVpdMapFile = True
- if DscPcdEntry.DatumType == TAB_VOID and PcdValue.startswith("L"):
- UnicodePcdArray.add(DscPcdEntry)
- elif len(Sku.VariableName) > 0:
- HiiPcdArray.add(DscPcdEntry)
- else:
- OtherPcdArray.add(DscPcdEntry)
-
- # if the offset of a VPD PCD is *, it needs to be fixed up by a third-party tool.
- VpdSkuMap[DscPcd] = SkuValueMap
- if (self.Platform.FlashDefinition is None or self.Platform.FlashDefinition == '') and \
- VpdFile.GetCount() != 0:
- EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
- "Failed to get the FLASH_DEFINITION in DSC file %s, which is required when the DSC contains VPD PCDs." % str(self.Platform.MetaFile))
-
- if VpdFile.GetCount() != 0:
-
- self.FixVpdOffset(VpdFile)
-
- self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile))
- PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
- if PcdNvStoreDfBuffer:
- PcdName,PcdGuid = PcdNvStoreDfBuffer[0].TokenCName, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName
- if (PcdName,PcdGuid) in VpdSkuMap:
- DefaultSku = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
- VpdSkuMap[(PcdName,PcdGuid)] = {DefaultSku.DefaultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values() ]}
-
- # Process VPD map file generated by third party BPDG tool
- if NeedProcessVpdMapFile:
- VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
- if os.path.exists(VpdMapFilePath):
- VpdFile.Read(VpdMapFilePath)
-
- # Fixup TAB_STAR offset
- for pcd in VpdSkuMap:
- vpdinfo = VpdFile.GetVpdInfo(pcd)
- if vpdinfo is None:
- # no VPD info recorded for this PCD; skip it
- continue
- for pcdvalue in VpdSkuMap[pcd]:
- for sku in VpdSkuMap[pcd][pcdvalue]:
- for item in vpdinfo:
- if item[2] == pcdvalue:
- sku.VpdOffset = item[1]
- else:
- EdkLogger.error("build", FILE_READ_FAILURE, "Cannot find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
-
- # Rebuild the DynamicPcdList, sorted, the last time this function is entered
- for Pcd in self._DynamicPcdList:
- # just pick a value to determine whether this is a unicode string type
- Sku = Pcd.SkuInfoList.get(TAB_DEFAULT)
- Sku.VpdOffset = Sku.VpdOffset.strip()
-
- if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
- Pcd.DatumType = TAB_VOID
-
- PcdValue = Sku.DefaultValue
- if Pcd.DatumType == TAB_VOID and PcdValue.startswith("L"):
- # if a PCD whose datum value is a unicode string is found, insert it to the left of UnicodeIndex
- UnicodePcdArray.add(Pcd)
- elif len(Sku.VariableName) > 0:
- # if an HII type PCD is found, insert it to the right of UnicodeIndex
- HiiPcdArray.add(Pcd)
- else:
- OtherPcdArray.add(Pcd)
- del self._DynamicPcdList[:]
- self._DynamicPcdList.extend(list(UnicodePcdArray))
- self._DynamicPcdList.extend(list(HiiPcdArray))
- self._DynamicPcdList.extend(list(OtherPcdArray))
- allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
- for pcd in self._DynamicPcdList:
- if len(pcd.SkuInfoList) == 1:
- for (SkuName, SkuId) in allskuset:
- if isinstance(SkuId, str) and eval(SkuId) == 0 or SkuId == 0:
- continue
- pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
- pcd.SkuInfoList[SkuName].SkuId = SkuId
- pcd.SkuInfoList[SkuName].SkuIdName = SkuName
-
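The final ordering produced above (unicode-string PCDs first so the string table stays double-byte aligned, then HII PCDs, then the rest) can be sketched with plain tuples. This is a simplified stand-in for the real PCD objects; note that it uses lists, which preserve each bucket's input order, whereas the sets used above do not:

```python
def sort_dynamic_pcds(pcds):
    """Partition PCDs into unicode-string, HII, and other buckets and
    concatenate them in that order. Each item is a simplified
    (name, datum_type, default_value, hii_variable_name) tuple."""
    unicode_pcds, hii_pcds, other_pcds = [], [], []
    for name, datum_type, default, var_name in pcds:
        if datum_type == "VOID*" and default.startswith("L"):
            unicode_pcds.append(name)   # unicode string PCDs go first
        elif var_name:
            hii_pcds.append(name)       # HII PCDs follow
        else:
            other_pcds.append(name)
    return unicode_pcds + hii_pcds + other_pcds
```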
- def FixVpdOffset(self, VpdFile ):
- FvPath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
- if not os.path.exists(FvPath):
- try:
- os.makedirs(FvPath)
- except:
- EdkLogger.error("build", FILE_WRITE_FAILURE, "Failed to create the FV folder under %s" % self.BuildDir)
-
- VpdFilePath = os.path.join(FvPath, "%s.txt" % self.Platform.VpdToolGuid)
-
- if VpdFile.Write(VpdFilePath):
- # retrieve the BPDG tool's path from tools_def.txt according to the VPD_TOOL_GUID defined in the DSC file.
- BPDGToolName = None
- for ToolDef in self.ToolDefinition.values():
- if TAB_GUID in ToolDef and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
- if "PATH" not in ToolDef:
- EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
- BPDGToolName = ToolDef["PATH"]
- break
- # Call third party GUID BPDG tool.
- if BPDGToolName is not None:
- VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFilePath)
- else:
- EdkLogger.error("Build", FILE_NOT_FOUND, "Failed to find the third-party BPDG tool to process VPD PCDs. The BPDG GUID tool needs to be defined in tools_def.txt and VPD_TOOL_GUID needs to be provided in the DSC file.")
-
- ## Return the platform build data object
- @cached_property
- def Platform(self):
- return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
-
- ## Return platform name
- @cached_property
- def Name(self):
- return self.Platform.PlatformName
-
- ## Return the meta file GUID
- @cached_property
- def Guid(self):
- return self.Platform.Guid
-
- ## Return the platform version
- @cached_property
- def Version(self):
- return self.Platform.Version
-
- ## Return the FDF file name
- @cached_property
- def FdfFile(self):
- if self.Workspace.FdfFile:
- RetVal= mws.join(self.WorkspaceDir, self.Workspace.FdfFile)
- else:
- RetVal = ''
- return RetVal
-
- ## Return the build output directory platform specifies
- @cached_property
- def OutputDir(self):
- return self.Platform.OutputDirectory
-
- ## Return the directory to store all intermediate and final files built
- @cached_property
- def BuildDir(self):
- if os.path.isabs(self.OutputDir):
- GlobalData.gBuildDirectory = RetVal = path.join(
- path.abspath(self.OutputDir),
- self.BuildTarget + "_" + self.ToolChain,
- )
- else:
- GlobalData.gBuildDirectory = RetVal = path.join(
- self.WorkspaceDir,
- self.OutputDir,
- self.BuildTarget + "_" + self.ToolChain,
- )
- return RetVal
-
- ## Return directory of platform makefile
- #
- # @retval string Makefile directory
- #
- @cached_property
- def MakeFileDir(self):
- return path.join(self.BuildDir, self.Arch)
-
- ## Return build command string
- #
- # @retval string Build command string
- #
- @cached_property
- def BuildCommand(self):
- RetVal = []
- if "MAKE" in self.ToolDefinition and "PATH" in self.ToolDefinition["MAKE"]:
- RetVal += _SplitOption(self.ToolDefinition["MAKE"]["PATH"])
- if "FLAGS" in self.ToolDefinition["MAKE"]:
- NewOption = self.ToolDefinition["MAKE"]["FLAGS"].strip()
- if NewOption != '':
- RetVal += _SplitOption(NewOption)
- if "MAKE" in self.EdkIIBuildOption:
- if "FLAGS" in self.EdkIIBuildOption["MAKE"]:
- Flags = self.EdkIIBuildOption["MAKE"]["FLAGS"]
- if Flags.startswith('='):
- RetVal = [RetVal[0]] + [Flags[1:]]
- else:
- RetVal.append(Flags)
- return RetVal
-
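The `FLAGS` handling in `BuildCommand` above can be illustrated with a small sketch (`apply_make_flags` is an illustrative name, not a BaseTools function): a leading `=` replaces every existing option after the make executable, while anything else is appended:

```python
def apply_make_flags(build_command, flags):
    """Mirror the MAKE FLAGS override rule: '=' prefix replaces all
    existing options, otherwise the flags are appended."""
    if flags.startswith("="):
        # keep only the make executable, then the new flags
        return [build_command[0], flags[1:]]
    return build_command + [flags]
```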
- ## Get tool chain definition
- #
- # Get each tool definition for given tool chain from tools_def.txt and platform
- #
- @cached_property
- def ToolDefinition(self):
- ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDictionary
- if TAB_TOD_DEFINES_COMMAND_TYPE not in self.Workspace.ToolDef.ToolsDefTxtDatabase:
- EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No tools found in configuration",
- ExtraData="[%s]" % self.MetaFile)
- RetVal = OrderedDict()
- DllPathList = set()
- for Def in ToolDefinition:
- Target, Tag, Arch, Tool, Attr = Def.split("_")
- if Target != self.BuildTarget or Tag != self.ToolChain or Arch != self.Arch:
- continue
-
- Value = ToolDefinition[Def]
- # don't record the DLL
- if Attr == "DLL":
- DllPathList.add(Value)
- continue
-
- if Tool not in RetVal:
- RetVal[Tool] = OrderedDict()
- RetVal[Tool][Attr] = Value
-
- ToolsDef = ''
- if GlobalData.gOptions.SilentMode and "MAKE" in RetVal:
- if "FLAGS" not in RetVal["MAKE"]:
- RetVal["MAKE"]["FLAGS"] = ""
- RetVal["MAKE"]["FLAGS"] += " -s"
- MakeFlags = ''
- for Tool in RetVal:
- for Attr in RetVal[Tool]:
- Value = RetVal[Tool][Attr]
- if Tool in self._BuildOptionWithToolDef(RetVal) and Attr in self._BuildOptionWithToolDef(RetVal)[Tool]:
- # check if override is indicated
- if self._BuildOptionWithToolDef(RetVal)[Tool][Attr].startswith('='):
- Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr][1:]
- else:
- if Attr != 'PATH':
- Value += " " + self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
- else:
- Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
-
- if Attr == "PATH":
- # Don't put MAKE definition in the file
- if Tool != "MAKE":
- ToolsDef += "%s = %s\n" % (Tool, Value)
- elif Attr != "DLL":
- # Don't put MAKE definition in the file
- if Tool == "MAKE":
- if Attr == "FLAGS":
- MakeFlags = Value
- else:
- ToolsDef += "%s_%s = %s\n" % (Tool, Attr, Value)
- ToolsDef += "\n"
- tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
- SaveFileOnChange(tool_def_file, ToolsDef, False)
- for DllPath in DllPathList:
- os.environ["PATH"] = DllPath + os.pathsep + os.environ["PATH"]
- os.environ["MAKE_FLAGS"] = MakeFlags
-
- return RetVal
-
- ## Return the paths of tools
- @cached_property
- def ToolDefinitionFile(self):
- tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
- if not os.path.exists(tool_def_file):
- self.ToolDefinition
- return tool_def_file
-
- ## Retrieve the toolchain family of given toolchain tag. Default to 'MSFT'.
- @cached_property
- def ToolChainFamily(self):
- ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
- if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \
- or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
- or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]:
- EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
- % self.ToolChain)
- RetVal = TAB_COMPILER_MSFT
- else:
- RetVal = ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]
- return RetVal
-
- @cached_property
- def BuildRuleFamily(self):
- ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
- if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \
- or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY] \
- or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]:
- EdkLogger.verbose("No build rule family found in configuration for %s. Default to MSFT." \
- % self.ToolChain)
- return TAB_COMPILER_MSFT
-
- return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]
-
- ## Return the build options specific for all modules in this platform
- @cached_property
- def BuildOption(self):
- return self._ExpandBuildOption(self.Platform.BuildOptions)
-
- def _BuildOptionWithToolDef(self, ToolDef):
- return self._ExpandBuildOption(self.Platform.BuildOptions, ToolDef=ToolDef)
-
- ## Return the build options specific for EDK modules in this platform
- @cached_property
- def EdkBuildOption(self):
- return self._ExpandBuildOption(self.Platform.BuildOptions, EDK_NAME)
-
- ## Return the build options specific for EDKII modules in this platform
- @cached_property
- def EdkIIBuildOption(self):
- return self._ExpandBuildOption(self.Platform.BuildOptions, EDKII_NAME)
-
- ## Summarize the packages used by modules in this platform
- @cached_property
- def PackageList(self):
- RetVal = set()
- for La in self.LibraryAutoGenList:
- RetVal.update(La.DependentPackageList)
- for Ma in self.ModuleAutoGenList:
- RetVal.update(Ma.DependentPackageList)
- #Collect package set information from INF of FDF
- for ModuleFile in self._AsBuildModuleList:
- if ModuleFile in self.Platform.Modules:
- continue
- ModuleData = self.BuildDatabase[ModuleFile, self.Arch, self.BuildTarget, self.ToolChain]
- RetVal.update(ModuleData.Packages)
- return list(RetVal)
-
- @cached_property
- def NonDynamicPcdDict(self):
- return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd in self.NonDynamicPcdList}
-
- ## Get list of non-dynamic PCDs
- @property
- def NonDynamicPcdList(self):
- if not self._NonDynamicPcdList:
- self.CollectPlatformDynamicPcds()
- return self._NonDynamicPcdList
-
- ## Get list of dynamic PCDs
- @property
- def DynamicPcdList(self):
- if not self._DynamicPcdList:
- self.CollectPlatformDynamicPcds()
- return self._DynamicPcdList
-
- ## Generate Token Number for all PCD
- @cached_property
- def PcdTokenNumber(self):
- RetVal = OrderedDict()
- TokenNumber = 1
- #
- # Make the Dynamic and DynamicEx PCDs use different TokenNumber ranges.
- # Such as:
- #
- # Dynamic PCD:
- # TokenNumber 0 ~ 10
- # DynamicEx PCD:
- # TokenNumber 11 ~ 20
- #
- for Pcd in self.DynamicPcdList:
- if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
- EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
- RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
- TokenNumber += 1
-
- for Pcd in self.DynamicPcdList:
- if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
- EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
- RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
- TokenNumber += 1
-
- for Pcd in self.DynamicPcdList:
- if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
- EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
- RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
- TokenNumber += 1
-
- for Pcd in self.DynamicPcdList:
- if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
- EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
- RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
- TokenNumber += 1
-
- for Pcd in self.NonDynamicPcdList:
- RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
- TokenNumber += 1
- return RetVal
-
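The numbering scheme above (four dynamic passes ordered by phase and type, then the non-dynamic PCDs) can be sketched with simplified `(name, phase, kind)` tuples; the helper name is hypothetical:

```python
def assign_token_numbers(dynamic_pcds, non_dynamic_pcds):
    """Assign 1-based token numbers in four dynamic passes
    (PEI/Dynamic, PEI/DynamicEx, DXE/Dynamic, DXE/DynamicEx),
    then number the non-dynamic PCDs last."""
    order = [("PEI", "Dynamic"), ("PEI", "DynamicEx"),
             ("DXE", "Dynamic"), ("DXE", "DynamicEx")]
    token_numbers = {}
    token = 1
    for phase, kind in order:
        for name, p, k in dynamic_pcds:
            if (p, k) == (phase, kind):
                token_numbers[name] = token
                token += 1
    for name, _, _ in non_dynamic_pcds:
        token_numbers[name] = token
        token += 1
    return token_numbers
```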
- @cached_property
- def _MaList(self):
- for ModuleFile in self.Platform.Modules:
- Ma = ModuleAutoGen(
- self.Workspace,
- ModuleFile,
- self.BuildTarget,
- self.ToolChain,
- self.Arch,
- self.MetaFile
- )
- self.Platform.Modules[ModuleFile].M = Ma
- return [x.M for x in self.Platform.Modules.values()]
-
- ## Summarize ModuleAutoGen objects of all modules to be built for this platform
- @cached_property
- def ModuleAutoGenList(self):
- RetVal = []
- for Ma in self._MaList:
- if Ma not in RetVal:
- RetVal.append(Ma)
- return RetVal
-
- ## Summarize ModuleAutoGen objects of all libraries to be built for this platform
- @cached_property
- def LibraryAutoGenList(self):
- RetVal = []
- for Ma in self._MaList:
- for La in Ma.LibraryAutoGenList:
- if La not in RetVal:
- RetVal.append(La)
- if Ma not in La.ReferenceModules:
- La.ReferenceModules.append(Ma)
- return RetVal
-
- ## Test if a module is supported by the platform
- #
- # An error will be raised directly if the module or its arch is not supported
- # by the platform or current configuration
- #
- def ValidModule(self, Module):
- return Module in self.Platform.Modules or Module in self.Platform.LibraryInstances \
- or Module in self._AsBuildModuleList
-
- ## Resolve the library classes in a module to library instances
- #
- # This method will not only resolve library classes but also sort the library
- # instances according to their dependencies.
- #
- # @param Module The module from which the library classes will be resolved
- #
- # @retval library_list List of library instances sorted
- #
- def ApplyLibraryInstance(self, Module):
- # Cover the case where a binary INF file is listed in the FDF file but not the DSC file; return an empty list directly
- if str(Module) not in self.Platform.Modules:
- return []
-
- return GetModuleLibInstances(Module,
- self.Platform,
- self.BuildDatabase,
- self.Arch,
- self.BuildTarget,
- self.ToolChain,
- self.MetaFile,
- EdkLogger)
-
- ## Override PCD setting (type, value, ...)
- #
- # @param ToPcd The PCD to be overridden
- # @param FromPcd The PCD overriding from
- #
- def _OverridePcd(self, ToPcd, FromPcd, Module="", Msg="", Library=""):
- #
- # In case there are PCDs coming from the FDF file which have no type given,
- # ToPcd.Type at this point has the type found from the dependent
- # package.
- #
- TokenCName = ToPcd.TokenCName
- for PcdItem in GlobalData.MixedPcd:
- if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
- TokenCName = PcdItem[0]
- break
- if FromPcd is not None:
- if ToPcd.Pending and FromPcd.Type:
- ToPcd.Type = FromPcd.Type
- elif ToPcd.Type and FromPcd.Type\
- and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
- if ToPcd.Type.strip() == TAB_PCDS_DYNAMIC_EX:
- ToPcd.Type = FromPcd.Type
- elif ToPcd.Type and FromPcd.Type \
- and ToPcd.Type != FromPcd.Type:
- if Library:
- Module = str(Module) + " 's library file (" + str(Library) + ")"
- EdkLogger.error("build", OPTION_CONFLICT, "Mismatched PCD type",
- ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."\
- % (ToPcd.TokenSpaceGuidCName, TokenCName,
- ToPcd.Type, Module, FromPcd.Type, Msg),
- File=self.MetaFile)
-
- if FromPcd.MaxDatumSize:
- ToPcd.MaxDatumSize = FromPcd.MaxDatumSize
- ToPcd.MaxSizeUserSet = FromPcd.MaxDatumSize
- if FromPcd.DefaultValue:
- ToPcd.DefaultValue = FromPcd.DefaultValue
- if FromPcd.TokenValue:
- ToPcd.TokenValue = FromPcd.TokenValue
- if FromPcd.DatumType:
- ToPcd.DatumType = FromPcd.DatumType
- if FromPcd.SkuInfoList:
- ToPcd.SkuInfoList = FromPcd.SkuInfoList
- if FromPcd.UserDefinedDefaultStoresFlag:
- ToPcd.UserDefinedDefaultStoresFlag = FromPcd.UserDefinedDefaultStoresFlag
- # Add Flexible PCD format parse
- if ToPcd.DefaultValue:
- try:
- ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self.Workspace._GuidDict)(True)
- except BadExpression as Value:
- EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
- File=self.MetaFile)
-
- # check the validity of the datum
- IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
- if not IsValid:
- EdkLogger.error('build', FORMAT_INVALID, Cause, File=self.MetaFile,
- ExtraData="%s.%s" % (ToPcd.TokenSpaceGuidCName, TokenCName))
- ToPcd.validateranges = FromPcd.validateranges
- ToPcd.validlists = FromPcd.validlists
- ToPcd.expressions = FromPcd.expressions
- ToPcd.CustomAttribute = FromPcd.CustomAttribute
-
- if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
- EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
- % (ToPcd.TokenSpaceGuidCName, TokenCName))
- Value = ToPcd.DefaultValue
- if not Value:
- ToPcd.MaxDatumSize = '1'
- elif Value[0] == 'L':
- ToPcd.MaxDatumSize = str((len(Value) - 2) * 2)
- elif Value[0] == '{':
- ToPcd.MaxDatumSize = str(len(Value.split(',')))
- else:
- ToPcd.MaxDatumSize = str(len(Value) - 1)
-
- # apply the default SKU for dynamic PCDs if the specified one is not available
- if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
- and not ToPcd.SkuInfoList:
- if self.Platform.SkuName in self.Platform.SkuIds:
- SkuName = self.Platform.SkuName
- else:
- SkuName = TAB_DEFAULT
- ToPcd.SkuInfoList = {
- SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
- }
-
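The VOID* sizing fallback in _OverridePcd above can be illustrated standalone. The helper below is hypothetical (`max_datum_size` is my name, not a BaseTools function) and mirrors the three default-value forms: L-prefixed UCS-2 strings, {...} byte arrays, and plain ASCII strings:

```python
# Hypothetical helper (not a BaseTools function) mirroring the
# MaxDatumSize fallback above for VOID* PCDs without an explicit size.
def max_datum_size(value: str) -> str:
    if not value:
        return '1'                           # no default value: one byte
    if value[0] == 'L':                      # L"..." UCS-2 string
        return str((len(value) - 2) * 2)     # chars + NUL, 2 bytes each
    if value[0] == '{':                      # {0x.., 0x..} byte array
        return str(len(value.split(',')))    # one byte per element
    return str(len(value) - 1)               # "..." ASCII string: chars + NUL

print(max_datum_size('L"abc"'))        # 8
print(max_datum_size('{0x1,0x2,0x3}')) # 3
```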
- ## Apply PCD settings defined in the platform to a module
- #
- # @param Module The module to which the PCD settings will be applied
- #
- # @retval PCD_list The list of PCDs with settings from the platform
- #
- def ApplyPcdSetting(self, Module, Pcds, Library=""):
- # for each PCD in module
- for Name, Guid in Pcds:
- PcdInModule = Pcds[Name, Guid]
- # find out the PCD setting in platform
- if (Name, Guid) in self.Platform.Pcds:
- PcdInPlatform = self.Platform.Pcds[Name, Guid]
- else:
- PcdInPlatform = None
- # then override the settings if any
- self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
- # resolve the VariableGuid value
- for SkuId in PcdInModule.SkuInfoList:
- Sku = PcdInModule.SkuInfoList[SkuId]
- if Sku.VariableGuid == '': continue
- Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
- if Sku.VariableGuidValue is None:
- PackageList = "\n\t".join(str(P) for P in self.PackageList)
- EdkLogger.error(
- 'build',
- RESOURCE_NOT_AVAILABLE,
- "Value of GUID [%s] is not found in" % Sku.VariableGuid,
- ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
- % (Guid, Name, str(Module)),
- File=self.MetaFile
- )
-
- # override PCD settings with module specific setting
- if Module in self.Platform.Modules:
- PlatformModule = self.Platform.Modules[str(Module)]
- for Key in PlatformModule.Pcds:
- if GlobalData.BuildOptionPcd:
- for pcd in GlobalData.BuildOptionPcd:
- (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
- if (TokenCName, TokenSpaceGuidCName) == Key and FieldName =="":
- PlatformModule.Pcds[Key].DefaultValue = pcdvalue
- PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
- break
- Flag = False
- if Key in Pcds:
- ToPcd = Pcds[Key]
- Flag = True
- elif Key in GlobalData.MixedPcd:
- for PcdItem in GlobalData.MixedPcd[Key]:
- if PcdItem in Pcds:
- ToPcd = Pcds[PcdItem]
- Flag = True
- break
- if Flag:
- self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
- # use PCD value to calculate the MaxDatumSize when it is not specified
- for Name, Guid in Pcds:
- Pcd = Pcds[Name, Guid]
- if Pcd.DatumType == TAB_VOID and not Pcd.MaxDatumSize:
- Pcd.MaxSizeUserSet = None
- Value = Pcd.DefaultValue
- if not Value:
- Pcd.MaxDatumSize = '1'
- elif Value[0] == 'L':
- Pcd.MaxDatumSize = str((len(Value) - 2) * 2)
- elif Value[0] == '{':
- Pcd.MaxDatumSize = str(len(Value.split(',')))
- else:
- Pcd.MaxDatumSize = str(len(Value) - 1)
- return list(Pcds.values())
-
-
-
- ## Calculate the priority value of the build option
- #
- # @param Key Build option definition containing: TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
- #
- # @retval Value Priority value based on the priority list.
- #
- def CalculatePriorityValue(self, Key):
- Target, ToolChain, Arch, CommandType, Attr = Key.split('_')
- PriorityValue = 0x11111
- if Target == TAB_STAR:
- PriorityValue &= 0x01111
- if ToolChain == TAB_STAR:
- PriorityValue &= 0x10111
- if Arch == TAB_STAR:
- PriorityValue &= 0x11011
- if CommandType == TAB_STAR:
- PriorityValue &= 0x11101
- if Attr == TAB_STAR:
- PriorityValue &= 0x11110
-
- return self.PrioList["0x%0.5x" % PriorityValue]
-
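The bitmask scheme in CalculatePriorityValue can be sketched in isolation. The hypothetical version below returns the five-nibble mask itself rather than looking it up in PrioList; each wildcard field clears its nibble, so more specific keys produce larger masks:

```python
TAB_STAR = '*'  # assumed wildcard constant, as in BaseTools

def priority_mask(key):
    # key format: TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
    value = 0x11111
    for i, field in enumerate(key.split('_')):
        if field == TAB_STAR:
            value &= 0x11111 ^ (0x1 << (4 * (4 - i)))  # clear that field's nibble
    return "0x%0.5x" % value

print(priority_mask('*_*_*_CC_FLAGS'))           # 0x00011
print(priority_mask('DEBUG_GCC5_X64_CC_FLAGS'))  # 0x11111
```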
-
- ## Expand * in build option key
- #
- # @param Options Options to be expanded
- # @param ToolDef Use specified ToolDef instead of full version.
- # This is needed during initialization to prevent
- # infinite recursion between BuildOptions,
- # ToolDefinition, and this function.
- #
- # @retval options Options expanded
- #
- def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
- if not ToolDef:
- ToolDef = self.ToolDefinition
- BuildOptions = {}
- FamilyMatch = False
- FamilyIsNull = True
-
- OverrideList = {}
- #
- # Construct a list containing the build options which need to be overridden.
- #
- for Key in Options:
- #
- # Key[0] -- tool family
- # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
- #
- if (Key[0] == self.BuildRuleFamily and
- (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
- Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
- if (Target == self.BuildTarget or Target == TAB_STAR) and\
- (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
- (Arch == self.Arch or Arch == TAB_STAR) and\
- Options[Key].startswith("="):
-
- if OverrideList.get(Key[1]) is not None:
- OverrideList.pop(Key[1])
- OverrideList[Key[1]] = Options[Key]
-
- #
- # Use the highest priority value.
- #
- if (len(OverrideList) >= 2):
- KeyList = list(OverrideList.keys())
- for Index in range(len(KeyList)):
- NowKey = KeyList[Index]
- Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
- for Index1 in range(len(KeyList) - Index - 1):
- NextKey = KeyList[Index1 + Index + 1]
- #
- # Compare two keys; if one covers the other, keep the higher priority one
- #
- Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
- if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
- (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
- (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
- (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
- (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
-
- if self.CalculatePriorityValue(NowKey) > self.CalculatePriorityValue(NextKey):
- if Options.get((self.BuildRuleFamily, NextKey)) is not None:
- Options.pop((self.BuildRuleFamily, NextKey))
- else:
- if Options.get((self.BuildRuleFamily, NowKey)) is not None:
- Options.pop((self.BuildRuleFamily, NowKey))
-
- for Key in Options:
- if ModuleStyle is not None and len (Key) > 2:
- # Check whether the module style is EDK or EDKII.
- # Only append build options for modules of the matching style.
- if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
- continue
- elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
- continue
- Family = Key[0]
- Target, Tag, Arch, Tool, Attr = Key[1].split("_")
- # if tool chain family doesn't match, skip it
- if Tool in ToolDef and Family != "":
- FamilyIsNull = False
- if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY, "") != "":
- if Family != ToolDef[Tool][TAB_TOD_DEFINES_BUILDRULEFAMILY]:
- continue
- elif Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
- continue
- FamilyMatch = True
- # expand any wildcard
- if Target == TAB_STAR or Target == self.BuildTarget:
- if Tag == TAB_STAR or Tag == self.ToolChain:
- if Arch == TAB_STAR or Arch == self.Arch:
- if Tool not in BuildOptions:
- BuildOptions[Tool] = {}
- if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
- BuildOptions[Tool][Attr] = Options[Key]
- else:
- # append options for the same tool except PATH
- if Attr != 'PATH':
- BuildOptions[Tool][Attr] += " " + Options[Key]
- else:
- BuildOptions[Tool][Attr] = Options[Key]
- # The build option family has been checked above and doesn't need to be checked again.
- if FamilyMatch or FamilyIsNull:
- return BuildOptions
-
- for Key in Options:
- if ModuleStyle is not None and len (Key) > 2:
- # Check whether the module style is EDK or EDKII.
- # Only append build options for modules of the matching style.
- if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
- continue
- elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
- continue
- Family = Key[0]
- Target, Tag, Arch, Tool, Attr = Key[1].split("_")
- # if tool chain family doesn't match, skip it
- if Tool not in ToolDef or Family == "":
- continue
- # option has been added before
- if Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
- continue
-
- # expand any wildcard
- if Target == TAB_STAR or Target == self.BuildTarget:
- if Tag == TAB_STAR or Tag == self.ToolChain:
- if Arch == TAB_STAR or Arch == self.Arch:
- if Tool not in BuildOptions:
- BuildOptions[Tool] = {}
- if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
- BuildOptions[Tool][Attr] = Options[Key]
- else:
- # append options for the same tool except PATH
- if Attr != 'PATH':
- BuildOptions[Tool][Attr] += " " + Options[Key]
- else:
- BuildOptions[Tool][Attr] = Options[Key]
- return BuildOptions
- def GetGlobalBuildOptions(self,Module):
- ModuleTypeOptions = self.Platform.GetBuildOptionsByPkg(Module, Module.ModuleType)
- ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
- if Module in self.Platform.Modules:
- PlatformModule = self.Platform.Modules[str(Module)]
- PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
- else:
- PlatformModuleOptions = {}
- return ModuleTypeOptions, PlatformModuleOptions
- ## Append build options in platform to a module
- #
- # @param Module The module to which the build options will be appended
- #
- # @retval options The options appended with build options in platform
- #
- def ApplyBuildOption(self, Module):
- # Get the different options for the different style module
- PlatformOptions = self.EdkIIBuildOption
- ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
- ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
- ModuleOptions = self._ExpandBuildOption(Module.BuildOptions)
- if Module in self.Platform.Modules:
- PlatformModule = self.Platform.Modules[str(Module)]
- PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
- else:
- PlatformModuleOptions = {}
-
- BuildRuleOrder = None
- for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
- for Tool in Options:
- for Attr in Options[Tool]:
- if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
- BuildRuleOrder = Options[Tool][Attr]
-
- AllTools = set(list(ModuleOptions.keys()) + list(PlatformOptions.keys()) +
- list(PlatformModuleOptions.keys()) + list(ModuleTypeOptions.keys()) +
- list(self.ToolDefinition.keys()))
- BuildOptions = defaultdict(lambda: defaultdict(str))
- for Tool in AllTools:
- for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
- if Tool not in Options:
- continue
- for Attr in Options[Tool]:
- #
- # Do not generate it in Makefile
- #
- if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
- continue
- Value = Options[Tool][Attr]
- # check if override is indicated
- if Value.startswith('='):
- BuildOptions[Tool][Attr] = mws.handleWsMacro(Value[1:])
- else:
- if Attr != 'PATH':
- BuildOptions[Tool][Attr] += " " + mws.handleWsMacro(Value)
- else:
- BuildOptions[Tool][Attr] = mws.handleWsMacro(Value)
-
- return BuildOptions, BuildRuleOrder
-
-#
-# extend lists contained in a dictionary with lists stored in another dictionary
-# if CopyToDict is not derived from defaultdict(list), this may raise an exception
-#
-def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
- for Key in CopyFromDict:
- CopyToDict[Key].extend(CopyFromDict[Key])
-
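As the comment notes, ExtendCopyDictionaryLists relies on the destination defaulting missing keys to lists. A minimal usage sketch (the snake_case copy below is mine, same logic as the function above):

```python
from collections import defaultdict

def extend_copy_dictionary_lists(copy_to, copy_from):
    # copy_to must default missing keys to an empty list, otherwise
    # indexing a key absent from copy_to raises KeyError
    for key in copy_from:
        copy_to[key].extend(copy_from[key])

dst = defaultdict(list)
dst['a'].append(1)
extend_copy_dictionary_lists(dst, {'a': [2, 3], 'b': [4]})
print(dict(dst))  # {'a': [1, 2, 3], 'b': [4]}
```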
-# Create a directory specified by a set of path elements and return the full path
-def _MakeDir(PathList):
- RetVal = path.join(*PathList)
- CreateDirectory(RetVal)
- return RetVal
-
-## ModuleAutoGen class
-#
-# This class encapsulates the AutoGen behaviors for the build tools. In addition to
-# generating AutoGen.h and AutoGen.c, it will generate the *.depex file according
-# to the [depex] section in the module's INF file.
-#
-class ModuleAutoGen(AutoGen):
- # call super().__init__ then call the worker function with different parameter count
- def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
- if not hasattr(self, "_Init"):
- self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args)
- self._Init = True
-
- ## Cache the timestamps of metafiles of every module in a class attribute
- #
- TimeDict = {}
-
- def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
- # check if this module is employed by active platform
- if not PlatformAutoGen(Workspace, args[0], Target, Toolchain, Arch).ValidModule(MetaFile):
- EdkLogger.verbose("Module [%s] for [%s] is not employed by active platform\n" \
- % (MetaFile, Arch))
- return None
- return super(ModuleAutoGen, cls).__new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
-
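ModuleAutoGen.__new__ above vetoes construction by returning None when the module is not employed by the active platform; because __new__ then does not return an instance of the class, __init__ is never called. A minimal sketch of the pattern (the class and module names are illustrative, not from BaseTools):

```python
class Filtered:
    VALID = {'ModuleA', 'ModuleB'}  # stand-in for the platform's module list

    def __new__(cls, name):
        if name not in cls.VALID:
            return None              # veto: __init__ is skipped entirely
        return super().__new__(cls)

    def __init__(self, name):
        self.name = name

print(Filtered('ModuleA').name)  # ModuleA
print(Filtered('Bogus'))         # None
```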
- ## Initialize ModuleAutoGen
- #
- # @param Workspace EdkIIWorkspaceBuild object
- # @param ModuleFile The path of module file
- # @param Target Build target (DEBUG, RELEASE)
- # @param Toolchain Name of tool chain
- # @param Arch The arch the module supports
- # @param PlatformFile Platform meta-file
- #
- def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, Arch, PlatformFile):
- EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]" % (ModuleFile, Arch))
- GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (ModuleFile, Arch, Toolchain, Target)
-
- self.Workspace = Workspace
- self.WorkspaceDir = Workspace.WorkspaceDir
- self.MetaFile = ModuleFile
- self.PlatformInfo = PlatformAutoGen(Workspace, PlatformFile, Target, Toolchain, Arch)
-
- self.SourceDir = self.MetaFile.SubDir
- self.SourceDir = mws.relpath(self.SourceDir, self.WorkspaceDir)
-
- self.ToolChain = Toolchain
- self.BuildTarget = Target
- self.Arch = Arch
- self.ToolChainFamily = self.PlatformInfo.ToolChainFamily
- self.BuildRuleFamily = self.PlatformInfo.BuildRuleFamily
-
- self.IsCodeFileCreated = False
- self.IsAsBuiltInfCreated = False
- self.DepexGenerated = False
-
- self.BuildDatabase = self.Workspace.BuildDatabase
- self.BuildRuleOrder = None
- self.BuildTime = 0
-
- self._PcdComments = OrderedListDict()
- self._GuidComments = OrderedListDict()
- self._ProtocolComments = OrderedListDict()
- self._PpiComments = OrderedListDict()
- self._BuildTargets = None
- self._IntroBuildTargetList = None
- self._FinalBuildTargetList = None
- self._FileTypes = None
-
- self.AutoGenDepSet = set()
- self.ReferenceModules = []
- self.ConstPcd = {}
-
- ## hash() operator of ModuleAutoGen
- #
- # The module file path and arch string will be used to represent
- # hash value of this object
- #
- # @retval int Hash value of the module file path and arch
- #
- @cached_class_function
- def __hash__(self):
- return hash((self.MetaFile, self.Arch))
-
- def __repr__(self):
- return "%s [%s]" % (self.MetaFile, self.Arch)
-
- # Get FixedAtBuild Pcds of this Module
- @cached_property
- def FixedAtBuildPcds(self):
- RetVal = []
- for Pcd in self.ModulePcdList:
- if Pcd.Type != TAB_PCDS_FIXED_AT_BUILD:
- continue
- if Pcd not in RetVal:
- RetVal.append(Pcd)
- return RetVal
-
- @cached_property
- def FixedVoidTypePcds(self):
- RetVal = {}
- for Pcd in self.FixedAtBuildPcds:
- if Pcd.DatumType == TAB_VOID:
- if '{}.{}'.format(Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in RetVal:
- RetVal['{}.{}'.format(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)] = Pcd.DefaultValue
- return RetVal
-
- @property
- def UniqueBaseName(self):
- BaseName = self.Name
- for Module in self.PlatformInfo.ModuleAutoGenList:
- if Module.MetaFile == self.MetaFile:
- continue
- if Module.Name == self.Name:
- if uuid.UUID(Module.Guid) == uuid.UUID(self.Guid):
- EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
- ' %s\n %s' % (Module.MetaFile, self.MetaFile))
- BaseName = '%s_%s' % (self.Name, self.Guid)
- return BaseName
-
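UniqueBaseName above disambiguates modules that share a BaseName by appending the GUID, and errors out when both name and GUID collide. A simplified sketch (helper name is mine, and it uses plain case-insensitive string comparison instead of uuid.UUID):

```python
def unique_base_name(name, guid, other_modules):
    # other_modules: iterable of (name, guid) for the rest of the platform
    for other_name, other_guid in other_modules:
        if other_name != name:
            continue
        if other_guid.lower() == guid.lower():
            raise ValueError('Modules have same BaseName and FILE_GUID')
        return '%s_%s' % (name, guid)  # same name, different GUID: append GUID
    return name

print(unique_base_name('Shell', 'AAAA', [('Shell', 'BBBB')]))  # Shell_AAAA
```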
- # Macros could be used in build_rule.txt (also Makefile)
- @cached_property
- def Macros(self):
- return OrderedDict((
- ("WORKSPACE" ,self.WorkspaceDir),
- ("MODULE_NAME" ,self.Name),
- ("MODULE_NAME_GUID" ,self.UniqueBaseName),
- ("MODULE_GUID" ,self.Guid),
- ("MODULE_VERSION" ,self.Version),
- ("MODULE_TYPE" ,self.ModuleType),
- ("MODULE_FILE" ,str(self.MetaFile)),
- ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName),
- ("MODULE_RELATIVE_DIR" ,self.SourceDir),
- ("MODULE_DIR" ,self.SourceDir),
- ("BASE_NAME" ,self.Name),
- ("ARCH" ,self.Arch),
- ("TOOLCHAIN" ,self.ToolChain),
- ("TOOLCHAIN_TAG" ,self.ToolChain),
- ("TOOL_CHAIN_TAG" ,self.ToolChain),
- ("TARGET" ,self.BuildTarget),
- ("BUILD_DIR" ,self.PlatformInfo.BuildDir),
- ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
- ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
- ("MODULE_BUILD_DIR" ,self.BuildDir),
- ("OUTPUT_DIR" ,self.OutputDir),
- ("DEBUG_DIR" ,self.DebugDir),
- ("DEST_DIR_OUTPUT" ,self.OutputDir),
- ("DEST_DIR_DEBUG" ,self.DebugDir),
- ("PLATFORM_NAME" ,self.PlatformInfo.Name),
- ("PLATFORM_GUID" ,self.PlatformInfo.Guid),
- ("PLATFORM_VERSION" ,self.PlatformInfo.Version),
- ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir),
- ("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, self.PlatformInfo.SourceDir)),
- ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir),
- ("FFS_OUTPUT_DIR" ,self.FfsOutputDir)
- ))
-
- ## Return the module build data object
- @cached_property
- def Module(self):
- return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
-
- ## Return the module name
- @cached_property
- def Name(self):
- return self.Module.BaseName
-
- ## Return the module DxsFile if it exists
- @cached_property
- def DxsFile(self):
- return self.Module.DxsFile
-
- ## Return the module meta-file GUID
- @cached_property
- def Guid(self):
- #
- # To build the same module more than once, a module whose FILE_GUID is overridden gets
- # the file name <FILE_GUID><module>.inf, while the relative path (self.MetaFile.File) is the real path
- # in the DSC. The overriding GUID can be retrieved from the file name.
- #
- if os.path.basename(self.MetaFile.File) != os.path.basename(self.MetaFile.Path):
- #
- # Length of GUID is 36
- #
- return os.path.basename(self.MetaFile.Path)[:36]
- return self.Module.Guid
-
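The FILE_GUID override detection in the Guid property compares basenames and slices the first 36 characters (the canonical GUID string length). A hypothetical standalone version (`effective_guid` is my name, not a BaseTools function):

```python
GUID_LEN = 36  # "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"

def effective_guid(real_basename, built_basename, inf_guid):
    # an overridden module is copied to "<FILE_GUID><original name>.inf",
    # so a differing basename carries the overriding GUID as its prefix
    if built_basename != real_basename:
        return built_basename[:GUID_LEN]
    return inf_guid

print(effective_guid(
    'MyDriver.inf',
    '12345678-1234-1234-1234-123456789abcMyDriver.inf',
    'deadbeef-0000-0000-0000-000000000000'))
# 12345678-1234-1234-1234-123456789abc
```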
- ## Return the module version
- @cached_property
- def Version(self):
- return self.Module.Version
-
- ## Return the module type
- @cached_property
- def ModuleType(self):
- return self.Module.ModuleType
-
- ## Return the component type (for Edk.x style of module)
- @cached_property
- def ComponentType(self):
- return self.Module.ComponentType
-
- ## Return the build type
- @cached_property
- def BuildType(self):
- return self.Module.BuildType
-
- ## Return the PCD_IS_DRIVER setting
- @cached_property
- def PcdIsDriver(self):
- return self.Module.PcdIsDriver
-
- ## Return the autogen version, i.e. module meta-file version
- @cached_property
- def AutoGenVersion(self):
- return self.Module.AutoGenVersion
-
- ## Check if the module is library or not
- @cached_property
- def IsLibrary(self):
- return bool(self.Module.LibraryClass)
-
- ## Check if the module is binary module or not
- @cached_property
- def IsBinaryModule(self):
- return self.Module.IsBinaryModule
-
- ## Return the directory to store intermediate files of the module
- @cached_property
- def BuildDir(self):
- return _MakeDir((
- self.PlatformInfo.BuildDir,
- self.Arch,
- self.SourceDir,
- self.MetaFile.BaseName
- ))
-
- ## Return the directory to store the intermediate object files of the module
- @cached_property
- def OutputDir(self):
- return _MakeDir((self.BuildDir, "OUTPUT"))
-
- ## Return the directory path to store ffs file
- @cached_property
- def FfsOutputDir(self):
- if GlobalData.gFdfParser:
- return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
- return ''
-
- ## Return the directory to store auto-generated source files of the module
- @cached_property
- def DebugDir(self):
- return _MakeDir((self.BuildDir, "DEBUG"))
-
- ## Return the paths of custom makefiles
- @cached_property
- def CustomMakefile(self):
- RetVal = {}
- for Type in self.Module.CustomMakefile:
- MakeType = gMakeTypeMap[Type] if Type in gMakeTypeMap else 'nmake'
- File = os.path.join(self.SourceDir, self.Module.CustomMakefile[Type])
- RetVal[MakeType] = File
- return RetVal
-
- ## Return the directory of the makefile
- #
- # @retval string The directory string of module's makefile
- #
- @cached_property
- def MakeFileDir(self):
- return self.BuildDir
-
- ## Return build command string
- #
- # @retval string Build command string
- #
- @cached_property
- def BuildCommand(self):
- return self.PlatformInfo.BuildCommand
-
- ## Get object list of all packages the module and its dependent libraries belong to
- #
- # @retval list The list of package objects
- #
- @cached_property
- def DerivedPackageList(self):
- PackageList = []
- for M in [self.Module] + self.DependentLibraryList:
- for Package in M.Packages:
- if Package in PackageList:
- continue
- PackageList.append(Package)
- return PackageList
-
- ## Get the depex string
- #
- # @return A string containing all depex expressions.
- def _GetDepexExpresionString(self):
- DepexStr = ''
- DepexList = []
- ## DPX_SOURCE in the Defines section.
- if self.Module.DxsFile:
- return DepexStr
- for M in [self.Module] + self.DependentLibraryList:
- Filename = M.MetaFile.Path
- InfObj = InfSectionParser.InfSectionParser(Filename)
- DepexExpressionList = InfObj.GetDepexExpresionList()
- for DepexExpression in DepexExpressionList:
- for key in DepexExpression:
- Arch, ModuleType = key
- DepexExpr = [x for x in DepexExpression[key] if not str(x).startswith('#')]
- # If the build module type is USER_DEFINED,
- # all the different DEPEX section tags would be copied into the As Built INF file
- # and there would be separate DEPEX section tags
- if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
- if (Arch.upper() == self.Arch.upper()) and (ModuleType.upper() != TAB_ARCH_COMMON):
- DepexList.append({(Arch, ModuleType): DepexExpr})
- else:
- if Arch.upper() == TAB_ARCH_COMMON or \
- (Arch.upper() == self.Arch.upper() and \
- ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
- DepexList.append({(Arch, ModuleType): DepexExpr})
-
- # the build module type is USER_DEFINED.
- if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
- for Depex in DepexList:
- for key in Depex:
- DepexStr += '[Depex.%s.%s]\n' % key
- DepexStr += '\n'.join('# '+ val for val in Depex[key])
- DepexStr += '\n\n'
- if not DepexStr:
- return '[Depex.%s]\n' % self.Arch
- return DepexStr
-
- # the build module type is not USER_DEFINED.
- Count = 0
- for Depex in DepexList:
- Count += 1
- if DepexStr != '':
- DepexStr += ' AND '
- DepexStr += '('
- for D in Depex.values():
- DepexStr += ' '.join(val for val in D)
- Index = DepexStr.find('END')
- if Index > -1 and Index == len(DepexStr) - 3:
- DepexStr = DepexStr[:-3]
- DepexStr = DepexStr.strip()
- DepexStr += ')'
- if Count == 1:
- DepexStr = DepexStr.lstrip('(').rstrip(')').strip()
- if not DepexStr:
- return '[Depex.%s]\n' % self.Arch
- return '[Depex.%s]\n# ' % self.Arch + DepexStr
-
- ## Merge dependency expression
- #
- # @retval list The token list of the dependency expression after parsing
- #
- @cached_property
- def DepexList(self):
- if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
- return {}
-
- DepexList = []
- #
- # Append depex from dependent libraries, if not a "BEFORE" or "AFTER" expression
- #
- for M in [self.Module] + self.DependentLibraryList:
- Inherited = False
- for D in M.Depex[self.Arch, self.ModuleType]:
- if DepexList != []:
- DepexList.append('AND')
- DepexList.append('(')
- # replace D with its value if D is a FixedAtBuild PCD
- NewList = []
- for item in D:
- if '.' not in item:
- NewList.append(item)
- else:
- FixedVoidTypePcds = {}
- if item in self.FixedVoidTypePcds:
- FixedVoidTypePcds = self.FixedVoidTypePcds
- elif M in self.PlatformInfo.LibraryAutoGenList:
- Index = self.PlatformInfo.LibraryAutoGenList.index(M)
- FixedVoidTypePcds = self.PlatformInfo.LibraryAutoGenList[Index].FixedVoidTypePcds
- if item not in FixedVoidTypePcds:
- EdkLogger.error("build", FORMAT_INVALID, "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type in the module.".format(item))
- else:
- Value = FixedVoidTypePcds[item]
- if len(Value.split(',')) != 16:
- EdkLogger.error("build", FORMAT_INVALID,
- "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type and 16 bytes in the module.".format(item))
- NewList.append(Value)
- DepexList.extend(NewList)
- if DepexList[-1] == 'END': # no need for an END at this time
- DepexList.pop()
- DepexList.append(')')
- Inherited = True
- if Inherited:
- EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexList))
- if 'BEFORE' in DepexList or 'AFTER' in DepexList:
- break
- if len(DepexList) > 0:
- EdkLogger.verbose('')
- return {self.ModuleType:DepexList}
-
- ## Merge dependency expression
- #
- # @retval list The token list of the dependency expression after parsing
- #
- @cached_property
- def DepexExpressionDict(self):
- if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
- return {}
-
- DepexExpressionString = ''
- #
- # Append depex from dependent libraries, if not a "BEFORE" or "AFTER" expression
- #
- for M in [self.Module] + self.DependentLibraryList:
- Inherited = False
- for D in M.DepexExpression[self.Arch, self.ModuleType]:
- if DepexExpressionString != '':
- DepexExpressionString += ' AND '
- DepexExpressionString += '('
- DepexExpressionString += D
- DepexExpressionString = DepexExpressionString.rstrip('END').strip()
- DepexExpressionString += ')'
- Inherited = True
- if Inherited:
- EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexExpressionString))
- if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexExpressionString:
- break
- if len(DepexExpressionString) > 0:
- EdkLogger.verbose('')
-
- return {self.ModuleType:DepexExpressionString}
-
- # Get the TianoCore user extensions, including those of dependent libraries.
- # @retval A list containing the TianoCore user extensions.
- #
- def _GetTianoCoreUserExtensionList(self):
- TianoCoreUserExtentionList = []
- for M in [self.Module] + self.DependentLibraryList:
- Filename = M.MetaFile.Path
- InfObj = InfSectionParser.InfSectionParser(Filename)
- TianoCoreUserExtenList = InfObj.GetUserExtensionTianoCore()
- for TianoCoreUserExtent in TianoCoreUserExtenList:
- for Section in TianoCoreUserExtent:
- ItemList = Section.split(TAB_SPLIT)
- Arch = self.Arch
- if len(ItemList) == 4:
- Arch = ItemList[3]
- if Arch.upper() == TAB_ARCH_COMMON or Arch.upper() == self.Arch.upper():
- TianoCoreList = []
- TianoCoreList.extend([TAB_SECTION_START + Section + TAB_SECTION_END])
- TianoCoreList.extend(TianoCoreUserExtent[Section][:])
- TianoCoreList.append('\n')
- TianoCoreUserExtentionList.append(TianoCoreList)
-
- return TianoCoreUserExtentionList
-
- ## Return the list of specification version required for the module
- #
- # @retval list The list of specification defined in module file
- #
- @cached_property
- def Specification(self):
- return self.Module.Specification
-
- ## Tool option for the module build
- #
- # @param PlatformInfo The object of PlatformBuildInfo
- # @retval dict The dict containing valid options
- #
- @cached_property
- def BuildOption(self):
- RetVal, self.BuildRuleOrder = self.PlatformInfo.ApplyBuildOption(self.Module)
- if self.BuildRuleOrder:
- self.BuildRuleOrder = ['.%s' % Ext for Ext in self.BuildRuleOrder.split()]
- return RetVal
-
- ## Get include path list from tool option for the module build
- #
- # @retval list The include path list
- #
- @cached_property
- def BuildOptionIncPathList(self):
- #
- # Regular expression for finding include directories; the difference between MSFT and INTEL/GCC/RVCT
- # is that the former uses /I while the latter use -I to specify include directories
- #
- if self.PlatformInfo.ToolChainFamily in (TAB_COMPILER_MSFT):
- BuildOptIncludeRegEx = gBuildOptIncludePatternMsft
- elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'RVCT'):
- BuildOptIncludeRegEx = gBuildOptIncludePatternOther
- else:
- #
- # New ToolChainFamily; it is unknown whether there is an option to specify include directories
- #
- return []
-
- RetVal = []
- for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
- try:
- FlagOption = self.BuildOption[Tool]['FLAGS']
- except KeyError:
- FlagOption = ''
-
- if self.ToolChainFamily != 'RVCT':
- IncPathList = [NormPath(Path, self.Macros) for Path in BuildOptIncludeRegEx.findall(FlagOption)]
- else:
- #
- # RVCT may specify a list of directory separated by commas
- #
- IncPathList = []
- for Path in BuildOptIncludeRegEx.findall(FlagOption):
- PathList = GetSplitList(Path, TAB_COMMA_SPLIT)
- IncPathList.extend(NormPath(PathEntry, self.Macros) for PathEntry in PathList)
-
- #
- # EDK II modules must not reference header files outside of the packages they depend on or
- # within the module's directory tree. Report an error on violation.
- #
- if GlobalData.gDisableIncludePathCheck == False:
- for Path in IncPathList:
- if (Path not in self.IncludePathList) and (CommonPath([Path, self.MetaFile.Dir]) != self.MetaFile.Dir):
- ErrMsg = "The include directory %s specified in %s FLAGS '%s' is invalid for this EDK II module" % (Path, Tool, FlagOption)
- EdkLogger.error("build",
- PARAMETER_INVALID,
- ExtraData=ErrMsg,
- File=str(self.MetaFile))
- RetVal += IncPathList
- return RetVal
-
- ## Return a list of files which can be built from source
- #
- # What kind of files can be built is determined by build rules in
- # $(CONF_DIRECTORY)/build_rule.txt and toolchain family.
- #
- @cached_property
- def SourceFileList(self):
- RetVal = []
- ToolChainTagSet = {"", TAB_STAR, self.ToolChain}
- ToolChainFamilySet = {"", TAB_STAR, self.ToolChainFamily, self.BuildRuleFamily}
- for F in self.Module.Sources:
- # match tool chain
- if F.TagName not in ToolChainTagSet:
- EdkLogger.debug(EdkLogger.DEBUG_9, "Toolchain [%s] is specified for file [%s], "
- "but toolchain [%s] is currently in use" % (F.TagName, str(F), self.ToolChain))
- continue
- # match tool chain family or build rule family
- if F.ToolChainFamily not in ToolChainFamilySet:
- EdkLogger.debug(
- EdkLogger.DEBUG_0,
- "The file [%s] must be built by tools of [%s], " \
- "but current toolchain family is [%s], buildrule family is [%s]" \
- % (str(F), F.ToolChainFamily, self.ToolChainFamily, self.BuildRuleFamily))
- continue
-
- # add the file path into search path list for file including
- if F.Dir not in self.IncludePathList:
- self.IncludePathList.insert(0, F.Dir)
- RetVal.append(F)
-
- self._MatchBuildRuleOrder(RetVal)
-
- for F in RetVal:
- self._ApplyBuildRule(F, TAB_UNKNOWN_FILE)
- return RetVal
-
- def _MatchBuildRuleOrder(self, FileList):
- Order_Dict = {}
- self.BuildOption
- for SingleFile in FileList:
- if self.BuildRuleOrder and SingleFile.Ext in self.BuildRuleOrder and SingleFile.Ext in self.BuildRules:
- key = SingleFile.Path.rsplit(SingleFile.Ext,1)[0]
- if key in Order_Dict:
- Order_Dict[key].append(SingleFile.Ext)
- else:
- Order_Dict[key] = [SingleFile.Ext]
-
- RemoveList = []
- for F in Order_Dict:
- if len(Order_Dict[F]) > 1:
- Order_Dict[F].sort(key=lambda i: self.BuildRuleOrder.index(i))
- for Ext in Order_Dict[F][1:]:
- RemoveList.append(F + Ext)
-
- for item in RemoveList:
- FileList.remove(item)
-
- return FileList
-
- ## Return the list of unicode files
- @cached_property
- def UnicodeFileList(self):
- return self.FileTypes.get(TAB_UNICODE_FILE,[])
-
- ## Return the list of vfr files
- @cached_property
- def VfrFileList(self):
- return self.FileTypes.get(TAB_VFR_FILE, [])
-
- ## Return the list of Image Definition files
- @cached_property
- def IdfFileList(self):
- return self.FileTypes.get(TAB_IMAGE_FILE,[])
-
- ## Return a list of files which can be built from binary
- #
- # "Building" binary files simply copies them to the build directory.
- #
- # @retval list The list of files which can be built later
- #
- @cached_property
- def BinaryFileList(self):
- RetVal = []
- for F in self.Module.Binaries:
- if F.Target not in [TAB_ARCH_COMMON, TAB_STAR] and F.Target != self.BuildTarget:
- continue
- RetVal.append(F)
- self._ApplyBuildRule(F, F.Type, BinaryFileList=RetVal)
- return RetVal
-
- @cached_property
- def BuildRules(self):
- RetVal = {}
- BuildRuleDatabase = BuildRule
- for Type in BuildRuleDatabase.FileTypeList:
- #first try getting build rule by BuildRuleFamily
- RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.BuildRuleFamily]
- if not RuleObject:
- # build type is always module type, but ...
- if self.ModuleType != self.BuildType:
- RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.BuildRuleFamily]
- #second try getting build rule by ToolChainFamily
- if not RuleObject:
- RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.ToolChainFamily]
- if not RuleObject:
- # build type is always module type, but ...
- if self.ModuleType != self.BuildType:
- RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.ToolChainFamily]
- if not RuleObject:
- continue
- RuleObject = RuleObject.Instantiate(self.Macros)
- RetVal[Type] = RuleObject
- for Ext in RuleObject.SourceFileExtList:
- RetVal[Ext] = RuleObject
- return RetVal
-
- def _ApplyBuildRule(self, File, FileType, BinaryFileList=None):
- if self._BuildTargets is None:
- self._IntroBuildTargetList = set()
- self._FinalBuildTargetList = set()
- self._BuildTargets = defaultdict(set)
- self._FileTypes = defaultdict(set)
-
- if not BinaryFileList:
- BinaryFileList = self.BinaryFileList
-
- SubDirectory = os.path.join(self.OutputDir, File.SubDir)
- if not os.path.exists(SubDirectory):
- CreateDirectory(SubDirectory)
- LastTarget = None
- RuleChain = set()
- SourceList = [File]
- Index = 0
- #
- # Make sure to get build rule order value
- #
- self.BuildOption
-
- while Index < len(SourceList):
- Source = SourceList[Index]
- Index = Index + 1
-
- if Source != File:
- CreateDirectory(Source.Dir)
-
- if File.IsBinary and File == Source and File in BinaryFileList:
- # Skip all files that are not binary libraries
- if not self.IsLibrary:
- continue
- RuleObject = self.BuildRules[TAB_DEFAULT_BINARY_FILE]
- elif FileType in self.BuildRules:
- RuleObject = self.BuildRules[FileType]
- elif Source.Ext in self.BuildRules:
- RuleObject = self.BuildRules[Source.Ext]
- else:
- # stop when no more rules apply
- if LastTarget:
- self._FinalBuildTargetList.add(LastTarget)
- break
-
- FileType = RuleObject.SourceFileType
- self._FileTypes[FileType].add(Source)
-
- # stop at STATIC_LIBRARY for library
- if self.IsLibrary and FileType == TAB_STATIC_LIBRARY:
- if LastTarget:
- self._FinalBuildTargetList.add(LastTarget)
- break
-
- Target = RuleObject.Apply(Source, self.BuildRuleOrder)
- if not Target:
- if LastTarget:
- self._FinalBuildTargetList.add(LastTarget)
- break
- elif not Target.Outputs:
- # Only do build for target with outputs
- self._FinalBuildTargetList.add(Target)
-
- self._BuildTargets[FileType].add(Target)
-
- if not Source.IsBinary and Source == File:
- self._IntroBuildTargetList.add(Target)
-
- # to avoid cyclic rule
- if FileType in RuleChain:
- break
-
- RuleChain.add(FileType)
- SourceList.extend(Target.Outputs)
- LastTarget = Target
- FileType = TAB_UNKNOWN_FILE
-
- @cached_property
- def Targets(self):
- if self._BuildTargets is None:
- self._IntroBuildTargetList = set()
- self._FinalBuildTargetList = set()
- self._BuildTargets = defaultdict(set)
- self._FileTypes = defaultdict(set)
-
- #TRICK: call SourceFileList property to apply build rule for source files
- self.SourceFileList
-
- #TRICK: call BinaryFileList property to apply build rule for binary files
- self.BinaryFileList
-
- return self._BuildTargets
-
- @cached_property
- def IntroTargetList(self):
- self.Targets
- return self._IntroBuildTargetList
-
- @cached_property
- def CodaTargetList(self):
- self.Targets
- return self._FinalBuildTargetList
-
- @cached_property
- def FileTypes(self):
- self.Targets
- return self._FileTypes
-
- ## Get the list of package objects the module depends on
- #
- # @retval list The package object list
- #
- @cached_property
- def DependentPackageList(self):
- return self.Module.Packages
-
- ## Return the list of auto-generated code files
- #
- # @retval list The list of auto-generated files
- #
- @cached_property
- def AutoGenFileList(self):
- AutoGenUniIdf = self.BuildType != 'UEFI_HII'
- UniStringBinBuffer = BytesIO()
- IdfGenBinBuffer = BytesIO()
- RetVal = {}
- AutoGenC = TemplateString()
- AutoGenH = TemplateString()
- StringH = TemplateString()
- StringIdf = TemplateString()
- GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUniIdf, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer)
- #
- # AutoGen.c is generated if there are library classes in the INF, or there are object files
- #
- if str(AutoGenC) != "" and (len(self.Module.LibraryClasses) > 0
- or TAB_OBJECT_FILE in self.FileTypes):
- AutoFile = PathClass(gAutoGenCodeFileName, self.DebugDir)
- RetVal[AutoFile] = str(AutoGenC)
- self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
- if str(AutoGenH) != "":
- AutoFile = PathClass(gAutoGenHeaderFileName, self.DebugDir)
- RetVal[AutoFile] = str(AutoGenH)
- self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
- if str(StringH) != "":
- AutoFile = PathClass(gAutoGenStringFileName % {"module_name":self.Name}, self.DebugDir)
- RetVal[AutoFile] = str(StringH)
- self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
- if UniStringBinBuffer is not None and UniStringBinBuffer.getvalue() != b"":
- AutoFile = PathClass(gAutoGenStringFormFileName % {"module_name":self.Name}, self.OutputDir)
- RetVal[AutoFile] = UniStringBinBuffer.getvalue()
- AutoFile.IsBinary = True
- self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
- if UniStringBinBuffer is not None:
- UniStringBinBuffer.close()
- if str(StringIdf) != "":
- AutoFile = PathClass(gAutoGenImageDefFileName % {"module_name":self.Name}, self.DebugDir)
- RetVal[AutoFile] = str(StringIdf)
- self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
- if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue() != b"":
- AutoFile = PathClass(gAutoGenIdfFileName % {"module_name":self.Name}, self.OutputDir)
- RetVal[AutoFile] = IdfGenBinBuffer.getvalue()
- AutoFile.IsBinary = True
- self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
- if IdfGenBinBuffer is not None:
- IdfGenBinBuffer.close()
- return RetVal
-
- ## Return the list of library modules explicitly or implicitly used by this module
- @cached_property
- def DependentLibraryList(self):
- # only merge library classes and PCD for non-library module
- if self.IsLibrary:
- return []
- return self.PlatformInfo.ApplyLibraryInstance(self.Module)
-
- ## Get the list of PCDs from the current module
- #
- # @retval list The list of PCDs
- #
- @cached_property
- def ModulePcdList(self):
- # apply PCD settings from platform
- RetVal = self.PlatformInfo.ApplyPcdSetting(self.Module, self.Module.Pcds)
- ExtendCopyDictionaryLists(self._PcdComments, self.Module.PcdComments)
- return RetVal
-
- ## Get the list of PCDs from dependent libraries
- #
- # @retval list The list of PCDs
- #
- @cached_property
- def LibraryPcdList(self):
- if self.IsLibrary:
- return []
- RetVal = []
- Pcds = set()
- # get PCDs from dependent libraries
- for Library in self.DependentLibraryList:
- PcdsInLibrary = OrderedDict()
- ExtendCopyDictionaryLists(self._PcdComments, Library.PcdComments)
- for Key in Library.Pcds:
- # skip duplicated PCDs
- if Key in self.Module.Pcds or Key in Pcds:
- continue
- Pcds.add(Key)
- PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
- RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self.Module, PcdsInLibrary, Library=Library))
- return RetVal
-
- ## Get the GUID value mapping
- #
- # @retval dict The mapping between GUID cname and its value
- #
- @cached_property
- def GuidList(self):
- RetVal = OrderedDict(self.Module.Guids)
- for Library in self.DependentLibraryList:
- RetVal.update(Library.Guids)
- ExtendCopyDictionaryLists(self._GuidComments, Library.GuidComments)
- ExtendCopyDictionaryLists(self._GuidComments, self.Module.GuidComments)
- return RetVal
-
- @cached_property
- def GetGuidsUsedByPcd(self):
- RetVal = OrderedDict(self.Module.GetGuidsUsedByPcd())
- for Library in self.DependentLibraryList:
- RetVal.update(Library.GetGuidsUsedByPcd())
- return RetVal
- ## Get the protocol value mapping
- #
- # @retval dict The mapping between protocol cname and its value
- #
- @cached_property
- def ProtocolList(self):
- RetVal = OrderedDict(self.Module.Protocols)
- for Library in self.DependentLibraryList:
- RetVal.update(Library.Protocols)
- ExtendCopyDictionaryLists(self._ProtocolComments, Library.ProtocolComments)
- ExtendCopyDictionaryLists(self._ProtocolComments, self.Module.ProtocolComments)
- return RetVal
-
- ## Get the PPI value mapping
- #
- # @retval dict The mapping between PPI cname and its value
- #
- @cached_property
- def PpiList(self):
- RetVal = OrderedDict(self.Module.Ppis)
- for Library in self.DependentLibraryList:
- RetVal.update(Library.Ppis)
- ExtendCopyDictionaryLists(self._PpiComments, Library.PpiComments)
- ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiComments)
- return RetVal
-
- ## Get the list of include search paths
- #
- # @retval list The list of include paths
- #
- @cached_property
- def IncludePathList(self):
- RetVal = []
- RetVal.append(self.MetaFile.Dir)
- RetVal.append(self.DebugDir)
-
- for Package in self.Module.Packages:
- PackageDir = mws.join(self.WorkspaceDir, Package.MetaFile.Dir)
- if PackageDir not in RetVal:
- RetVal.append(PackageDir)
- IncludesList = Package.Includes
- if Package._PrivateIncludes:
- if not self.MetaFile.OriginalPath.Path.startswith(PackageDir):
- IncludesList = list(set(Package.Includes).difference(set(Package._PrivateIncludes)))
- for Inc in IncludesList:
- if Inc not in RetVal:
- RetVal.append(str(Inc))
- return RetVal
-
- @cached_property
- def IncludePathLength(self):
- return sum(len(inc)+1 for inc in self.IncludePathList)
-
- ## Get HII EX PCDs which may be used by VFR
- #
- # efivarstore used by VFR may relate to HII EX PCDs.
- # Get the variable name and GUID from the efivarstore and the HII EX PCD,
- # and list the HII EX PCDs in the As-Built INF if both name and GUID match.
- #
- # @retval list HII EX PCDs
- #
- def _GetPcdsMaybeUsedByVfr(self):
- if not self.SourceFileList:
- return []
-
- NameGuids = set()
- for SrcFile in self.SourceFileList:
- if SrcFile.Ext.lower() != '.vfr':
- continue
- Vfri = os.path.join(self.OutputDir, SrcFile.BaseName + '.i')
- if not os.path.exists(Vfri):
- continue
- VfriFile = open(Vfri, 'r')
- Content = VfriFile.read()
- VfriFile.close()
- Pos = Content.find('efivarstore')
- while Pos != -1:
- #
- # Make sure 'efivarstore' is the start of an efivarstore statement,
- # in case the value of 'name' (name = efivarstore) is 'efivarstore'
- #
- Index = Pos - 1
- while Index >= 0 and Content[Index] in ' \t\r\n':
- Index -= 1
- if Index >= 0 and Content[Index] != ';':
- Pos = Content.find('efivarstore', Pos + len('efivarstore'))
- continue
- #
- # 'efivarstore' must be followed by name and guid
- #
- Name = gEfiVarStoreNamePattern.search(Content, Pos)
- if not Name:
- break
- Guid = gEfiVarStoreGuidPattern.search(Content, Pos)
- if not Guid:
- break
- NameArray = _ConvertStringToByteArray('L"' + Name.group(1) + '"')
- NameGuids.add((NameArray, GuidStructureStringToGuidString(Guid.group(1))))
- Pos = Content.find('efivarstore', Name.end())
- if not NameGuids:
- return []
- HiiExPcds = []
- for Pcd in self.PlatformInfo.Platform.Pcds.values():
- if Pcd.Type != TAB_PCDS_DYNAMIC_EX_HII:
- continue
- for SkuInfo in Pcd.SkuInfoList.values():
- Value = GuidValue(SkuInfo.VariableGuid, self.PlatformInfo.PackageList, self.MetaFile.Path)
- if not Value:
- continue
- Name = _ConvertStringToByteArray(SkuInfo.VariableName)
- Guid = GuidStructureStringToGuidString(Value)
- if (Name, Guid) in NameGuids and Pcd not in HiiExPcds:
- HiiExPcds.append(Pcd)
- break
-
- return HiiExPcds
-
- def _GenOffsetBin(self):
- VfrUniBaseName = {}
- for SourceFile in self.Module.Sources:
- if SourceFile.Type.upper() == ".VFR" :
- #
- # search the .map file to find the offset of vfr binary in the PE32+/TE file.
- #
- VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
- elif SourceFile.Type.upper() == ".UNI" :
- #
- # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
- #
- VfrUniBaseName["UniOffsetName"] = (self.Name + "Strings")
-
- if not VfrUniBaseName:
- return None
- MapFileName = os.path.join(self.OutputDir, self.Name + ".map")
- EfiFileName = os.path.join(self.OutputDir, self.Name + ".efi")
- VfrUniOffsetList = GetVariableOffset(MapFileName, EfiFileName, list(VfrUniBaseName.values()))
- if not VfrUniOffsetList:
- return None
-
- OutputName = '%sOffset.bin' % self.Name
- UniVfrOffsetFileName = os.path.join( self.OutputDir, OutputName)
-
- try:
- fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
- except:
- EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
-
- # Use an instance of BytesIO to cache data
- fStringIO = BytesIO()
-
- for Item in VfrUniOffsetList:
- if (Item[0].find("Strings") != -1):
- #
- # UNI offset in image.
- # GUID + Offset
- # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
- #
- UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
- fStringIO.write(UniGuid)
- UniValue = pack ('Q', int (Item[1], 16))
- fStringIO.write (UniValue)
- else:
- #
- # VFR binary offset in image.
- # GUID + Offset
- # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
- #
- VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
- fStringIO.write(VfrGuid)
- VfrValue = pack ('Q', int (Item[1], 16))
- fStringIO.write (VfrValue)
- #
- # write data into file.
- #
- try :
- fInputfile.write (fStringIO.getvalue())
- except:
- EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
- "file is locked or in use by another application." % UniVfrOffsetFileName, None)
-
- fStringIO.close ()
- fInputfile.close ()
- return OutputName
-
- @cached_property
- def OutputFile(self):
- retVal = set()
- OutputDir = self.OutputDir.replace('\\', '/').strip('/')
- DebugDir = self.DebugDir.replace('\\', '/').strip('/')
- for Item in self.CodaTargetList:
- File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
- retVal.add(File)
- if self.DepexGenerated:
- retVal.add(self.Name + '.depex')
-
- Bin = self._GenOffsetBin()
- if Bin:
- retVal.add(Bin)
-
- for Root, Dirs, Files in os.walk(OutputDir):
- for File in Files:
- if File.lower().endswith('.pdb'):
- retVal.add(File)
-
- return retVal
-
- ## Create the As-Built INF file for the module
- #
- def CreateAsBuiltInf(self):
-
- if self.IsAsBuiltInfCreated:
- return
-
- # Skip INF file generation for libraries
- if self.IsLibrary:
- return
-
- # Skip the following code for modules with no source files
- if not self.SourceFileList:
- return
-
- # Skip the following code for modules that already contain binary files
- if self.BinaryFileList:
- return
-
- ### TODO: How to handle mixed source and binary modules
-
- # Find all DynamicEx and PatchableInModule PCDs used by this module and dependent libraries
- # Also find all packages that the DynamicEx PCDs depend on
- Pcds = []
- PatchablePcds = []
- Packages = []
- PcdCheckList = []
- PcdTokenSpaceList = []
- for Pcd in self.ModulePcdList + self.LibraryPcdList:
- if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
- PatchablePcds.append(Pcd)
- PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_PATCHABLE_IN_MODULE))
- elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
- if Pcd not in Pcds:
- Pcds.append(Pcd)
- PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX))
- PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC))
- PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName)
- GuidList = OrderedDict(self.GuidList)
- for TokenSpace in self.GetGuidsUsedByPcd:
- # If the token space is not referred to by a patch PCD or Ex PCD, remove the GUID from the GUID list.
- # The GUIDs in the GUIDs section should really be the GUIDs in the source INF or referred to by Ex and patch PCDs
- if TokenSpace not in PcdTokenSpaceList and TokenSpace in GuidList:
- GuidList.pop(TokenSpace)
- CheckList = (GuidList, self.PpiList, self.ProtocolList, PcdCheckList)
- for Package in self.DerivedPackageList:
- if Package in Packages:
- continue
- BeChecked = (Package.Guids, Package.Ppis, Package.Protocols, Package.Pcds)
- Found = False
- for Index in range(len(BeChecked)):
- for Item in CheckList[Index]:
- if Item in BeChecked[Index]:
- Packages.append(Package)
- Found = True
- break
- if Found:
- break
-
- VfrPcds = self._GetPcdsMaybeUsedByVfr()
- for Pkg in self.PlatformInfo.PackageList:
- if Pkg in Packages:
- continue
- for VfrPcd in VfrPcds:
- if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX) in Pkg.Pcds or
- (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC) in Pkg.Pcds):
- Packages.append(Pkg)
- break
-
- ModuleType = SUP_MODULE_DXE_DRIVER if self.ModuleType == SUP_MODULE_UEFI_DRIVER and self.DepexGenerated else self.ModuleType
- DriverType = self.PcdIsDriver if self.PcdIsDriver else ''
- Guid = self.Guid
- MDefs = self.Module.Defines
-
- AsBuiltInfDict = {
- 'module_name' : self.Name,
- 'module_guid' : Guid,
- 'module_module_type' : ModuleType,
- 'module_version_string' : [MDefs['VERSION_STRING']] if 'VERSION_STRING' in MDefs else [],
- 'pcd_is_driver_string' : [],
- 'module_uefi_specification_version' : [],
- 'module_pi_specification_version' : [],
- 'module_entry_point' : self.Module.ModuleEntryPointList,
- 'module_unload_image' : self.Module.ModuleUnloadImageList,
- 'module_constructor' : self.Module.ConstructorList,
- 'module_destructor' : self.Module.DestructorList,
- 'module_shadow' : [MDefs['SHADOW']] if 'SHADOW' in MDefs else [],
- 'module_pci_vendor_id' : [MDefs['PCI_VENDOR_ID']] if 'PCI_VENDOR_ID' in MDefs else [],
- 'module_pci_device_id' : [MDefs['PCI_DEVICE_ID']] if 'PCI_DEVICE_ID' in MDefs else [],
- 'module_pci_class_code' : [MDefs['PCI_CLASS_CODE']] if 'PCI_CLASS_CODE' in MDefs else [],
- 'module_pci_revision' : [MDefs['PCI_REVISION']] if 'PCI_REVISION' in MDefs else [],
- 'module_build_number' : [MDefs['BUILD_NUMBER']] if 'BUILD_NUMBER' in MDefs else [],
- 'module_spec' : [MDefs['SPEC']] if 'SPEC' in MDefs else [],
- 'module_uefi_hii_resource_section' : [MDefs['UEFI_HII_RESOURCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [],
- 'module_uni_file' : [MDefs['MODULE_UNI_FILE']] if 'MODULE_UNI_FILE' in MDefs else [],
- 'module_arch' : self.Arch,
- 'package_item' : [Package.MetaFile.File.replace('\\', '/') for Package in Packages],
- 'binary_item' : [],
- 'patchablepcd_item' : [],
- 'pcd_item' : [],
- 'protocol_item' : [],
- 'ppi_item' : [],
- 'guid_item' : [],
- 'flags_item' : [],
- 'libraryclasses_item' : []
- }
-
- if 'MODULE_UNI_FILE' in MDefs:
- UNIFile = os.path.join(self.MetaFile.Dir, MDefs['MODULE_UNI_FILE'])
- if os.path.isfile(UNIFile):
- shutil.copy2(UNIFile, self.OutputDir)
-
- if self.AutoGenVersion > int(gInfSpecVersion, 0):
- AsBuiltInfDict['module_inf_version'] = '0x%08x' % self.AutoGenVersion
- else:
- AsBuiltInfDict['module_inf_version'] = gInfSpecVersion
-
- if DriverType:
- AsBuiltInfDict['pcd_is_driver_string'].append(DriverType)
-
- if 'UEFI_SPECIFICATION_VERSION' in self.Specification:
- AsBuiltInfDict['module_uefi_specification_version'].append(self.Specification['UEFI_SPECIFICATION_VERSION'])
- if 'PI_SPECIFICATION_VERSION' in self.Specification:
- AsBuiltInfDict['module_pi_specification_version'].append(self.Specification['PI_SPECIFICATION_VERSION'])
-
- OutputDir = self.OutputDir.replace('\\', '/').strip('/')
- DebugDir = self.DebugDir.replace('\\', '/').strip('/')
- for Item in self.CodaTargetList:
- File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
- if os.path.isabs(File):
- File = File.replace('\\', '/').strip('/').replace(OutputDir, '').strip('/')
- if Item.Target.Ext.lower() == '.aml':
- AsBuiltInfDict['binary_item'].append('ASL|' + File)
- elif Item.Target.Ext.lower() == '.acpi':
- AsBuiltInfDict['binary_item'].append('ACPI|' + File)
- elif Item.Target.Ext.lower() == '.efi':
- AsBuiltInfDict['binary_item'].append('PE32|' + self.Name + '.efi')
- else:
- AsBuiltInfDict['binary_item'].append('BIN|' + File)
- if not self.DepexGenerated:
- DepexFile = os.path.join(self.OutputDir, self.Name + '.depex')
- if os.path.exists(DepexFile):
- self.DepexGenerated = True
- if self.DepexGenerated:
- if self.ModuleType in [SUP_MODULE_PEIM]:
- AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + self.Name + '.depex')
- elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
- AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + self.Name + '.depex')
- elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
- AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + self.Name + '.depex')
-
- Bin = self._GenOffsetBin()
- if Bin:
- AsBuiltInfDict['binary_item'].append('BIN|%s' % Bin)
-
- for Root, Dirs, Files in os.walk(OutputDir):
- for File in Files:
- if File.lower().endswith('.pdb'):
- AsBuiltInfDict['binary_item'].append('DISPOSABLE|' + File)
- HeaderComments = self.Module.HeaderComments
- StartPos = 0
- for Index in range(len(HeaderComments)):
- if HeaderComments[Index].find('@BinaryHeader') != -1:
- HeaderComments[Index] = HeaderComments[Index].replace('@BinaryHeader', '@file')
- StartPos = Index
- break
- AsBuiltInfDict['header_comments'] = '\n'.join(HeaderComments[StartPos:]).replace(':#', '://')
- AsBuiltInfDict['tail_comments'] = '\n'.join(self.Module.TailComments)
-
- GenList = [
- (self.ProtocolList, self._ProtocolComments, 'protocol_item'),
- (self.PpiList, self._PpiComments, 'ppi_item'),
- (GuidList, self._GuidComments, 'guid_item')
- ]
- for Item in GenList:
- for CName in Item[0]:
- Comments = '\n '.join(Item[1][CName]) if CName in Item[1] else ''
- Entry = Comments + '\n ' + CName if Comments else CName
- AsBuiltInfDict[Item[2]].append(Entry)
- PatchList = parsePcdInfoFromMapFile(
- os.path.join(self.OutputDir, self.Name + '.map'),
- os.path.join(self.OutputDir, self.Name + '.efi')
- )
- if PatchList:
- for Pcd in PatchablePcds:
- TokenCName = Pcd.TokenCName
- for PcdItem in GlobalData.MixedPcd:
- if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
- TokenCName = PcdItem[0]
- break
- for PatchPcd in PatchList:
- if TokenCName == PatchPcd[0]:
- break
- else:
- continue
- PcdValue = ''
- if Pcd.DatumType == 'BOOLEAN':
- BoolValue = Pcd.DefaultValue.upper()
- if BoolValue == 'TRUE':
- Pcd.DefaultValue = '1'
- elif BoolValue == 'FALSE':
- Pcd.DefaultValue = '0'
-
- if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
- HexFormat = '0x%02x'
- if Pcd.DatumType == TAB_UINT16:
- HexFormat = '0x%04x'
- elif Pcd.DatumType == TAB_UINT32:
- HexFormat = '0x%08x'
- elif Pcd.DatumType == TAB_UINT64:
- HexFormat = '0x%016x'
- PcdValue = HexFormat % int(Pcd.DefaultValue, 0)
- else:
- if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize == '':
- EdkLogger.error("build", AUTOGEN_ERROR,
- "Unknown [MaxDatumSize] of PCD [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName)
- )
- ArraySize = int(Pcd.MaxDatumSize, 0)
- PcdValue = Pcd.DefaultValue
- if PcdValue[0] != '{':
- Unicode = False
- if PcdValue[0] == 'L':
- Unicode = True
- PcdValue = PcdValue.lstrip('L')
- PcdValue = eval(PcdValue)
- NewValue = '{'
- for Index in range(0, len(PcdValue)):
- if Unicode:
- CharVal = ord(PcdValue[Index])
- NewValue = NewValue + '0x%02x' % (CharVal & 0x00FF) + ', ' \
- + '0x%02x' % (CharVal >> 8) + ', '
- else:
- NewValue = NewValue + '0x%02x' % (ord(PcdValue[Index]) % 0x100) + ', '
- Padding = '0x00, '
- if Unicode:
- Padding = Padding * 2
- ArraySize = ArraySize // 2
- if ArraySize < (len(PcdValue) + 1):
- if Pcd.MaxSizeUserSet:
- EdkLogger.error("build", AUTOGEN_ERROR,
- "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
- )
- else:
- ArraySize = len(PcdValue) + 1
- if ArraySize > len(PcdValue) + 1:
- NewValue = NewValue + Padding * (ArraySize - len(PcdValue) - 1)
- PcdValue = NewValue + Padding.strip().rstrip(',') + '}'
- elif len(PcdValue.split(',')) <= ArraySize:
- PcdValue = PcdValue.rstrip('}') + ', 0x00' * (ArraySize - len(PcdValue.split(',')))
- PcdValue += '}'
- else:
- if Pcd.MaxSizeUserSet:
- EdkLogger.error("build", AUTOGEN_ERROR,
- "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
- )
- else:
- ArraySize = len(PcdValue) + 1
- PcdItem = '%s.%s|%s|0x%X' % \
- (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, PatchPcd[1])
- PcdComments = ''
- if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
- PcdComments = '\n '.join(self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName])
- if PcdComments:
- PcdItem = PcdComments + '\n ' + PcdItem
- AsBuiltInfDict['patchablepcd_item'].append(PcdItem)
-
- for Pcd in Pcds + VfrPcds:
- PcdCommentList = []
- HiiInfo = ''
- TokenCName = Pcd.TokenCName
- for PcdItem in GlobalData.MixedPcd:
- if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
- TokenCName = PcdItem[0]
- break
- if Pcd.Type == TAB_PCDS_DYNAMIC_EX_HII:
- for SkuName in Pcd.SkuInfoList:
- SkuInfo = Pcd.SkuInfoList[SkuName]
- HiiInfo = '## %s|%s|%s' % (SkuInfo.VariableName, SkuInfo.VariableGuid, SkuInfo.VariableOffset)
- break
- if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
- PcdCommentList = self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName][:]
- if HiiInfo:
- UsageIndex = -1
- UsageStr = ''
- for Index, Comment in enumerate(PcdCommentList):
- for Usage in UsageList:
- if Comment.find(Usage) != -1:
- UsageStr = Usage
- UsageIndex = Index
- break
- if UsageIndex != -1:
- PcdCommentList[UsageIndex] = '## %s %s %s' % (UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
- else:
- PcdCommentList.append('## UNDEFINED ' + HiiInfo)
- PcdComments = '\n '.join(PcdCommentList)
- PcdEntry = Pcd.TokenSpaceGuidCName + '.' + TokenCName
- if PcdComments:
- PcdEntry = PcdComments + '\n ' + PcdEntry
- AsBuiltInfDict['pcd_item'].append(PcdEntry)
- for Item in self.BuildOption:
- if 'FLAGS' in self.BuildOption[Item]:
- AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_FLAGS = %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, self.Arch, Item, self.BuildOption[Item]['FLAGS'].strip()))
-
- # Generated LibraryClasses section in comments.
- for Library in self.LibraryAutoGenList:
- AsBuiltInfDict['libraryclasses_item'].append(Library.MetaFile.File.replace('\\', '/'))
-
- # Generated UserExtensions TianoCore section.
- # All tianocore user extensions are copied.
- UserExtStr = ''
- for TianoCore in self._GetTianoCoreUserExtensionList():
- UserExtStr += '\n'.join(TianoCore)
- ExtensionFile = os.path.join(self.MetaFile.Dir, TianoCore[1])
- if os.path.isfile(ExtensionFile):
- shutil.copy2(ExtensionFile, self.OutputDir)
- AsBuiltInfDict['userextension_tianocore_item'] = UserExtStr
-
- # Generated depex expression section in comments.
- DepexExpression = self._GetDepexExpresionString()
- AsBuiltInfDict['depexsection_item'] = DepexExpression if DepexExpression else ''
-
- AsBuiltInf = TemplateString()
- AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfDict))
-
- SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
-
- self.IsAsBuiltInfCreated = True
-
- def CopyModuleToCache(self):
- FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
- CreateDirectory (FileDir)
- HashFile = path.join(self.BuildDir, self.Name + '.hash')
- if os.path.exists(HashFile):
- CopyFileOnChange(HashFile, FileDir)
- ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
- if os.path.exists(ModuleFile):
- CopyFileOnChange(ModuleFile, FileDir)
-
- if not self.OutputFile:
- Ma = self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
- self.OutputFile = Ma.Binaries
-
- for File in self.OutputFile:
- File = str(File)
- if not os.path.isabs(File):
- File = os.path.join(self.OutputDir, File)
- if os.path.exists(File):
- sub_dir = os.path.relpath(File, self.OutputDir)
- destination_file = os.path.join(FileDir, sub_dir)
- destination_dir = os.path.dirname(destination_file)
- CreateDirectory(destination_dir)
- CopyFileOnChange(File, destination_dir)
-
- def AttemptModuleCacheCopy(self):
- # If library or Module is binary do not skip by hash
- if self.IsBinaryModule:
- return False
- # .inc is contains binary information so do not skip by hash as well
- for f_ext in self.SourceFileList:
- if '.inc' in str(f_ext):
- return False
- FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
- HashFile = path.join(FileDir, self.Name + '.hash')
- if os.path.exists(HashFile):
- f = open(HashFile, 'r')
- CacheHash = f.read()
- f.close()
- self.GenModuleHash()
- if GlobalData.gModuleHash[self.Arch][self.Name]:
- if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
- for root, dir, files in os.walk(FileDir):
- for f in files:
- if self.Name + '.hash' in f:
- CopyFileOnChange(HashFile, self.BuildDir)
- else:
- File = path.join(root, f)
- sub_dir = os.path.relpath(File, FileDir)
- destination_file = os.path.join(self.OutputDir, sub_dir)
- destination_dir = os.path.dirname(destination_file)
- CreateDirectory(destination_dir)
- CopyFileOnChange(File, destination_dir)
- if self.Name == "PcdPeim" or self.Name == "PcdDxe":
- CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
- return True
- return False
-
- ## Create makefile for the module and its dependent libraries
- #
- # @param CreateLibraryMakeFile Flag indicating if or not the makefiles of
- # dependent libraries will be created
- #
- @cached_class_function
- def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
- # nest this function inside its only caller.
- def CreateTimeStamp():
- FileSet = {self.MetaFile.Path}
-
- for SourceFile in self.Module.Sources:
- FileSet.add (SourceFile.Path)
-
- for Lib in self.DependentLibraryList:
- FileSet.add (Lib.MetaFile.Path)
-
- for f in self.AutoGenDepSet:
- FileSet.add (f.Path)
-
- if os.path.exists (self.TimeStampPath):
- os.remove (self.TimeStampPath)
- with open(self.TimeStampPath, 'w+') as file:
- for f in FileSet:
- print(f, file=file)
-
- # Ignore generating makefile when it is a binary module
- if self.IsBinaryModule:
- return
-
- self.GenFfsList = GenFfsList
- if not self.IsLibrary and CreateLibraryMakeFile:
- for LibraryAutoGen in self.LibraryAutoGenList:
- LibraryAutoGen.CreateMakeFile()
-
- # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
- if not GlobalData.gUseHashCache and self.CanSkip():
- return
-
- if len(self.CustomMakefile) == 0:
- Makefile = GenMake.ModuleMakefile(self)
- else:
- Makefile = GenMake.CustomMakefile(self)
- if Makefile.Generate():
- EdkLogger.debug(EdkLogger.DEBUG_9, "Generated makefile for module %s [%s]" %
- (self.Name, self.Arch))
- else:
- EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation of makefile for module %s [%s]" %
- (self.Name, self.Arch))
-
- CreateTimeStamp()
-
- def CopyBinaryFiles(self):
- for File in self.Module.Binaries:
- SrcPath = File.Path
- DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
- CopyLongFilePath(SrcPath, DstPath)
- ## Create autogen code for the module and its dependent libraries
- #
- # @param CreateLibraryCodeFile Flag indicating if or not the code of
- # dependent libraries will be created
- #
- def CreateCodeFile(self, CreateLibraryCodeFile=True):
- if self.IsCodeFileCreated:
- return
-
- # Need to generate PcdDatabase even PcdDriver is binarymodule
- if self.IsBinaryModule and self.PcdIsDriver != '':
- CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
- return
- if self.IsBinaryModule:
- if self.IsLibrary:
- self.CopyBinaryFiles()
- return
-
- if not self.IsLibrary and CreateLibraryCodeFile:
- for LibraryAutoGen in self.LibraryAutoGenList:
- LibraryAutoGen.CreateCodeFile()
-
- # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
- if not GlobalData.gUseHashCache and self.CanSkip():
- return
-
- AutoGenList = []
- IgoredAutoGenList = []
-
- for File in self.AutoGenFileList:
- if GenC.Generate(File.Path, self.AutoGenFileList[File], File.IsBinary):
- AutoGenList.append(str(File))
- else:
- IgoredAutoGenList.append(str(File))
-
-
- for ModuleType in self.DepexList:
- # Ignore empty [depex] section or [depex] section for SUP_MODULE_USER_DEFINED module
- if len(self.DepexList[ModuleType]) == 0 or ModuleType == SUP_MODULE_USER_DEFINED or ModuleType == SUP_MODULE_HOST_APPLICATION:
- continue
-
- Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
- DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
-
- if len(Dpx.PostfixNotation) != 0:
- self.DepexGenerated = True
-
- if Dpx.Generate(path.join(self.OutputDir, DpxFile)):
- AutoGenList.append(str(DpxFile))
- else:
- IgoredAutoGenList.append(str(DpxFile))
-
- if IgoredAutoGenList == []:
- EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] files for module %s [%s]" %
- (" ".join(AutoGenList), self.Name, self.Arch))
- elif AutoGenList == []:
- EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation of [%s] files for module %s [%s]" %
- (" ".join(IgoredAutoGenList), self.Name, self.Arch))
- else:
- EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] (skipped %s) files for module %s [%s]" %
- (" ".join(AutoGenList), " ".join(IgoredAutoGenList), self.Name, self.Arch))
-
- self.IsCodeFileCreated = True
- return AutoGenList
-
- ## Summarize the ModuleAutoGen objects of all libraries used by this module
- @cached_property
- def LibraryAutoGenList(self):
- RetVal = []
- for Library in self.DependentLibraryList:
- La = ModuleAutoGen(
- self.Workspace,
- Library.MetaFile,
- self.BuildTarget,
- self.ToolChain,
- self.Arch,
- self.PlatformInfo.MetaFile
- )
- if La not in RetVal:
- RetVal.append(La)
- for Lib in La.CodaTargetList:
- self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE)
- return RetVal
-
- def GenModuleHash(self):
- # Initialize a dictionary for each arch type
- if self.Arch not in GlobalData.gModuleHash:
- GlobalData.gModuleHash[self.Arch] = {}
-
- # Early exit if module or library has been hashed and is in memory
- if self.Name in GlobalData.gModuleHash[self.Arch]:
- return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
-
- # Initialze hash object
- m = hashlib.md5()
-
- # Add Platform level hash
- m.update(GlobalData.gPlatformHash.encode('utf-8'))
-
- # Add Package level hash
- if self.DependentPackageList:
- for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
- if Pkg.PackageName in GlobalData.gPackageHash:
- m.update(GlobalData.gPackageHash[Pkg.PackageName].encode('utf-8'))
-
- # Add Library hash
- if self.LibraryAutoGenList:
- for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
- if Lib.Name not in GlobalData.gModuleHash[self.Arch]:
- Lib.GenModuleHash()
- m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].encode('utf-8'))
-
- # Add Module self
- f = open(str(self.MetaFile), 'rb')
- Content = f.read()
- f.close()
- m.update(Content)
-
- # Add Module's source files
- if self.SourceFileList:
- for File in sorted(self.SourceFileList, key=lambda x: str(x)):
- f = open(str(File), 'rb')
- Content = f.read()
- f.close()
- m.update(Content)
-
- GlobalData.gModuleHash[self.Arch][self.Name] = m.hexdigest()
-
- return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
-
- ## Decide whether we can skip the ModuleAutoGen process
- def CanSkipbyHash(self):
- # Hashing feature is off
- if not GlobalData.gUseHashCache:
- return False
-
- # Initialize a dictionary for each arch type
- if self.Arch not in GlobalData.gBuildHashSkipTracking:
- GlobalData.gBuildHashSkipTracking[self.Arch] = dict()
-
- # If library or Module is binary do not skip by hash
- if self.IsBinaryModule:
- return False
-
- # .inc is contains binary information so do not skip by hash as well
- for f_ext in self.SourceFileList:
- if '.inc' in str(f_ext):
- return False
-
- # Use Cache, if exists and if Module has a copy in cache
- if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy():
- return True
-
- # Early exit for libraries that haven't yet finished building
- HashFile = path.join(self.BuildDir, self.Name + ".hash")
- if self.IsLibrary and not os.path.exists(HashFile):
- return False
-
- # Return a Boolean based on if can skip by hash, either from memory or from IO.
- if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch]:
- # If hashes are the same, SaveFileOnChange() will return False.
- GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = not SaveFileOnChange(HashFile, self.GenModuleHash(), True)
- return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
- else:
- return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
-
- ## Decide whether we can skip the ModuleAutoGen process
- # If any source file is newer than the module than we cannot skip
- #
- def CanSkip(self):
- if self.MakeFileDir in GlobalData.gSikpAutoGenCache:
- return True
- if not os.path.exists(self.TimeStampPath):
- return False
- #last creation time of the module
- DstTimeStamp = os.stat(self.TimeStampPath)[8]
-
- SrcTimeStamp = self.Workspace._SrcTimeStamp
- if SrcTimeStamp > DstTimeStamp:
- return False
-
- with open(self.TimeStampPath,'r') as f:
- for source in f:
- source = source.rstrip('\n')
- if not os.path.exists(source):
- return False
- if source not in ModuleAutoGen.TimeDict :
- ModuleAutoGen.TimeDict[source] = os.stat(source)[8]
- if ModuleAutoGen.TimeDict[source] > DstTimeStamp:
- return False
- GlobalData.gSikpAutoGenCache.add(self.MakeFileDir)
- return True
-
- @cached_property
- def TimeStampPath(self):
- return os.path.join(self.MakeFileDir, 'AutoGenTimeStamp')
+ @classmethod
+ def Cache(cls):
+ return cls.__ObjectCache
+
+#
+# The priority table used when overriding build options
+#
+PrioList = {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
+ "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+ "0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+ "0x10011" : 10, # TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
+ "0x00011" : 9, # ******_*********_****_COMMANDTYPE_ATTRIBUTE
+ "0x11101" : 8, # TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+ "0x01101" : 7, # ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+ "0x10101" : 6, # TARGET_*********_ARCH_***********_ATTRIBUTE
+ "0x00101" : 5, # ******_*********_ARCH_***********_ATTRIBUTE
+ "0x11001" : 4, # TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
+ "0x01001" : 3, # ******_TOOLCHAIN_****_***********_ATTRIBUTE
+ "0x10001" : 2, # TARGET_*********_****_***********_ATTRIBUTE
+ "0x00001" : 1} # ******_*********_****_***********_ATTRIBUTE (Lowest)
+## Calculate the priority value of the build option
+#
+# @param Key Build option key in the form TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+#
+# @retval Value Priority value based on the priority list.
+#
+def CalculatePriorityValue(Key):
+ Target, ToolChain, Arch, CommandType, Attr = Key.split('_')
+ PriorityValue = 0x11111
+ if Target == TAB_STAR:
+ PriorityValue &= 0x01111
+ if ToolChain == TAB_STAR:
+ PriorityValue &= 0x10111
+ if Arch == TAB_STAR:
+ PriorityValue &= 0x11011
+ if CommandType == TAB_STAR:
+ PriorityValue &= 0x11101
+ if Attr == TAB_STAR:
+ PriorityValue &= 0x11110
+
+ return PrioList["0x%0.5x" % PriorityValue]
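The masking scheme in CalculatePriorityValue can be exercised standalone. The snippet below is illustrative only and not part of the patch: it inlines TAB_STAR as '*' (as defined in Common.DataType) and uses an abbreviated copy of the PrioList table above.

```python
# Illustrative sketch of CalculatePriorityValue, not part of the patch.
# Assumption: TAB_STAR is the '*' wildcard, as in Common.DataType.
TAB_STAR = '*'

# Abbreviated copy of the PrioList table above.
PRIO_LIST = {
    "0x11111": 16,  # nothing wildcarded: highest priority
    "0x01111": 15,  # TARGET wildcarded
    "0x00001": 1,   # everything except ATTRIBUTE wildcarded: lowest priority
}

def calculate_priority_value(key):
    # Each of the five fields clears its own hex digit when wildcarded.
    target, toolchain, arch, command_type, attr = key.split('_')
    value = 0x11111
    if target == TAB_STAR:
        value &= 0x01111
    if toolchain == TAB_STAR:
        value &= 0x10111
    if arch == TAB_STAR:
        value &= 0x11011
    if command_type == TAB_STAR:
        value &= 0x11101
    if attr == TAB_STAR:
        value &= 0x11110
    return PRIO_LIST["0x%0.5x" % value]
```

A fully specified key such as DEBUG_GCC5_X64_CC_FLAGS maps to 16, while a key with every field except the attribute wildcarded maps to 1.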
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
new file mode 100644
index 000000000000..5bcc39bd380d
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -0,0 +1,147 @@
+## @file
+# Create a data pipe to pass platform and module build metadata to AutoGen worker processes
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+from __future__ import absolute_import
+from Workspace.WorkspaceDatabase import BuildDB
+from Workspace.WorkspaceCommon import GetModuleLibInstances
+import Common.GlobalData as GlobalData
+import os
+import pickle
+from pickle import HIGHEST_PROTOCOL
+
+class PCD_DATA():
+ def __init__(self,TokenCName,TokenSpaceGuidCName,Type,DatumType,SkuInfoList,DefaultValue,
+ MaxDatumSize,UserDefinedDefaultStoresFlag,validateranges,
+ validlists,expressions,CustomAttribute,TokenValue):
+ self.TokenCName = TokenCName
+ self.TokenSpaceGuidCName = TokenSpaceGuidCName
+ self.Type = Type
+ self.DatumType = DatumType
+ self.SkuInfoList = SkuInfoList
+ self.DefaultValue = DefaultValue
+ self.MaxDatumSize = MaxDatumSize
+ self.UserDefinedDefaultStoresFlag = UserDefinedDefaultStoresFlag
+ self.validateranges = validateranges
+ self.validlists = validlists
+ self.expressions = expressions
+ self.CustomAttribute = CustomAttribute
+ self.TokenValue = TokenValue
+
+class DataPipe(object):
+ def __init__(self, BuildDir=None):
+ self.data_container = {}
+ self.BuildDir = BuildDir
+
+class MemoryDataPipe(DataPipe):
+
+ def Get(self,key):
+ return self.data_container.get(key)
+
+ def dump(self,file_path):
+ with open(file_path,'wb') as fd:
+ pickle.dump(self.data_container,fd,pickle.HIGHEST_PROTOCOL)
+
+ def load(self,file_path):
+ with open(file_path,'rb') as fd:
+ self.data_container = pickle.load(fd)
+
+ @property
+ def DataContainer(self):
+ return self.data_container
+ @DataContainer.setter
+ def DataContainer(self,data):
+ self.data_container.update(data)
+
+ def FillData(self,PlatformInfo):
+ #Platform Pcds
+ self.DataContainer = {
+ "PLA_PCD" : [PCD_DATA(
+ pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type,
+ pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue,
+ pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.validateranges,
+ pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd.TokenValue)
+ for pcd in PlatformInfo.Platform.Pcds.values()]
+ }
+
+ #Platform Module Pcds
+ ModulePcds = {}
+ for m in PlatformInfo.Platform.Modules:
+ m_pcds = PlatformInfo.Platform.Modules[m].Pcds
+ if m_pcds:
+ ModulePcds[(m.File,m.Root)] = [PCD_DATA(
+ pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type,
+ pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue,
+ pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.validateranges,
+ pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd.TokenValue)
+ for pcd in PlatformInfo.Platform.Modules[m].Pcds.values()]
+
+
+ self.DataContainer = {"MOL_PCDS":ModulePcds}
+
+ #Module's Library Instance
+ ModuleLibs = {}
+ for m in PlatformInfo.Platform.Modules:
+ module_obj = BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain]
+ Libs = GetModuleLibInstances(module_obj, PlatformInfo.Platform, BuildDB.BuildObject, PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain)
+ ModuleLibs[(m.File,m.Root,module_obj.Arch)] = [(l.MetaFile.File,l.MetaFile.Root,l.Arch) for l in Libs]
+ self.DataContainer = {"DEPS":ModuleLibs}
+
+ #Platform BuildOptions
+
+ platform_build_opt = PlatformInfo.EdkIIBuildOption
+
+ ToolDefinition = PlatformInfo.ToolDefinition
+ module_build_opt = {}
+ for m in PlatformInfo.Platform.Modules:
+ ModuleTypeOptions, PlatformModuleOptions = PlatformInfo.GetGlobalBuildOptions(BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain])
+ if ModuleTypeOptions or PlatformModuleOptions:
+ module_build_opt.update({(m.File,m.Root): {"ModuleTypeOptions":ModuleTypeOptions, "PlatformModuleOptions":PlatformModuleOptions}})
+
+ self.DataContainer = {"PLA_BO":platform_build_opt,
+ "TOOLDEF":ToolDefinition,
+ "MOL_BO":module_build_opt
+ }
+
+
+
+ #Platform Info
+ PInfo = {
+ "WorkspaceDir":PlatformInfo.Workspace.WorkspaceDir,
+ "Target":PlatformInfo.BuildTarget,
+ "ToolChain":PlatformInfo.Workspace.ToolChain,
+ "BuildRuleFile":PlatformInfo.BuildRule,
+ "Arch": PlatformInfo.Arch,
+ "ArchList":PlatformInfo.Workspace.ArchList,
+ "ActivePlatform":PlatformInfo.MetaFile
+ }
+ self.DataContainer = {'P_Info':PInfo}
+
+ self.DataContainer = {'M_Name':PlatformInfo.UniqueBaseName}
+
+ self.DataContainer = {"ToolChainFamily": PlatformInfo.ToolChainFamily}
+
+ self.DataContainer = {"BuildRuleFamily": PlatformInfo.BuildRuleFamily}
+
+ self.DataContainer = {"MixedPcd":GlobalData.MixedPcd}
+
+ self.DataContainer = {"BuildOptPcd":GlobalData.BuildOptionPcd}
+
+ self.DataContainer = {"BuildCommand": PlatformInfo.BuildCommand}
+
+ self.DataContainer = {"AsBuildModuleList": PlatformInfo._AsBuildModuleList}
+
+ self.DataContainer = {"G_defines": GlobalData.gGlobalDefines}
+
+ self.DataContainer = {"CL_defines": GlobalData.gCommandLineDefines}
+
+ self.DataContainer = {"Env_Var": {k:v for k, v in os.environ.items()}}
+
+ self.DataContainer = {"PackageList": [(dec.MetaFile,dec.Arch) for dec in PlatformInfo.PackageList]}
+
+ self.DataContainer = {"GuidDict": PlatformInfo.Platform._GuidDict}
+
+ self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
+
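The setter-based accumulation in MemoryDataPipe can be surprising on first read: every `self.DataContainer = {...}` assignment in FillData merges into one dict rather than replacing it. A minimal stand-in (illustrative names only, not part of the patch) shows that behavior and the pickle round trip used to hand the data to worker processes:

```python
import os
import pickle
import tempfile

# Stripped-down stand-in for MemoryDataPipe (illustrative, not part of
# the patch): the DataContainer setter merges each assigned dict into a
# single container, and dump()/load() round-trip it through pickle.
class MiniDataPipe:
    def __init__(self):
        self.data_container = {}

    @property
    def DataContainer(self):
        return self.data_container

    @DataContainer.setter
    def DataContainer(self, data):
        # Successive assignments accumulate rather than replace.
        self.data_container.update(data)

    def Get(self, key):
        return self.data_container.get(key)

    def dump(self, file_path):
        with open(file_path, 'wb') as fd:
            pickle.dump(self.data_container, fd, pickle.HIGHEST_PROTOCOL)

    def load(self, file_path):
        with open(file_path, 'rb') as fd:
            self.data_container = pickle.load(fd)

pipe = MiniDataPipe()
pipe.DataContainer = {"P_Info": {"Arch": "X64", "Target": "DEBUG"}}
pipe.DataContainer = {"ToolChainFamily": "GCC"}
with tempfile.TemporaryDirectory() as tmp:
    # Hypothetical file name, mirroring the GlobalVar_<guid>_<arch>.bin
    # convention mentioned in the V2 notes.
    path = os.path.join(tmp, "GlobalVar_demo_X64.bin")
    pipe.dump(path)
    restored = MiniDataPipe()
    restored.load(path)
```

After the round trip, `restored` holds the union of every dict assigned to `DataContainer`.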
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 4cb776206e90..4c3f4e3e55ae 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1627,11 +1627,11 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
TokenSpaceList = []
for Pcd in Info.ModulePcdList:
if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET and Pcd.TokenSpaceGuidCName not in TokenSpaceList:
TokenSpaceList.append(Pcd.TokenSpaceGuidCName)
- SkuMgr = Info.Workspace.Platform.SkuIdMgr
+ SkuMgr = Info.PlatformInfo.Platform.SkuIdMgr
AutoGenH.Append("\n// Definition of SkuId Array\n")
AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n")
# Add extern declarations to AutoGen.h if one or more Token Space GUIDs were found
if TokenSpaceList:
AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used in this module\n\n")
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
new file mode 100644
index 000000000000..f0a4afc3a664
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -0,0 +1,1908 @@
+## @file
+# Generate AutoGen.h, AutoGen.c, the makefile and the depex file for a module
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+from __future__ import absolute_import
+from AutoGen.AutoGen import AutoGen
+from Common.LongFilePathSupport import CopyLongFilePath
+from Common.BuildToolError import *
+from Common.DataType import *
+from Common.Misc import *
+from Common.StringUtils import NormPath,GetSplitList
+from collections import defaultdict
+from Workspace.WorkspaceCommon import OrderedListDict
+import os.path as path
+import copy
+import hashlib
+from . import InfSectionParser
+from . import GenC
+from . import GenMake
+from . import GenDepex
+from io import BytesIO
+from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile
+from Workspace.MetaFileCommentParser import UsageList
+from .GenPcdDb import CreatePcdDatabaseCode
+from Common.caching import cached_class_function
+from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo
+
+## Mapping Makefile type
+gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
+#
+# Regular expressions for finding include directories; the difference between MSFT and INTEL/GCC/RVCT
+# is that the former uses /I while the latter uses -I to specify include directories
+#
+gBuildOptIncludePatternMsft = re.compile(r"(?:.*?)/I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
+gBuildOptIncludePatternOther = re.compile(r"(?:.*?)-I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
+
+## default file name for AutoGen
+gAutoGenCodeFileName = "AutoGen.c"
+gAutoGenHeaderFileName = "AutoGen.h"
+gAutoGenStringFileName = "%(module_name)sStrDefs.h"
+gAutoGenStringFormFileName = "%(module_name)sStrDefs.hpk"
+gAutoGenDepexFileName = "%(module_name)s.depex"
+gAutoGenImageDefFileName = "%(module_name)sImgDefs.h"
+gAutoGenIdfFileName = "%(module_name)sIdf.hpk"
+gInfSpecVersion = "0x00010017"
+
+#
+# Match name = variable
+#
+gEfiVarStoreNamePattern = re.compile(r"\s*name\s*=\s*(\w+)")
+#
+# The GUID in an efivarstore statement must match the following format:
+# guid = {0xA04A27f4, 0xDF00, 0x4D42, {0xB5, 0x52, 0x39, 0x51, 0x13, 0x02, 0x11, 0x3D}}
+#
+gEfiVarStoreGuidPattern = re.compile(r"\s*guid\s*=\s*({.*?{.*?}\s*})")
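These two patterns drive the efivarstore parsing. As a standalone illustration (the input line is hypothetical, not from the patch), they can be exercised like this:

```python
import re

# Same expressions as above, written here as raw strings (illustrative
# only, not part of the patch).
name_pat = re.compile(r"\s*name\s*=\s*(\w+)")
guid_pat = re.compile(r"\s*guid\s*=\s*({.*?{.*?}\s*})")

# Hypothetical efivarstore fragment using the GUID format shown above.
vfr_line = ('name = MyEfiVar, '
            'guid = {0xA04A27f4, 0xDF00, 0x4D42, '
            '{0xB5, 0x52, 0x39, 0x51, 0x13, 0x02, 0x11, 0x3D}}')

var_name = name_pat.search(vfr_line).group(1)
var_guid = guid_pat.search(vfr_line).group(1)
```

The non-greedy `.*?` parts keep the GUID match anchored to the inner and outer brace pairs.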
+
+#
+# Template string used to generate the As-Built INF
+#
+gAsBuiltInfHeaderString = TemplateString("""${header_comments}
+
+# DO NOT EDIT
+# FILE auto-generated
+
+[Defines]
+ INF_VERSION = ${module_inf_version}
+ BASE_NAME = ${module_name}
+ FILE_GUID = ${module_guid}
+ MODULE_TYPE = ${module_module_type}${BEGIN}
+ VERSION_STRING = ${module_version_string}${END}${BEGIN}
+ PCD_IS_DRIVER = ${pcd_is_driver_string}${END}${BEGIN}
+ UEFI_SPECIFICATION_VERSION = ${module_uefi_specification_version}${END}${BEGIN}
+ PI_SPECIFICATION_VERSION = ${module_pi_specification_version}${END}${BEGIN}
+ ENTRY_POINT = ${module_entry_point}${END}${BEGIN}
+ UNLOAD_IMAGE = ${module_unload_image}${END}${BEGIN}
+ CONSTRUCTOR = ${module_constructor}${END}${BEGIN}
+ DESTRUCTOR = ${module_destructor}${END}${BEGIN}
+ SHADOW = ${module_shadow}${END}${BEGIN}
+ PCI_VENDOR_ID = ${module_pci_vendor_id}${END}${BEGIN}
+ PCI_DEVICE_ID = ${module_pci_device_id}${END}${BEGIN}
+ PCI_CLASS_CODE = ${module_pci_class_code}${END}${BEGIN}
+ PCI_REVISION = ${module_pci_revision}${END}${BEGIN}
+ BUILD_NUMBER = ${module_build_number}${END}${BEGIN}
+ SPEC = ${module_spec}${END}${BEGIN}
+ UEFI_HII_RESOURCE_SECTION = ${module_uefi_hii_resource_section}${END}${BEGIN}
+ MODULE_UNI_FILE = ${module_uni_file}${END}
+
+[Packages.${module_arch}]${BEGIN}
+ ${package_item}${END}
+
+[Binaries.${module_arch}]${BEGIN}
+ ${binary_item}${END}
+
+[PatchPcd.${module_arch}]${BEGIN}
+ ${patchablepcd_item}
+${END}
+
+[Protocols.${module_arch}]${BEGIN}
+ ${protocol_item}
+${END}
+
+[Ppis.${module_arch}]${BEGIN}
+ ${ppi_item}
+${END}
+
+[Guids.${module_arch}]${BEGIN}
+ ${guid_item}
+${END}
+
+[PcdEx.${module_arch}]${BEGIN}
+ ${pcd_item}
+${END}
+
+[LibraryClasses.${module_arch}]
+## @LIB_INSTANCES${BEGIN}
+# ${libraryclasses_item}${END}
+
+${depexsection_item}
+
+${userextension_tianocore_item}
+
+${tail_comments}
+
+[BuildOptions.${module_arch}]
+## @AsBuilt${BEGIN}
+## ${flags_item}${END}
+""")
+#
+# Extend lists contained in a dictionary with lists stored in another dictionary.
+# If CopyToDict is not a defaultdict(list), this may raise an exception.
+#
+def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
+ for Key in CopyFromDict:
+ CopyToDict[Key].extend(CopyFromDict[Key])
+
+# Create a directory specified by a set of path elements and return the full path
+def _MakeDir(PathList):
+ RetVal = path.join(*PathList)
+ CreateDirectory(RetVal)
+ return RetVal
+
+#
+# Convert string to C format array
+#
+def _ConvertStringToByteArray(Value):
+ Value = Value.strip()
+ if not Value:
+ return None
+ if Value[0] == '{':
+ if not Value.endswith('}'):
+ return None
+ Value = Value.replace(' ', '').replace('{', '').replace('}', '')
+ ValFields = Value.split(',')
+ try:
+ for Index in range(len(ValFields)):
+ ValFields[Index] = str(int(ValFields[Index], 0))
+ except ValueError:
+ return None
+ Value = '{' + ','.join(ValFields) + '}'
+ return Value
+
+ Unicode = False
+ if Value.startswith('L"'):
+ if not Value.endswith('"'):
+ return None
+ Value = Value[1:]
+ Unicode = True
+ elif not Value.startswith('"') or not Value.endswith('"'):
+ return None
+
+ Value = eval(Value) # translate escape character
+ NewValue = '{'
+ for Index in range(0, len(Value)):
+ if Unicode:
+ NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ','
+ else:
+ NewValue = NewValue + str(ord(Value[Index]) % 0x100) + ','
+ Value = NewValue + '0}'
+ return Value
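For non-empty string literals, the quoted-string branch above reduces to the following sketch (illustrative only, not part of the patch; the real function also handles the `{...}` array form and empty input):

```python
# Minimal sketch of the string branch of _ConvertStringToByteArray, for
# non-empty literals only: an ASCII C string becomes comma-separated
# bytes plus a NUL terminator; an L"..." literal uses 16-bit code units.
def string_to_byte_array(value):
    unicode_str = value.startswith('L"')
    if unicode_str:
        value = value[1:]   # drop the leading L
    chars = eval(value)     # translate escape characters, as the original does
    modulus = 0x10000 if unicode_str else 0x100
    return '{' + ','.join(str(ord(c) % modulus) for c in chars) + ',0}'
```

For plain ASCII input the narrow and wide forms produce the same array, differing only in the per-character modulus.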
+
+## ModuleAutoGen class
+#
+# This class encapsulates the AutoGen behaviors for the build tools. In addition to
+# generating AutoGen.h and AutoGen.c, it generates the *.depex file according
+# to the [depex] section in the module's INF file.
+#
+class ModuleAutoGen(AutoGen):
+ # call super().__init__ then call the worker function with different parameter count
+ def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
+ if not hasattr(self, "_Init"):
+ self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args)
+ self._Init = True
+
+ ## Cache the timestamps of metafiles of every module in a class attribute
+ #
+ TimeDict = {}
+
+ def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
+        # Check if this module is employed by the active platform
+ if not PlatformInfo(Workspace, args[0], Target, Toolchain, Arch,args[-1]).ValidModule(MetaFile):
+ EdkLogger.verbose("Module [%s] for [%s] is not employed by active platform\n" \
+ % (MetaFile, Arch))
+ return None
+ return super(ModuleAutoGen, cls).__new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
+
+ ## Initialize ModuleAutoGen
+ #
+ # @param Workspace EdkIIWorkspaceBuild object
+ # @param ModuleFile The path of module file
+ # @param Target Build target (DEBUG, RELEASE)
+ # @param Toolchain Name of tool chain
+ # @param Arch The arch the module supports
+ # @param PlatformFile Platform meta-file
+ #
+ def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, Arch, PlatformFile,DataPipe):
+ EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]" % (ModuleFile, Arch))
+ GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (ModuleFile, Arch, Toolchain, Target)
+
+ self.Workspace = None
+ self.WorkspaceDir = ""
+ self.PlatformInfo = None
+ self.DataPipe = DataPipe
+ self.__init_platform_info__()
+ self.MetaFile = ModuleFile
+ self.SourceDir = self.MetaFile.SubDir
+ self.SourceDir = mws.relpath(self.SourceDir, self.WorkspaceDir)
+
+ self.ToolChain = Toolchain
+ self.BuildTarget = Target
+ self.Arch = Arch
+ self.ToolChainFamily = self.PlatformInfo.ToolChainFamily
+ self.BuildRuleFamily = self.PlatformInfo.BuildRuleFamily
+
+ self.IsCodeFileCreated = False
+ self.IsAsBuiltInfCreated = False
+ self.DepexGenerated = False
+
+ self.BuildDatabase = self.Workspace.BuildDatabase
+ self.BuildRuleOrder = None
+ self.BuildTime = 0
+
+ self._GuidComments = OrderedListDict()
+ self._ProtocolComments = OrderedListDict()
+ self._PpiComments = OrderedListDict()
+ self._BuildTargets = None
+ self._IntroBuildTargetList = None
+ self._FinalBuildTargetList = None
+ self._FileTypes = None
+
+ self.AutoGenDepSet = set()
+ self.ReferenceModules = []
+ self.ConstPcd = {}
+
+ def __init_platform_info__(self):
+ pinfo = self.DataPipe.Get("P_Info")
+ self.Workspace = WorkSpaceInfo(pinfo.get("WorkspaceDir"),pinfo.get("ActivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.get("ArchList"))
+ self.WorkspaceDir = pinfo.get("WorkspaceDir")
+ self.PlatformInfo = PlatformInfo(self.Workspace,pinfo.get("ActivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.get("Arch"),self.DataPipe)
+ ## hash() operator of ModuleAutoGen
+ #
+ # The module file path and arch string will be used to represent
+ # hash value of this object
+ #
+ # @retval int Hash value of the module file path and arch
+ #
+ @cached_class_function
+ def __hash__(self):
+ return hash((self.MetaFile, self.Arch))
+ def __repr__(self):
+ return "%s [%s]" % (self.MetaFile, self.Arch)
+
+ # Get FixedAtBuild Pcds of this Module
+ @cached_property
+ def FixedAtBuildPcds(self):
+ RetVal = []
+ for Pcd in self.ModulePcdList:
+ if Pcd.Type != TAB_PCDS_FIXED_AT_BUILD:
+ continue
+ if Pcd not in RetVal:
+ RetVal.append(Pcd)
+ return RetVal
+
+ @cached_property
+ def FixedVoidTypePcds(self):
+ RetVal = {}
+ for Pcd in self.FixedAtBuildPcds:
+ if Pcd.DatumType == TAB_VOID:
+ if '.'.join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName)) not in RetVal:
+ RetVal['.'.join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))] = Pcd.DefaultValue
+ return RetVal
+
+ @property
+ def UniqueBaseName(self):
+ ModuleNames = self.DataPipe.Get("M_Name")
+ if not ModuleNames:
+ return self.Name
+ return ModuleNames.get(self.Name,self.Name)
+
+ # Macros could be used in build_rule.txt (also Makefile)
+ @cached_property
+ def Macros(self):
+ return OrderedDict((
+ ("WORKSPACE" ,self.WorkspaceDir),
+ ("MODULE_NAME" ,self.Name),
+ ("MODULE_NAME_GUID" ,self.UniqueBaseName),
+ ("MODULE_GUID" ,self.Guid),
+ ("MODULE_VERSION" ,self.Version),
+ ("MODULE_TYPE" ,self.ModuleType),
+ ("MODULE_FILE" ,str(self.MetaFile)),
+ ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName),
+ ("MODULE_RELATIVE_DIR" ,self.SourceDir),
+ ("MODULE_DIR" ,self.SourceDir),
+ ("BASE_NAME" ,self.Name),
+ ("ARCH" ,self.Arch),
+ ("TOOLCHAIN" ,self.ToolChain),
+ ("TOOLCHAIN_TAG" ,self.ToolChain),
+ ("TOOL_CHAIN_TAG" ,self.ToolChain),
+ ("TARGET" ,self.BuildTarget),
+ ("BUILD_DIR" ,self.PlatformInfo.BuildDir),
+ ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
+ ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
+ ("MODULE_BUILD_DIR" ,self.BuildDir),
+ ("OUTPUT_DIR" ,self.OutputDir),
+ ("DEBUG_DIR" ,self.DebugDir),
+ ("DEST_DIR_OUTPUT" ,self.OutputDir),
+ ("DEST_DIR_DEBUG" ,self.DebugDir),
+ ("PLATFORM_NAME" ,self.PlatformInfo.Name),
+ ("PLATFORM_GUID" ,self.PlatformInfo.Guid),
+ ("PLATFORM_VERSION" ,self.PlatformInfo.Version),
+ ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir),
+ ("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, self.PlatformInfo.SourceDir)),
+ ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir),
+ ("FFS_OUTPUT_DIR" ,self.FfsOutputDir)
+ ))
+
+ ## Return the module build data object
+ @cached_property
+ def Module(self):
+ return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
+
+ ## Return the module name
+ @cached_property
+ def Name(self):
+ return self.Module.BaseName
+
+ ## Return the module DxsFile if exist
+ @cached_property
+ def DxsFile(self):
+ return self.Module.DxsFile
+
+ ## Return the module meta-file GUID
+ @cached_property
+ def Guid(self):
+ #
+ # To build same module more than once, the module path with FILE_GUID overridden has
+ # the file name FILE_GUIDmodule.inf, but the relative path (self.MetaFile.File) is the real path
+        # in DSC. The overridden GUID can be retrieved from the file name.
+ #
+ if os.path.basename(self.MetaFile.File) != os.path.basename(self.MetaFile.Path):
+ #
+ # Length of GUID is 36
+ #
+ return os.path.basename(self.MetaFile.Path)[:36]
+ return self.Module.Guid
+
+ ## Return the module version
+ @cached_property
+ def Version(self):
+ return self.Module.Version
+
+ ## Return the module type
+ @cached_property
+ def ModuleType(self):
+ return self.Module.ModuleType
+
+ ## Return the component type (for Edk.x style of module)
+ @cached_property
+ def ComponentType(self):
+ return self.Module.ComponentType
+
+ ## Return the build type
+ @cached_property
+ def BuildType(self):
+ return self.Module.BuildType
+
+ ## Return the PCD_IS_DRIVER setting
+ @cached_property
+ def PcdIsDriver(self):
+ return self.Module.PcdIsDriver
+
+ ## Return the autogen version, i.e. module meta-file version
+ @cached_property
+ def AutoGenVersion(self):
+ return self.Module.AutoGenVersion
+
+ ## Check if the module is library or not
+ @cached_property
+ def IsLibrary(self):
+ return bool(self.Module.LibraryClass)
+
+ ## Check if the module is binary module or not
+ @cached_property
+ def IsBinaryModule(self):
+ return self.Module.IsBinaryModule
+
+ ## Return the directory to store intermediate files of the module
+ @cached_property
+ def BuildDir(self):
+ return _MakeDir((
+ self.PlatformInfo.BuildDir,
+ self.Arch,
+ self.SourceDir,
+ self.MetaFile.BaseName
+ ))
+
+ ## Return the directory to store the intermediate object files of the module
+ @cached_property
+ def OutputDir(self):
+ return _MakeDir((self.BuildDir, "OUTPUT"))
+
+ ## Return the directory path to store ffs file
+ @cached_property
+ def FfsOutputDir(self):
+ if GlobalData.gFdfParser:
+ return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
+ return ''
+
+ ## Return the directory to store auto-gened source files of the module
+ @cached_property
+ def DebugDir(self):
+ return _MakeDir((self.BuildDir, "DEBUG"))
+
+ ## Return the path of the custom makefile
+ @cached_property
+ def CustomMakefile(self):
+ RetVal = {}
+ for Type in self.Module.CustomMakefile:
+ MakeType = gMakeTypeMap[Type] if Type in gMakeTypeMap else 'nmake'
+ File = os.path.join(self.SourceDir, self.Module.CustomMakefile[Type])
+ RetVal[MakeType] = File
+ return RetVal
+
+ ## Return the directory of the makefile
+ #
+ # @retval string The directory string of module's makefile
+ #
+ @cached_property
+ def MakeFileDir(self):
+ return self.BuildDir
+
+ ## Return build command string
+ #
+ # @retval string Build command string
+ #
+ @cached_property
+ def BuildCommand(self):
+ return self.PlatformInfo.BuildCommand
+
+ ## Get object list of all packages the module and its dependent libraries belong to
+ #
+ # @retval list The list of package object
+ #
+ @cached_property
+ def DerivedPackageList(self):
+ PackageList = []
+ for M in [self.Module] + self.DependentLibraryList:
+ for Package in M.Packages:
+ if Package in PackageList:
+ continue
+ PackageList.append(Package)
+ return PackageList
+
+ ## Get the depex string
+ #
+ # @return : a string containing all depex expressions.
+ def _GetDepexExpresionString(self):
+ DepexStr = ''
+ DepexList = []
+ ## DPX_SOURCE in the Defines section.
+ if self.Module.DxsFile:
+ return DepexStr
+ for M in [self.Module] + self.DependentLibraryList:
+ Filename = M.MetaFile.Path
+ InfObj = InfSectionParser.InfSectionParser(Filename)
+ DepexExpressionList = InfObj.GetDepexExpresionList()
+ for DepexExpression in DepexExpressionList:
+ for key in DepexExpression:
+ Arch, ModuleType = key
+ DepexExpr = [x for x in DepexExpression[key] if not str(x).startswith('#')]
+ # If the module type is USER_DEFINED, all different DEPEX section tags
+ # are copied into the As Built INF file as separate DEPEX section tags.
+ if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
+ if (Arch.upper() == self.Arch.upper()) and (ModuleType.upper() != TAB_ARCH_COMMON):
+ DepexList.append({(Arch, ModuleType): DepexExpr})
+ else:
+ if Arch.upper() == TAB_ARCH_COMMON or \
+ (Arch.upper() == self.Arch.upper() and \
+ ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
+ DepexList.append({(Arch, ModuleType): DepexExpr})
+
+ # The module type is USER_DEFINED.
+ if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
+ for Depex in DepexList:
+ for key in Depex:
+ DepexStr += '[Depex.%s.%s]\n' % key
+ DepexStr += '\n'.join('# '+ val for val in Depex[key])
+ DepexStr += '\n\n'
+ if not DepexStr:
+ return '[Depex.%s]\n' % self.Arch
+ return DepexStr
+
+ # The module type is not USER_DEFINED.
+ Count = 0
+ for Depex in DepexList:
+ Count += 1
+ if DepexStr != '':
+ DepexStr += ' AND '
+ DepexStr += '('
+ for D in Depex.values():
+ DepexStr += ' '.join(val for val in D)
+ Index = DepexStr.find('END')
+ if Index > -1 and Index == len(DepexStr) - 3:
+ DepexStr = DepexStr[:-3]
+ DepexStr = DepexStr.strip()
+ DepexStr += ')'
+ if Count == 1:
+ DepexStr = DepexStr.lstrip('(').rstrip(')').strip()
+ if not DepexStr:
+ return '[Depex.%s]\n' % self.Arch
+ return '[Depex.%s]\n# ' % self.Arch + DepexStr
+
+ ## Merge dependency expression
+ #
+ # @retval list The token list of the dependency expression after parsed
+ #
+ @cached_property
+ def DepexList(self):
+ if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
+ return {}
+
+ DepexList = []
+ #
+ # Append depex from dependent libraries, if not "BEFORE", "AFTER" expression
+ #
+ FixedVoidTypePcds = {}
+ for M in [self] + self.LibraryAutoGenList:
+ FixedVoidTypePcds.update(M.FixedVoidTypePcds)
+ for M in [self] + self.LibraryAutoGenList:
+ Inherited = False
+ for D in M.Module.Depex[self.Arch, self.ModuleType]:
+ if DepexList != []:
+ DepexList.append('AND')
+ DepexList.append('(')
+ # replace D with its value if D is a FixedAtBuild PCD
+ NewList = []
+ for item in D:
+ if '.' not in item:
+ NewList.append(item)
+ else:
+ try:
+ Value = FixedVoidTypePcds[item]
+ if len(Value.split(',')) != 16:
+ EdkLogger.error("build", FORMAT_INVALID,
+ "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type and 16 bytes in the module.".format(item))
+ NewList.append(Value)
+ except KeyError:
+ EdkLogger.error("build", FORMAT_INVALID, "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type in the module.".format(item))
+
+ DepexList.extend(NewList)
+ if DepexList[-1] == 'END': # no need for an END at this time
+ DepexList.pop()
+ DepexList.append(')')
+ Inherited = True
+ if Inherited:
+ EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.Module.BaseName, DepexList))
+ if 'BEFORE' in DepexList or 'AFTER' in DepexList:
+ break
+ if len(DepexList) > 0:
+ EdkLogger.verbose('')
+ return {self.ModuleType:DepexList}
+
+ ## Merge dependency expression
+ #
+ # @retval list The token list of the dependency expression after parsed
+ #
+ @cached_property
+ def DepexExpressionDict(self):
+ if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
+ return {}
+
+ DepexExpressionString = ''
+ #
+ # Append depex from dependent libraries, if not "BEFORE", "AFTER" expression
+ #
+ for M in [self.Module] + self.DependentLibraryList:
+ Inherited = False
+ for D in M.DepexExpression[self.Arch, self.ModuleType]:
+ if DepexExpressionString != '':
+ DepexExpressionString += ' AND '
+ DepexExpressionString += '('
+ DepexExpressionString += D
+ DepexExpressionString = DepexExpressionString.rstrip('END').strip()
+ DepexExpressionString += ')'
+ Inherited = True
+ if Inherited:
+ EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexExpressionString))
+ if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexExpressionString:
+ break
+ if len(DepexExpressionString) > 0:
+ EdkLogger.verbose('')
+
+ return {self.ModuleType:DepexExpressionString}
+
+ # Get the TianoCore user extensions, including those of dependent libraries.
+ # @retval: a list containing the TianoCore user extensions.
+ #
+ def _GetTianoCoreUserExtensionList(self):
+ TianoCoreUserExtentionList = []
+ for M in [self.Module] + self.DependentLibraryList:
+ Filename = M.MetaFile.Path
+ InfObj = InfSectionParser.InfSectionParser(Filename)
+ TianoCoreUserExtenList = InfObj.GetUserExtensionTianoCore()
+ for TianoCoreUserExtent in TianoCoreUserExtenList:
+ for Section in TianoCoreUserExtent:
+ ItemList = Section.split(TAB_SPLIT)
+ Arch = self.Arch
+ if len(ItemList) == 4:
+ Arch = ItemList[3]
+ if Arch.upper() == TAB_ARCH_COMMON or Arch.upper() == self.Arch.upper():
+ TianoCoreList = []
+ TianoCoreList.extend([TAB_SECTION_START + Section + TAB_SECTION_END])
+ TianoCoreList.extend(TianoCoreUserExtent[Section][:])
+ TianoCoreList.append('\n')
+ TianoCoreUserExtentionList.append(TianoCoreList)
+
+ return TianoCoreUserExtentionList
+
+ ## Return the list of specification version required for the module
+ #
+ # @retval list The list of specification defined in module file
+ #
+ @cached_property
+ def Specification(self):
+ return self.Module.Specification
+
+ ## Tool option for the module build
+ #
+ # @param PlatformInfo The object of PlatformBuildInfo
+ # @retval dict The dict containing valid options
+ #
+ @cached_property
+ def BuildOption(self):
+ RetVal, self.BuildRuleOrder = self.PlatformInfo.ApplyBuildOption(self.Module)
+ if self.BuildRuleOrder:
+ self.BuildRuleOrder = ['.%s' % Ext for Ext in self.BuildRuleOrder.split()]
+ return RetVal
+
+ ## Get include path list from tool option for the module build
+ #
+ # @retval list The include path list
+ #
+ @cached_property
+ def BuildOptionIncPathList(self):
+ #
+ # Regular expression for finding include directories: MSFT compilers use
+ # /I while INTEL/GCC/RVCT compilers use -I to specify include directories
+ #
+ if self.PlatformInfo.ToolChainFamily in (TAB_COMPILER_MSFT):
+ BuildOptIncludeRegEx = gBuildOptIncludePatternMsft
+ elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'RVCT'):
+ BuildOptIncludeRegEx = gBuildOptIncludePatternOther
+ else:
+ #
+ # New ToolChainFamily; we don't know whether there is an option to specify include directories
+ #
+ return []
+
+ RetVal = []
+ for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
+ try:
+ FlagOption = self.BuildOption[Tool]['FLAGS']
+ except KeyError:
+ FlagOption = ''
+
+ if self.ToolChainFamily != 'RVCT':
+ IncPathList = [NormPath(Path, self.Macros) for Path in BuildOptIncludeRegEx.findall(FlagOption)]
+ else:
+ #
+ # RVCT may specify a list of directories separated by commas
+ #
+ IncPathList = []
+ for Path in BuildOptIncludeRegEx.findall(FlagOption):
+ PathList = GetSplitList(Path, TAB_COMMA_SPLIT)
+ IncPathList.extend(NormPath(PathEntry, self.Macros) for PathEntry in PathList)
+
+ #
+ # EDK II modules must not reference header files outside of the packages they depend on or
+ # within the module's directory tree. Report error if violation.
+ #
+ if not GlobalData.gDisableIncludePathCheck:
+ for Path in IncPathList:
+ if (Path not in self.IncludePathList) and (CommonPath([Path, self.MetaFile.Dir]) != self.MetaFile.Dir):
+ ErrMsg = "The include directory for the EDK II module in this line is invalid %s specified in %s FLAGS '%s'" % (Path, Tool, FlagOption)
+ EdkLogger.error("build",
+ PARAMETER_INVALID,
+ ExtraData=ErrMsg,
+ File=str(self.MetaFile))
+ RetVal += IncPathList
+ return RetVal
+
+ ## Return a list of files which can be built from source
+ #
+ # What kind of files can be built is determined by build rules in
+ # $(CONF_DIRECTORY)/build_rule.txt and toolchain family.
+ #
+ @cached_property
+ def SourceFileList(self):
+ RetVal = []
+ ToolChainTagSet = {"", TAB_STAR, self.ToolChain}
+ ToolChainFamilySet = {"", TAB_STAR, self.ToolChainFamily, self.BuildRuleFamily}
+ for F in self.Module.Sources:
+ # match tool chain
+ if F.TagName not in ToolChainTagSet:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s] for processing file [%s] is found, "
+ "but [%s] is currently used" % (F.TagName, str(F), self.ToolChain))
+ continue
+ # match tool chain family or build rule family
+ if F.ToolChainFamily not in ToolChainFamilySet:
+ EdkLogger.debug(
+ EdkLogger.DEBUG_0,
+ "The file [%s] must be built by tools of [%s], " \
+ "but current toolchain family is [%s], buildrule family is [%s]" \
+ % (str(F), F.ToolChainFamily, self.ToolChainFamily, self.BuildRuleFamily))
+ continue
+
+ # add the file path into search path list for file including
+ if F.Dir not in self.IncludePathList:
+ self.IncludePathList.insert(0, F.Dir)
+ RetVal.append(F)
+
+ self._MatchBuildRuleOrder(RetVal)
+
+ for F in RetVal:
+ self._ApplyBuildRule(F, TAB_UNKNOWN_FILE)
+ return RetVal
+
+ def _MatchBuildRuleOrder(self, FileList):
+ Order_Dict = {}
+ # Access the BuildOption property to ensure BuildRuleOrder is initialized
+ self.BuildOption
+ for SingleFile in FileList:
+ if self.BuildRuleOrder and SingleFile.Ext in self.BuildRuleOrder and SingleFile.Ext in self.BuildRules:
+ key = SingleFile.Path.rsplit(SingleFile.Ext,1)[0]
+ if key in Order_Dict:
+ Order_Dict[key].append(SingleFile.Ext)
+ else:
+ Order_Dict[key] = [SingleFile.Ext]
+
+ RemoveList = []
+ for F in Order_Dict:
+ if len(Order_Dict[F]) > 1:
+ Order_Dict[F].sort(key=lambda i: self.BuildRuleOrder.index(i))
+ for Ext in Order_Dict[F][1:]:
+ RemoveList.append(F + Ext)
+
+ for item in RemoveList:
+ FileList.remove(item)
+
+ return FileList
+
+ ## Return the list of unicode files
+ @cached_property
+ def UnicodeFileList(self):
+ return self.FileTypes.get(TAB_UNICODE_FILE,[])
+
+ ## Return the list of vfr files
+ @cached_property
+ def VfrFileList(self):
+ return self.FileTypes.get(TAB_VFR_FILE, [])
+
+ ## Return the list of Image Definition files
+ @cached_property
+ def IdfFileList(self):
+ return self.FileTypes.get(TAB_IMAGE_FILE,[])
+
+ ## Return a list of files which can be built from binary
+ #
+ # "Building" binary files simply copies them to the build directory.
+ #
+ # @retval list The list of files which can be built later
+ #
+ @cached_property
+ def BinaryFileList(self):
+ RetVal = []
+ for F in self.Module.Binaries:
+ if F.Target not in [TAB_ARCH_COMMON, TAB_STAR] and F.Target != self.BuildTarget:
+ continue
+ RetVal.append(F)
+ self._ApplyBuildRule(F, F.Type, BinaryFileList=RetVal)
+ return RetVal
+
+ @cached_property
+ def BuildRules(self):
+ RetVal = {}
+ BuildRuleDatabase = self.PlatformInfo.BuildRule
+ for Type in BuildRuleDatabase.FileTypeList:
+ #first try getting build rule by BuildRuleFamily
+ RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.BuildRuleFamily]
+ if not RuleObject:
+ # build type is always module type, but ...
+ if self.ModuleType != self.BuildType:
+ RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.BuildRuleFamily]
+ #second try getting build rule by ToolChainFamily
+ if not RuleObject:
+ RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.ToolChainFamily]
+ if not RuleObject:
+ # build type is always module type, but ...
+ if self.ModuleType != self.BuildType:
+ RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.ToolChainFamily]
+ if not RuleObject:
+ continue
+ RuleObject = RuleObject.Instantiate(self.Macros)
+ RetVal[Type] = RuleObject
+ for Ext in RuleObject.SourceFileExtList:
+ RetVal[Ext] = RuleObject
+ return RetVal
+
+ def _ApplyBuildRule(self, File, FileType, BinaryFileList=None):
+ if self._BuildTargets is None:
+ self._IntroBuildTargetList = set()
+ self._FinalBuildTargetList = set()
+ self._BuildTargets = defaultdict(set)
+ self._FileTypes = defaultdict(set)
+
+ if not BinaryFileList:
+ BinaryFileList = self.BinaryFileList
+
+ SubDirectory = os.path.join(self.OutputDir, File.SubDir)
+ if not os.path.exists(SubDirectory):
+ CreateDirectory(SubDirectory)
+ LastTarget = None
+ RuleChain = set()
+ SourceList = [File]
+ Index = 0
+ #
+ # Make sure to get build rule order value
+ #
+ self.BuildOption
+
+ while Index < len(SourceList):
+ Source = SourceList[Index]
+ Index = Index + 1
+
+ if Source != File:
+ CreateDirectory(Source.Dir)
+
+ if File.IsBinary and File == Source and File in BinaryFileList:
+ # Skip all files that are not binary libraries
+ if not self.IsLibrary:
+ continue
+ RuleObject = self.BuildRules[TAB_DEFAULT_BINARY_FILE]
+ elif FileType in self.BuildRules:
+ RuleObject = self.BuildRules[FileType]
+ elif Source.Ext in self.BuildRules:
+ RuleObject = self.BuildRules[Source.Ext]
+ else:
+ # stop when no more rules apply
+ if LastTarget:
+ self._FinalBuildTargetList.add(LastTarget)
+ break
+
+ FileType = RuleObject.SourceFileType
+ self._FileTypes[FileType].add(Source)
+
+ # stop at STATIC_LIBRARY for library
+ if self.IsLibrary and FileType == TAB_STATIC_LIBRARY:
+ if LastTarget:
+ self._FinalBuildTargetList.add(LastTarget)
+ break
+
+ Target = RuleObject.Apply(Source, self.BuildRuleOrder)
+ if not Target:
+ if LastTarget:
+ self._FinalBuildTargetList.add(LastTarget)
+ break
+ elif not Target.Outputs:
+ # Only do build for target with outputs
+ self._FinalBuildTargetList.add(Target)
+
+ self._BuildTargets[FileType].add(Target)
+
+ if not Source.IsBinary and Source == File:
+ self._IntroBuildTargetList.add(Target)
+
+ # to avoid cyclic rule
+ if FileType in RuleChain:
+ break
+
+ RuleChain.add(FileType)
+ SourceList.extend(Target.Outputs)
+ LastTarget = Target
+ FileType = TAB_UNKNOWN_FILE
+
+ @cached_property
+ def Targets(self):
+ if self._BuildTargets is None:
+ self._IntroBuildTargetList = set()
+ self._FinalBuildTargetList = set()
+ self._BuildTargets = defaultdict(set)
+ self._FileTypes = defaultdict(set)
+
+ #TRICK: call SourceFileList property to apply build rule for source files
+ self.SourceFileList
+
+ #TRICK: call BinaryFileList property to apply build rule for binary files
+ self.BinaryFileList
+
+ return self._BuildTargets
+
+ @cached_property
+ def IntroTargetList(self):
+ self.Targets
+ return self._IntroBuildTargetList
+
+ @cached_property
+ def CodaTargetList(self):
+ self.Targets
+ return self._FinalBuildTargetList
+
+ @cached_property
+ def FileTypes(self):
+ self.Targets
+ return self._FileTypes
+
+ ## Get the list of package object the module depends on
+ #
+ # @retval list The package object list
+ #
+ @cached_property
+ def DependentPackageList(self):
+ return self.Module.Packages
+
+ ## Return the list of auto-generated code file
+ #
+ # @retval list The list of auto-generated file
+ #
+ @cached_property
+ def AutoGenFileList(self):
+ AutoGenUniIdf = self.BuildType != 'UEFI_HII'
+ UniStringBinBuffer = BytesIO()
+ IdfGenBinBuffer = BytesIO()
+ RetVal = {}
+ AutoGenC = TemplateString()
+ AutoGenH = TemplateString()
+ StringH = TemplateString()
+ StringIdf = TemplateString()
+ GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUniIdf, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer)
+ #
+ # AutoGen.c is generated if there are library classes in the INF, or there are object files
+ #
+ if str(AutoGenC) != "" and (len(self.Module.LibraryClasses) > 0
+ or TAB_OBJECT_FILE in self.FileTypes):
+ AutoFile = PathClass(gAutoGenCodeFileName, self.DebugDir)
+ RetVal[AutoFile] = str(AutoGenC)
+ self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
+ if str(AutoGenH) != "":
+ AutoFile = PathClass(gAutoGenHeaderFileName, self.DebugDir)
+ RetVal[AutoFile] = str(AutoGenH)
+ self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
+ if str(StringH) != "":
+ AutoFile = PathClass(gAutoGenStringFileName % {"module_name":self.Name}, self.DebugDir)
+ RetVal[AutoFile] = str(StringH)
+ self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
+ if UniStringBinBuffer is not None and UniStringBinBuffer.getvalue() != b"":
+ AutoFile = PathClass(gAutoGenStringFormFileName % {"module_name":self.Name}, self.OutputDir)
+ RetVal[AutoFile] = UniStringBinBuffer.getvalue()
+ AutoFile.IsBinary = True
+ self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
+ if UniStringBinBuffer is not None:
+ UniStringBinBuffer.close()
+ if str(StringIdf) != "":
+ AutoFile = PathClass(gAutoGenImageDefFileName % {"module_name":self.Name}, self.DebugDir)
+ RetVal[AutoFile] = str(StringIdf)
+ self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
+ if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue() != b"":
+ AutoFile = PathClass(gAutoGenIdfFileName % {"module_name":self.Name}, self.OutputDir)
+ RetVal[AutoFile] = IdfGenBinBuffer.getvalue()
+ AutoFile.IsBinary = True
+ self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
+ if IdfGenBinBuffer is not None:
+ IdfGenBinBuffer.close()
+ return RetVal
+
+ ## Return the list of library modules explicitly or implicitly used by this module
+ @cached_property
+ def DependentLibraryList(self):
+ # only merge library classes and PCD for non-library module
+ if self.IsLibrary:
+ return []
+ return self.PlatformInfo.ApplyLibraryInstance(self.Module)
+
+ ## Get the list of PCDs from current module
+ #
+ # @retval list The list of PCD
+ #
+ @cached_property
+ def ModulePcdList(self):
+ # apply PCD settings from platform
+ RetVal = self.PlatformInfo.ApplyPcdSetting(self.Module, self.Module.Pcds)
+
+ return RetVal
+
+ @cached_property
+ def _PcdComments(self):
+ ReVal = OrderedListDict()
+ ExtendCopyDictionaryLists(ReVal, self.Module.PcdComments)
+ if not self.IsLibrary:
+ for Library in self.DependentLibraryList:
+ ExtendCopyDictionaryLists(ReVal, Library.PcdComments)
+ return ReVal
+
+ ## Get the list of PCDs from dependent libraries
+ #
+ # @retval list The list of PCD
+ #
+ @cached_property
+ def LibraryPcdList(self):
+ if self.IsLibrary:
+ return []
+ RetVal = []
+ Pcds = set()
+ # get PCDs from dependent libraries
+ for Library in self.DependentLibraryList:
+ PcdsInLibrary = OrderedDict()
+ for Key in Library.Pcds:
+ # skip duplicated PCDs
+ if Key in self.Module.Pcds or Key in Pcds:
+ continue
+ Pcds.add(Key)
+ PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
+ RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self.Module, PcdsInLibrary, Library=Library))
+ return RetVal
+
+ ## Get the GUID value mapping
+ #
+ # @retval dict The mapping between GUID cname and its value
+ #
+ @cached_property
+ def GuidList(self):
+ RetVal = self.Module.Guids
+ for Library in self.DependentLibraryList:
+ RetVal.update(Library.Guids)
+ ExtendCopyDictionaryLists(self._GuidComments, Library.GuidComments)
+ ExtendCopyDictionaryLists(self._GuidComments, self.Module.GuidComments)
+ return RetVal
+
+ @cached_property
+ def GetGuidsUsedByPcd(self):
+ RetVal = OrderedDict(self.Module.GetGuidsUsedByPcd())
+ for Library in self.DependentLibraryList:
+ RetVal.update(Library.GetGuidsUsedByPcd())
+ return RetVal
+
+ ## Get the protocol value mapping
+ #
+ # @retval dict The mapping between protocol cname and its value
+ #
+ @cached_property
+ def ProtocolList(self):
+ RetVal = OrderedDict(self.Module.Protocols)
+ for Library in self.DependentLibraryList:
+ RetVal.update(Library.Protocols)
+ ExtendCopyDictionaryLists(self._ProtocolComments, Library.ProtocolComments)
+ ExtendCopyDictionaryLists(self._ProtocolComments, self.Module.ProtocolComments)
+ return RetVal
+
+ ## Get the PPI value mapping
+ #
+ # @retval dict The mapping between PPI cname and its value
+ #
+ @cached_property
+ def PpiList(self):
+ RetVal = OrderedDict(self.Module.Ppis)
+ for Library in self.DependentLibraryList:
+ RetVal.update(Library.Ppis)
+ ExtendCopyDictionaryLists(self._PpiComments, Library.PpiComments)
+ ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiComments)
+ return RetVal
+
+ ## Get the list of include search path
+ #
+ # @retval list The list path
+ #
+ @cached_property
+ def IncludePathList(self):
+ RetVal = []
+ RetVal.append(self.MetaFile.Dir)
+ RetVal.append(self.DebugDir)
+
+ for Package in self.Module.Packages:
+ PackageDir = mws.join(self.WorkspaceDir, Package.MetaFile.Dir)
+ if PackageDir not in RetVal:
+ RetVal.append(PackageDir)
+ IncludesList = Package.Includes
+ if Package._PrivateIncludes:
+ if not self.MetaFile.OriginalPath.Path.startswith(PackageDir):
+ IncludesList = list(set(Package.Includes).difference(set(Package._PrivateIncludes)))
+ for Inc in IncludesList:
+ if Inc not in RetVal:
+ RetVal.append(str(Inc))
+ return RetVal
+
+ @cached_property
+ def IncludePathLength(self):
+ return sum(len(inc)+1 for inc in self.IncludePathList)
+
+ ## Get HII EX PCDs which may be used by VFR
+ #
+ # An efivarstore used by VFR may relate to HII EX PCDs.
+ # Get the variable name and GUID from the efivarstore and the HII EX PCD,
+ # and list the HII EX PCDs in the As Built INF if both name and GUID match.
+ #
+ # @retval list HII EX PCDs
+ #
+ def _GetPcdsMaybeUsedByVfr(self):
+ if not self.SourceFileList:
+ return []
+
+ NameGuids = set()
+ for SrcFile in self.SourceFileList:
+ if SrcFile.Ext.lower() != '.vfr':
+ continue
+ Vfri = os.path.join(self.OutputDir, SrcFile.BaseName + '.i')
+ if not os.path.exists(Vfri):
+ continue
+ with open(Vfri, 'r') as VfriFile:
+ Content = VfriFile.read()
+ Pos = Content.find('efivarstore')
+ while Pos != -1:
+ #
+ # Make sure 'efivarstore' is the start of an efivarstore statement,
+ # in case the value of 'name' (name = efivarstore) equals 'efivarstore'
+ #
+ Index = Pos - 1
+ while Index >= 0 and Content[Index] in ' \t\r\n':
+ Index -= 1
+ if Index >= 0 and Content[Index] != ';':
+ Pos = Content.find('efivarstore', Pos + len('efivarstore'))
+ continue
+ #
+ # 'efivarstore' must be followed by name and guid
+ #
+ Name = gEfiVarStoreNamePattern.search(Content, Pos)
+ if not Name:
+ break
+ Guid = gEfiVarStoreGuidPattern.search(Content, Pos)
+ if not Guid:
+ break
+ NameArray = _ConvertStringToByteArray('L"' + Name.group(1) + '"')
+ NameGuids.add((NameArray, GuidStructureStringToGuidString(Guid.group(1))))
+ Pos = Content.find('efivarstore', Name.end())
+ if not NameGuids:
+ return []
+ HiiExPcds = []
+ for Pcd in self.PlatformInfo.Platform.Pcds.values():
+ if Pcd.Type != TAB_PCDS_DYNAMIC_EX_HII:
+ continue
+ for SkuInfo in Pcd.SkuInfoList.values():
+ Value = GuidValue(SkuInfo.VariableGuid, self.PlatformInfo.PackageList, self.MetaFile.Path)
+ if not Value:
+ continue
+ Name = _ConvertStringToByteArray(SkuInfo.VariableName)
+ Guid = GuidStructureStringToGuidString(Value)
+ if (Name, Guid) in NameGuids and Pcd not in HiiExPcds:
+ HiiExPcds.append(Pcd)
+ break
+
+ return HiiExPcds
+
+ def _GenOffsetBin(self):
+ VfrUniBaseName = {}
+ for SourceFile in self.Module.Sources:
+ if SourceFile.Type.upper() == ".VFR":
+ #
+ # search the .map file to find the offset of vfr binary in the PE32+/TE file.
+ #
+ VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
+ elif SourceFile.Type.upper() == ".UNI":
+ #
+ # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
+ #
+ VfrUniBaseName["UniOffsetName"] = (self.Name + "Strings")
+
+ if not VfrUniBaseName:
+ return None
+ MapFileName = os.path.join(self.OutputDir, self.Name + ".map")
+ EfiFileName = os.path.join(self.OutputDir, self.Name + ".efi")
+ VfrUniOffsetList = GetVariableOffset(MapFileName, EfiFileName, list(VfrUniBaseName.values()))
+ if not VfrUniOffsetList:
+ return None
+
+ OutputName = '%sOffset.bin' % self.Name
+ UniVfrOffsetFileName = os.path.join( self.OutputDir, OutputName)
+
+ try:
+ fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
+ except:
+ EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
+
+ # Use an instance of BytesIO to cache data
+ fStringIO = BytesIO()
+
+ for Item in VfrUniOffsetList:
+ if (Item[0].find("Strings") != -1):
+ #
+ # UNI offset in image.
+ # GUID + Offset
+ # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
+ #
+ UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
+ fStringIO.write(UniGuid)
+ UniValue = pack ('Q', int (Item[1], 16))
+ fStringIO.write (UniValue)
+ else:
+ #
+ # VFR binary offset in image.
+ # GUID + Offset
+ # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
+ #
+ VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
+ fStringIO.write(VfrGuid)
+ VfrValue = pack ('Q', int (Item[1], 16))
+ fStringIO.write (VfrValue)
+ #
+ # write data into file.
+ #
+ try:
+ fInputfile.write (fStringIO.getvalue())
+ except:
+ EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
+ "file is locked or in use by other applications." % UniVfrOffsetFileName, None)
+
+ fStringIO.close ()
+ fInputfile.close ()
+ return OutputName
+
+ @cached_property
+ def OutputFile(self):
+ retVal = set()
+ OutputDir = self.OutputDir.replace('\\', '/').strip('/')
+ DebugDir = self.DebugDir.replace('\\', '/').strip('/')
+ for Item in self.CodaTargetList:
+ File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
+ retVal.add(File)
+ if self.DepexGenerated:
+ retVal.add(self.Name + '.depex')
+
+ Bin = self._GenOffsetBin()
+ if Bin:
+ retVal.add(Bin)
+
+ for Root, Dirs, Files in os.walk(OutputDir):
+ for File in Files:
+ if File.lower().endswith('.pdb'):
+ retVal.add(File)
+
+ return retVal
+
+ ## Create the As Built INF file for the module
+ #
+ def CreateAsBuiltInf(self):
+
+ if self.IsAsBuiltInfCreated:
+ return
+
+ # Skip INF file generation for libraries
+ if self.IsLibrary:
+ return
+
+ # Skip the following code for modules with no source files
+ if not self.SourceFileList:
+ return
+
+ # Skip the following code for modules that already have binary files
+ if self.BinaryFileList:
+ return
+
+ ### TODO: How to handle mixed source and binary modules
+
+ # Find all DynamicEx and PatchableInModule PCDs used by this module and dependent libraries
+ # Also find all packages that the DynamicEx PCDs depend on
+ Pcds = []
+ PatchablePcds = []
+ Packages = []
+ PcdCheckList = []
+ PcdTokenSpaceList = []
+ for Pcd in self.ModulePcdList + self.LibraryPcdList:
+ if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
+ PatchablePcds.append(Pcd)
+ PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_PATCHABLE_IN_MODULE))
+ elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ if Pcd not in Pcds:
+ Pcds.append(Pcd)
+ PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX))
+ PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC))
+ PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName)
+ GuidList = OrderedDict(self.GuidList)
+ for TokenSpace in self.GetGuidsUsedByPcd:
+ # If the token space is not referenced by a patch PCD or Ex PCD, remove the GUID from the GUID list.
+ # The GUIDs in the GUIDs section should really be the GUIDs in the source INF or those referenced by Ex and patch PCDs
+ if TokenSpace not in PcdTokenSpaceList and TokenSpace in GuidList:
+ GuidList.pop(TokenSpace)
+ CheckList = (GuidList, self.PpiList, self.ProtocolList, PcdCheckList)
+ for Package in self.DerivedPackageList:
+ if Package in Packages:
+ continue
+ BeChecked = (Package.Guids, Package.Ppis, Package.Protocols, Package.Pcds)
+ Found = False
+ for Index in range(len(BeChecked)):
+ for Item in CheckList[Index]:
+ if Item in BeChecked[Index]:
+ Packages.append(Package)
+ Found = True
+ break
+ if Found:
+ break
+
+ VfrPcds = self._GetPcdsMaybeUsedByVfr()
+ for Pkg in self.PlatformInfo.PackageList:
+ if Pkg in Packages:
+ continue
+ for VfrPcd in VfrPcds:
+ if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX) in Pkg.Pcds or
+ (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC) in Pkg.Pcds):
+ Packages.append(Pkg)
+ break
+
+ ModuleType = SUP_MODULE_DXE_DRIVER if self.ModuleType == SUP_MODULE_UEFI_DRIVER and self.DepexGenerated else self.ModuleType
+ DriverType = self.PcdIsDriver if self.PcdIsDriver else ''
+ Guid = self.Guid
+ MDefs = self.Module.Defines
+
+ AsBuiltInfDict = {
+ 'module_name' : self.Name,
+ 'module_guid' : Guid,
+ 'module_module_type' : ModuleType,
+ 'module_version_string' : [MDefs['VERSION_STRING']] if 'VERSION_STRING' in MDefs else [],
+ 'pcd_is_driver_string' : [],
+ 'module_uefi_specification_version' : [],
+ 'module_pi_specification_version' : [],
+ 'module_entry_point' : self.Module.ModuleEntryPointList,
+ 'module_unload_image' : self.Module.ModuleUnloadImageList,
+ 'module_constructor' : self.Module.ConstructorList,
+ 'module_destructor' : self.Module.DestructorList,
+ 'module_shadow' : [MDefs['SHADOW']] if 'SHADOW' in MDefs else [],
+ 'module_pci_vendor_id' : [MDefs['PCI_VENDOR_ID']] if 'PCI_VENDOR_ID' in MDefs else [],
+ 'module_pci_device_id' : [MDefs['PCI_DEVICE_ID']] if 'PCI_DEVICE_ID' in MDefs else [],
+ 'module_pci_class_code' : [MDefs['PCI_CLASS_CODE']] if 'PCI_CLASS_CODE' in MDefs else [],
+ 'module_pci_revision' : [MDefs['PCI_REVISION']] if 'PCI_REVISION' in MDefs else [],
+ 'module_build_number' : [MDefs['BUILD_NUMBER']] if 'BUILD_NUMBER' in MDefs else [],
+ 'module_spec' : [MDefs['SPEC']] if 'SPEC' in MDefs else [],
+ 'module_uefi_hii_resource_section' : [MDefs['UEFI_HII_RESOURCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [],
+ 'module_uni_file' : [MDefs['MODULE_UNI_FILE']] if 'MODULE_UNI_FILE' in MDefs else [],
+ 'module_arch' : self.Arch,
+ 'package_item' : [Package.MetaFile.File.replace('\\', '/') for Package in Packages],
+ 'binary_item' : [],
+ 'patchablepcd_item' : [],
+ 'pcd_item' : [],
+ 'protocol_item' : [],
+ 'ppi_item' : [],
+ 'guid_item' : [],
+ 'flags_item' : [],
+ 'libraryclasses_item' : []
+ }
+
+ if 'MODULE_UNI_FILE' in MDefs:
+ UNIFile = os.path.join(self.MetaFile.Dir, MDefs['MODULE_UNI_FILE'])
+ if os.path.isfile(UNIFile):
+ shutil.copy2(UNIFile, self.OutputDir)
+
+ if self.AutoGenVersion > int(gInfSpecVersion, 0):
+ AsBuiltInfDict['module_inf_version'] = '0x%08x' % self.AutoGenVersion
+ else:
+ AsBuiltInfDict['module_inf_version'] = gInfSpecVersion
+
+ if DriverType:
+ AsBuiltInfDict['pcd_is_driver_string'].append(DriverType)
+
+ if 'UEFI_SPECIFICATION_VERSION' in self.Specification:
+ AsBuiltInfDict['module_uefi_specification_version'].append(self.Specification['UEFI_SPECIFICATION_VERSION'])
+ if 'PI_SPECIFICATION_VERSION' in self.Specification:
+ AsBuiltInfDict['module_pi_specification_version'].append(self.Specification['PI_SPECIFICATION_VERSION'])
+
+ OutputDir = self.OutputDir.replace('\\', '/').strip('/')
+ DebugDir = self.DebugDir.replace('\\', '/').strip('/')
+ for Item in self.CodaTargetList:
+ File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
+ if os.path.isabs(File):
+ File = File.replace('\\', '/').strip('/').replace(OutputDir, '').strip('/')
+ if Item.Target.Ext.lower() == '.aml':
+ AsBuiltInfDict['binary_item'].append('ASL|' + File)
+ elif Item.Target.Ext.lower() == '.acpi':
+ AsBuiltInfDict['binary_item'].append('ACPI|' + File)
+ elif Item.Target.Ext.lower() == '.efi':
+ AsBuiltInfDict['binary_item'].append('PE32|' + self.Name + '.efi')
+ else:
+ AsBuiltInfDict['binary_item'].append('BIN|' + File)
+ if not self.DepexGenerated:
+ DepexFile = os.path.join(self.OutputDir, self.Name + '.depex')
+ if os.path.exists(DepexFile):
+ self.DepexGenerated = True
+ if self.DepexGenerated:
+ if self.ModuleType in [SUP_MODULE_PEIM]:
+ AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + self.Name + '.depex')
+ elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
+ AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + self.Name + '.depex')
+ elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
+ AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + self.Name + '.depex')
+
+ Bin = self._GenOffsetBin()
+ if Bin:
+ AsBuiltInfDict['binary_item'].append('BIN|%s' % Bin)
+
+ for Root, Dirs, Files in os.walk(OutputDir):
+ for File in Files:
+ if File.lower().endswith('.pdb'):
+ AsBuiltInfDict['binary_item'].append('DISPOSABLE|' + File)
+ HeaderComments = self.Module.HeaderComments
+ StartPos = 0
+ for Index in range(len(HeaderComments)):
+ if HeaderComments[Index].find('@BinaryHeader') != -1:
+ HeaderComments[Index] = HeaderComments[Index].replace('@BinaryHeader', '@file')
+ StartPos = Index
+ break
+ AsBuiltInfDict['header_comments'] = '\n'.join(HeaderComments[StartPos:]).replace(':#', '://')
+ AsBuiltInfDict['tail_comments'] = '\n'.join(self.Module.TailComments)
+
+ GenList = [
+ (self.ProtocolList, self._ProtocolComments, 'protocol_item'),
+ (self.PpiList, self._PpiComments, 'ppi_item'),
+ (GuidList, self._GuidComments, 'guid_item')
+ ]
+ for Item in GenList:
+ for CName in Item[0]:
+ Comments = '\n '.join(Item[1][CName]) if CName in Item[1] else ''
+ Entry = Comments + '\n ' + CName if Comments else CName
+ AsBuiltInfDict[Item[2]].append(Entry)
+ PatchList = parsePcdInfoFromMapFile(
+ os.path.join(self.OutputDir, self.Name + '.map'),
+ os.path.join(self.OutputDir, self.Name + '.efi')
+ )
+ if PatchList:
+ for Pcd in PatchablePcds:
+ TokenCName = Pcd.TokenCName
+ for PcdItem in GlobalData.MixedPcd:
+ if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
+ TokenCName = PcdItem[0]
+ break
+ for PatchPcd in PatchList:
+ if TokenCName == PatchPcd[0]:
+ break
+ else:
+ continue
+ PcdValue = ''
+ if Pcd.DatumType == 'BOOLEAN':
+ BoolValue = Pcd.DefaultValue.upper()
+ if BoolValue == 'TRUE':
+ Pcd.DefaultValue = '1'
+ elif BoolValue == 'FALSE':
+ Pcd.DefaultValue = '0'
+
+ if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
+ HexFormat = '0x%02x'
+ if Pcd.DatumType == TAB_UINT16:
+ HexFormat = '0x%04x'
+ elif Pcd.DatumType == TAB_UINT32:
+ HexFormat = '0x%08x'
+ elif Pcd.DatumType == TAB_UINT64:
+ HexFormat = '0x%016x'
+ PcdValue = HexFormat % int(Pcd.DefaultValue, 0)
+ else:
+ if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize == '':
+ EdkLogger.error("build", AUTOGEN_ERROR,
+ "Unknown [MaxDatumSize] of PCD [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName)
+ )
+ ArraySize = int(Pcd.MaxDatumSize, 0)
+ PcdValue = Pcd.DefaultValue
+ if PcdValue[0] != '{':
+ Unicode = False
+ if PcdValue[0] == 'L':
+ Unicode = True
+ PcdValue = PcdValue.lstrip('L')
+ PcdValue = eval(PcdValue)
+ NewValue = '{'
+ for Index in range(0, len(PcdValue)):
+ if Unicode:
+ CharVal = ord(PcdValue[Index])
+ NewValue = NewValue + '0x%02x' % (CharVal & 0x00FF) + ', ' \
+ + '0x%02x' % (CharVal >> 8) + ', '
+ else:
+ NewValue = NewValue + '0x%02x' % (ord(PcdValue[Index]) % 0x100) + ', '
+ Padding = '0x00, '
+ if Unicode:
+ Padding = Padding * 2
+ ArraySize = ArraySize // 2
+ if ArraySize < (len(PcdValue) + 1):
+ if Pcd.MaxSizeUserSet:
+ EdkLogger.error("build", AUTOGEN_ERROR,
+ "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
+ )
+ else:
+ ArraySize = len(PcdValue) + 1
+ if ArraySize > len(PcdValue) + 1:
+ NewValue = NewValue + Padding * (ArraySize - len(PcdValue) - 1)
+ PcdValue = NewValue + Padding.strip().rstrip(',') + '}'
+ elif len(PcdValue.split(',')) <= ArraySize:
+ PcdValue = PcdValue.rstrip('}') + ', 0x00' * (ArraySize - len(PcdValue.split(',')))
+ PcdValue += '}'
+ else:
+ if Pcd.MaxSizeUserSet:
+ EdkLogger.error("build", AUTOGEN_ERROR,
+ "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
+ )
+ else:
+ ArraySize = len(PcdValue) + 1
+ PcdItem = '%s.%s|%s|0x%X' % \
+ (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, PatchPcd[1])
+ PcdComments = ''
+ if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
+ PcdComments = '\n '.join(self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName])
+ if PcdComments:
+ PcdItem = PcdComments + '\n ' + PcdItem
+ AsBuiltInfDict['patchablepcd_item'].append(PcdItem)
+
+ for Pcd in Pcds + VfrPcds:
+ PcdCommentList = []
+ HiiInfo = ''
+ TokenCName = Pcd.TokenCName
+ for PcdItem in GlobalData.MixedPcd:
+ if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
+ TokenCName = PcdItem[0]
+ break
+ if Pcd.Type == TAB_PCDS_DYNAMIC_EX_HII:
+ for SkuName in Pcd.SkuInfoList:
+ SkuInfo = Pcd.SkuInfoList[SkuName]
+ HiiInfo = '## %s|%s|%s' % (SkuInfo.VariableName, SkuInfo.VariableGuid, SkuInfo.VariableOffset)
+ break
+ if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
+ PcdCommentList = self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName][:]
+ if HiiInfo:
+ UsageIndex = -1
+ UsageStr = ''
+ for Index, Comment in enumerate(PcdCommentList):
+ for Usage in UsageList:
+ if Comment.find(Usage) != -1:
+ UsageStr = Usage
+ UsageIndex = Index
+ break
+ if UsageIndex != -1:
+ PcdCommentList[UsageIndex] = '## %s %s %s' % (UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
+ else:
+ PcdCommentList.append('## UNDEFINED ' + HiiInfo)
+ PcdComments = '\n '.join(PcdCommentList)
+ PcdEntry = Pcd.TokenSpaceGuidCName + '.' + TokenCName
+ if PcdComments:
+ PcdEntry = PcdComments + '\n ' + PcdEntry
+ AsBuiltInfDict['pcd_item'].append(PcdEntry)
+ for Item in self.BuildOption:
+ if 'FLAGS' in self.BuildOption[Item]:
+ AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_FLAGS = %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, self.Arch, Item, self.BuildOption[Item]['FLAGS'].strip()))
+
+ # Generated LibraryClasses section in comments.
+ for Library in self.LibraryAutoGenList:
+ AsBuiltInfDict['libraryclasses_item'].append(Library.MetaFile.File.replace('\\', '/'))
+
+ # Generated UserExtensions TianoCore section.
+ # All tianocore user extensions are copied.
+ UserExtStr = ''
+ for TianoCore in self._GetTianoCoreUserExtensionList():
+ UserExtStr += '\n'.join(TianoCore)
+ ExtensionFile = os.path.join(self.MetaFile.Dir, TianoCore[1])
+ if os.path.isfile(ExtensionFile):
+ shutil.copy2(ExtensionFile, self.OutputDir)
+ AsBuiltInfDict['userextension_tianocore_item'] = UserExtStr
+
+ # Generated depex expression section in comments.
+ DepexExpression = self._GetDepexExpresionString()
+ AsBuiltInfDict['depexsection_item'] = DepexExpression if DepexExpression else ''
+
+ AsBuiltInf = TemplateString()
+ AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfDict))
+
+ SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
+
+ self.IsAsBuiltInfCreated = True
+
+ def CopyModuleToCache(self):
+ FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+ CreateDirectory (FileDir)
+ HashFile = path.join(self.BuildDir, self.Name + '.hash')
+ if os.path.exists(HashFile):
+ CopyFileOnChange(HashFile, FileDir)
+ ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
+ if os.path.exists(ModuleFile):
+ CopyFileOnChange(ModuleFile, FileDir)
+ if not self.OutputFile:
+ Ma = self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
+ self.OutputFile = Ma.Binaries
+ for File in self.OutputFile:
+ File = str(File)
+ if not os.path.isabs(File):
+ File = os.path.join(self.OutputDir, File)
+ if os.path.exists(File):
+ sub_dir = os.path.relpath(File, self.OutputDir)
+ destination_file = os.path.join(FileDir, sub_dir)
+ destination_dir = os.path.dirname(destination_file)
+ CreateDirectory(destination_dir)
+ CopyFileOnChange(File, destination_dir)
+
+ def AttemptModuleCacheCopy(self):
+ # If library or Module is binary do not skip by hash
+ if self.IsBinaryModule:
+ return False
+ # .inc files contain binary information, so do not skip by hash either
+ for f_ext in self.SourceFileList:
+ if '.inc' in str(f_ext):
+ return False
+ FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+ HashFile = path.join(FileDir, self.Name + '.hash')
+ if os.path.exists(HashFile):
+ with open(HashFile, 'r') as f:
+ CacheHash = f.read()
+ self.GenModuleHash()
+ if GlobalData.gModuleHash[self.Arch][self.Name]:
+ if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
+ for root, dir, files in os.walk(FileDir):
+ for f in files:
+ if self.Name + '.hash' in f:
+ CopyFileOnChange(HashFile, self.BuildDir)
+ else:
+ File = path.join(root, f)
+ sub_dir = os.path.relpath(File, FileDir)
+ destination_file = os.path.join(self.OutputDir, sub_dir)
+ destination_dir = os.path.dirname(destination_file)
+ CreateDirectory(destination_dir)
+ CopyFileOnChange(File, destination_dir)
+ if self.Name == "PcdPeim" or self.Name == "PcdDxe":
+ CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
+ return True
+ return False
+
+ ## Create makefile for the module and its dependent libraries
+ #
+ # @param CreateLibraryMakeFile Flag indicating if or not the makefiles of
+ # dependent libraries will be created
+ #
+ @cached_class_function
+ def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
+ # Nest this function inside its only caller.
+ def CreateTimeStamp():
+ FileSet = {self.MetaFile.Path}
+
+ for SourceFile in self.Module.Sources:
+ FileSet.add (SourceFile.Path)
+
+ for Lib in self.DependentLibraryList:
+ FileSet.add (Lib.MetaFile.Path)
+
+ for f in self.AutoGenDepSet:
+ FileSet.add (f.Path)
+
+ if os.path.exists (self.TimeStampPath):
+ os.remove (self.TimeStampPath)
+ with open(self.TimeStampPath, 'w+') as fd:
+ for f in FileSet:
+ fd.write(f)
+ fd.write("\n")
+
+ # Ignore generating makefile when it is a binary module
+ if self.IsBinaryModule:
+ return
+
+ self.GenFfsList = GenFfsList
+
+ if not self.IsLibrary and CreateLibraryMakeFile:
+ for LibraryAutoGen in self.LibraryAutoGenList:
+ LibraryAutoGen.CreateMakeFile()
+ # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
+ if not GlobalData.gUseHashCache and self.CanSkip():
+ return
+
+ if len(self.CustomMakefile) == 0:
+ Makefile = GenMake.ModuleMakefile(self)
+ else:
+ Makefile = GenMake.CustomMakefile(self)
+ if Makefile.Generate():
+ EdkLogger.debug(EdkLogger.DEBUG_9, "Generated makefile for module %s [%s]" %
+ (self.Name, self.Arch))
+ else:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation of makefile for module %s [%s]" %
+ (self.Name, self.Arch))
+
+ CreateTimeStamp()
+
+ def CopyBinaryFiles(self):
+ for File in self.Module.Binaries:
+ SrcPath = File.Path
+ DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
+ CopyLongFilePath(SrcPath, DstPath)
+ ## Create autogen code for the module and its dependent libraries
+ #
+ # @param CreateLibraryCodeFile Flag indicating if or not the code of
+ # dependent libraries will be created
+ #
+ def CreateCodeFile(self, CreateLibraryCodeFile=True):
+ if self.IsCodeFileCreated:
+ return
+
+ # The PcdDatabase must still be generated even when the PCD driver is a binary module
+ if self.IsBinaryModule and self.PcdIsDriver != '':
+ CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
+ return
+ if self.IsBinaryModule:
+ if self.IsLibrary:
+ self.CopyBinaryFiles()
+ return
+
+ if not self.IsLibrary and CreateLibraryCodeFile:
+ for LibraryAutoGen in self.LibraryAutoGenList:
+ LibraryAutoGen.CreateCodeFile()
+
+ # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
+ if not GlobalData.gUseHashCache and self.CanSkip():
+ return
+
+ AutoGenList = []
+ IgoredAutoGenList = []
+
+ for File in self.AutoGenFileList:
+ if GenC.Generate(File.Path, self.AutoGenFileList[File], File.IsBinary):
+ AutoGenList.append(str(File))
+ else:
+ IgoredAutoGenList.append(str(File))
+
+
+ for ModuleType in self.DepexList:
+ # Skip an empty [depex] section, or one in a SUP_MODULE_USER_DEFINED or SUP_MODULE_HOST_APPLICATION module
+ if len(self.DepexList[ModuleType]) == 0 or ModuleType == SUP_MODULE_USER_DEFINED or ModuleType == SUP_MODULE_HOST_APPLICATION:
+ continue
+
+ Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
+ DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
+
+ if len(Dpx.PostfixNotation) != 0:
+ self.DepexGenerated = True
+
+ if Dpx.Generate(path.join(self.OutputDir, DpxFile)):
+ AutoGenList.append(str(DpxFile))
+ else:
+ IgoredAutoGenList.append(str(DpxFile))
+
+ if IgoredAutoGenList == []:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] files for module %s [%s]" %
+ (" ".join(AutoGenList), self.Name, self.Arch))
+ elif AutoGenList == []:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation of [%s] files for module %s [%s]" %
+ (" ".join(IgoredAutoGenList), self.Name, self.Arch))
+ else:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] (skipped %s) files for module %s [%s]" %
+ (" ".join(AutoGenList), " ".join(IgoredAutoGenList), self.Name, self.Arch))
+
+ self.IsCodeFileCreated = True
+ return AutoGenList
+
+ ## Summarize the ModuleAutoGen objects of all libraries used by this module
+ @cached_property
+ def LibraryAutoGenList(self):
+ RetVal = []
+ for Library in self.DependentLibraryList:
+ La = ModuleAutoGen(
+ self.Workspace,
+ Library.MetaFile,
+ self.BuildTarget,
+ self.ToolChain,
+ self.Arch,
+ self.PlatformInfo.MetaFile,
+ self.DataPipe
+ )
+ La.IsLibrary = True
+ if La not in RetVal:
+ RetVal.append(La)
+ for Lib in La.CodaTargetList:
+ self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE)
+ return RetVal
+
+ def GenModuleHash(self):
+ # Initialize a dictionary for each arch type
+ if self.Arch not in GlobalData.gModuleHash:
+ GlobalData.gModuleHash[self.Arch] = {}
+
+ # Early exit if module or library has been hashed and is in memory
+ if self.Name in GlobalData.gModuleHash[self.Arch]:
+ return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
+
+ # Initialize hash object
+ m = hashlib.md5()
+
+ # Add Platform level hash
+ m.update(GlobalData.gPlatformHash.encode('utf-8'))
+
+ # Add Package level hash
+ if self.DependentPackageList:
+ for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
+ if Pkg.PackageName in GlobalData.gPackageHash:
+ m.update(GlobalData.gPackageHash[Pkg.PackageName].encode('utf-8'))
+
+ # Add Library hash
+ if self.LibraryAutoGenList:
+ for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
+ if Lib.Name not in GlobalData.gModuleHash[self.Arch]:
+ Lib.GenModuleHash()
+ m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].encode('utf-8'))
+
+ # Add Module self
+ with open(str(self.MetaFile), 'rb') as f:
+ Content = f.read()
+ m.update(Content)
+
+ # Add Module's source files
+ if self.SourceFileList:
+ for File in sorted(self.SourceFileList, key=lambda x: str(x)):
+ with open(str(File), 'rb') as f:
+ Content = f.read()
+ m.update(Content)
+
+ GlobalData.gModuleHash[self.Arch][self.Name] = m.hexdigest()
+
+ return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
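GenModuleHash folds several layers into one digest: the platform hash, sorted package hashes, sorted library hashes, and finally the module's INF and source file contents. A self-contained sketch of the same layering, with plain dict/list stand-ins for the real GlobalData caches:

```python
import hashlib

def gen_module_hash(platform_hash, package_hashes, library_hashes, file_blobs):
    """Illustrative reimplementation of the layered MD5 scheme:
    platform -> packages -> libraries -> module file contents."""
    m = hashlib.md5()
    m.update(platform_hash.encode('utf-8'))
    for pkg in sorted(package_hashes):        # sorted for a deterministic digest
        m.update(package_hashes[pkg].encode('utf-8'))
    for lib in sorted(library_hashes):
        m.update(library_hashes[lib].encode('utf-8'))
    for blob in file_blobs:                   # INF first, then sorted sources
        m.update(blob)
    return m.hexdigest()
```

Sorting the package and library names is what makes the digest stable across runs, which is essential once multiple AutoGen processes compute hashes independently.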
+
+ ## Decide whether we can skip the ModuleAutoGen process
+ def CanSkipbyHash(self):
+ # Hashing feature is off
+ if not GlobalData.gUseHashCache:
+ return False
+
+ # Initialize a dictionary for each arch type
+ if self.Arch not in GlobalData.gBuildHashSkipTracking:
+ GlobalData.gBuildHashSkipTracking[self.Arch] = dict()
+
+ # If library or Module is binary do not skip by hash
+ if self.IsBinaryModule:
+ return False
+
+ # .inc files contain binary information, so do not skip by hash either
+ for f_ext in self.SourceFileList:
+ if '.inc' in str(f_ext):
+ return False
+
+ # Use Cache, if exists and if Module has a copy in cache
+ if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy():
+ return True
+
+ # Early exit for libraries that haven't yet finished building
+ HashFile = path.join(self.BuildDir, self.Name + ".hash")
+ if self.IsLibrary and not os.path.exists(HashFile):
+ return False
+
+ # Return a Boolean based on whether we can skip by hash, either from memory or from IO.
+ if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch]:
+ # If hashes are the same, SaveFileOnChange() will return False.
+ GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = not SaveFileOnChange(HashFile, self.GenModuleHash(), True)
+ return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
+
+ ## Decide whether we can skip the ModuleAutoGen process
+ # If any source file is newer than the module, then we cannot skip
+ #
+ def CanSkip(self):
+ if self.MakeFileDir in GlobalData.gSikpAutoGenCache:
+ return True
+ if not os.path.exists(self.TimeStampPath):
+ return False
+ #last creation time of the module
+ DstTimeStamp = os.stat(self.TimeStampPath)[8]
+
+ SrcTimeStamp = self.Workspace._SrcTimeStamp
+ if SrcTimeStamp > DstTimeStamp:
+ return False
+
+ with open(self.TimeStampPath,'r') as f:
+ for source in f:
+ source = source.rstrip('\n')
+ if not os.path.exists(source):
+ return False
+ if source not in ModuleAutoGen.TimeDict :
+ ModuleAutoGen.TimeDict[source] = os.stat(source)[8]
+ if ModuleAutoGen.TimeDict[source] > DstTimeStamp:
+ return False
+ GlobalData.gSikpAutoGenCache.add(self.MakeFileDir)
+ return True
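CanSkip compares each source path recorded in the AutoGenTimeStamp file against that file's own mtime, caching `stat()` results so repeated modules do not re-stat shared sources. A hedged standalone sketch of the core check (names illustrative; the real code additionally consults `GlobalData.gSikpAutoGenCache` and the workspace source timestamp):

```python
import os, tempfile   # tempfile is only used by the usage example below

def can_skip(timestamp_path, time_cache):
    """Return True when no source listed in timestamp_path is newer than it."""
    if not os.path.exists(timestamp_path):
        return False
    dst_mtime = os.stat(timestamp_path).st_mtime
    with open(timestamp_path, 'r') as f:
        for source in f:
            source = source.rstrip('\n')
            if not source:
                continue
            if not os.path.exists(source):
                return False
            # cache stat() results across modules that share sources
            if source not in time_cache:
                time_cache[source] = os.stat(source).st_mtime
            if time_cache[source] > dst_mtime:
                return False
    return True
```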
+
+ @cached_property
+ def TimeStampPath(self):
+ return os.path.join(self.MakeFileDir, 'AutoGenTimeStamp')
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
new file mode 100644
index 000000000000..5186ca1da3e3
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
@@ -0,0 +1,616 @@
+## @file
+# Helper classes for ModuleAutoGen used during multiple-process AutoGen
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+from __future__ import absolute_import
+from Workspace.WorkspaceDatabase import WorkspaceDatabase,BuildDB
+from Common.caching import cached_property
+from AutoGen.BuildEngine import BuildRule,AutoGenReqBuildRuleVerNum
+from AutoGen.AutoGen import CalculatePriorityValue
+from Common.Misc import CheckPcdDatum,GuidValue
+from Common.Expression import ValueExpressionEx
+from Common.DataType import *
+from CommonDataClass.Exceptions import *
+from CommonDataClass.CommonClass import SkuInfoClass
+import Common.EdkLogger as EdkLogger
+from Common.BuildToolError import OPTION_CONFLICT,FORMAT_INVALID,RESOURCE_NOT_AVAILABLE
+from Common.MultipleWorkspace import MultipleWorkspace as mws
+from collections import defaultdict
+from Common.Misc import PathClass
+import os
+
+
+#
+# The priority list used when overriding build options
+#
+PrioList = {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
+ "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+ "0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+ "0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+ "0x10011" : 10, # TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
+ "0x00011" : 9, # ******_*********_****_COMMANDTYPE_ATTRIBUTE
+ "0x11101" : 8, # TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+ "0x01101" : 7, # ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+ "0x10101" : 6, # TARGET_*********_ARCH_***********_ATTRIBUTE
+ "0x00101" : 5, # ******_*********_ARCH_***********_ATTRIBUTE
+ "0x11001" : 4, # TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
+ "0x01001" : 3, # ******_TOOLCHAIN_****_***********_ATTRIBUTE
+ "0x10001" : 2, # TARGET_*********_****_***********_ATTRIBUTE
+ "0x00001" : 1} # ******_*********_****_***********_ATTRIBUTE (Lowest)
+## Base class for AutoGen
+#
+# This class just implements the cache mechanism of AutoGen objects.
+#
+class AutoGenInfo(object):
+ # database to maintain the objects in each child class
+ __ObjectCache = {} # (BuildTarget, ToolChain, ARCH, platform file): AutoGen object
+
+ ## Factory method
+ #
+ # @param Class class object of real AutoGen class
+ # (WorkspaceAutoGen, ModuleAutoGen or PlatformAutoGen)
+ # @param Workspace Workspace directory or WorkspaceAutoGen object
+ # @param MetaFile The path of meta file
+ # @param Target Build target
+ # @param Toolchain Tool chain name
+ # @param Arch Target arch
+ # @param *args The specific class related parameters
+ # @param **kwargs The specific class related dict parameters
+ #
+ @classmethod
+ def GetCache(cls):
+ return cls.__ObjectCache
+ def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
+ # check if the object has been created
+ Key = (Target, Toolchain, Arch, MetaFile)
+ if Key in cls.__ObjectCache:
+ # if it exists, just return it directly
+ return cls.__ObjectCache[Key]
+ # it didn't exist; create it, cache it, then return it
+ RetVal = cls.__ObjectCache[Key] = super(AutoGenInfo, cls).__new__(cls)
+ return RetVal
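The `__new__` override above is a keyed-singleton pattern: the same (Target, Toolchain, Arch, MetaFile) tuple always yields the same instance. A minimal self-contained version showing the behavior (class and parameter names are illustrative):

```python
class CachedInfo(object):
    """Keyed singleton: repeat construction with the same key returns
    the cached instance instead of a new object."""
    _cache = {}

    def __new__(cls, workspace, meta_file, target, toolchain, arch):
        key = (target, toolchain, arch, meta_file)
        if key in cls._cache:
            return cls._cache[key]
        obj = super(CachedInfo, cls).__new__(cls)
        cls._cache[key] = obj
        return obj
```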
+
+
+ ## hash() operator
+ #
+ # The file path of platform file will be used to represent hash value of this object
+ #
+ # @retval int Hash value of the file path of platform file
+ #
+ def __hash__(self):
+ return hash(self.MetaFile)
+
+ ## str() operator
+ #
+ # The file path of platform file will be used to represent this object
+ #
+ # @retval string String of platform file path
+ #
+ def __str__(self):
+ return str(self.MetaFile)
+
+ ## "==" operator
+ def __eq__(self, Other):
+ return Other and self.MetaFile == Other
+
+ ## Expand * in build option key
+ #
+ # @param Options Options to be expanded
+ # @param ToolDef Use specified ToolDef instead of full version.
+ # This is needed during initialization to prevent
+ # infinite recursion between BuildOptions,
+ # ToolDefinition, and this function.
+ #
+ # @retval options Options expanded
+ #
+ def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
+ if not ToolDef:
+ ToolDef = self.ToolDefinition
+ BuildOptions = {}
+ FamilyMatch = False
+ FamilyIsNull = True
+
+ OverrideList = {}
+ #
+ # Construct a list containing the build options which need to be overridden.
+ #
+ for Key in Options:
+ #
+ # Key[0] -- tool family
+ # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+ #
+ if (Key[0] == self.BuildRuleFamily and
+ (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
+ Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
+ if (Target == self.BuildTarget or Target == TAB_STAR) and\
+ (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
+ (Arch == self.Arch or Arch == TAB_STAR) and\
+ Options[Key].startswith("="):
+
+ if OverrideList.get(Key[1]) is not None:
+ OverrideList.pop(Key[1])
+ OverrideList[Key[1]] = Options[Key]
+
+ #
+ # Use the highest priority value.
+ #
+ if (len(OverrideList) >= 2):
+ KeyList = list(OverrideList.keys())
+ for Index in range(len(KeyList)):
+ NowKey = KeyList[Index]
+ Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
+ for Index1 in range(len(KeyList) - Index - 1):
+ NextKey = KeyList[Index1 + Index + 1]
+ #
+ # Compare two Key, if one is included by another, choose the higher priority one
+ #
+ Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
+ if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
+ (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
+ (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
+ (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
+ (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
+
+ if CalculatePriorityValue(NowKey) > CalculatePriorityValue(NextKey):
+ if Options.get((self.BuildRuleFamily, NextKey)) is not None:
+ Options.pop((self.BuildRuleFamily, NextKey))
+ else:
+ if Options.get((self.BuildRuleFamily, NowKey)) is not None:
+ Options.pop((self.BuildRuleFamily, NowKey))
+
+ for Key in Options:
+ if ModuleStyle is not None and len (Key) > 2:
+ # Check Module style is EDK or EDKII.
+ # Only append build option for the matched style module.
+ if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
+ continue
+ elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
+ continue
+ Family = Key[0]
+ Target, Tag, Arch, Tool, Attr = Key[1].split("_")
+ # if tool chain family doesn't match, skip it
+ if Tool in ToolDef and Family != "":
+ FamilyIsNull = False
+ if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY, "") != "":
+ if Family != ToolDef[Tool][TAB_TOD_DEFINES_BUILDRULEFAMILY]:
+ continue
+ elif Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
+ continue
+ FamilyMatch = True
+ # expand any wildcard
+ if Target == TAB_STAR or Target == self.BuildTarget:
+ if Tag == TAB_STAR or Tag == self.ToolChain:
+ if Arch == TAB_STAR or Arch == self.Arch:
+ if Tool not in BuildOptions:
+ BuildOptions[Tool] = {}
+ if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
+ BuildOptions[Tool][Attr] = Options[Key]
+ else:
+ # append options for the same tool except PATH
+ if Attr != 'PATH':
+ BuildOptions[Tool][Attr] += " " + Options[Key]
+ else:
+ BuildOptions[Tool][Attr] = Options[Key]
+ # The build option family has already been checked, so it needn't be checked again.
+ if FamilyMatch or FamilyIsNull:
+ return BuildOptions
+
+ for Key in Options:
+ if ModuleStyle is not None and len (Key) > 2:
+ # Check Module style is EDK or EDKII.
+ # Only append build option for the matched style module.
+ if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
+ continue
+ elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
+ continue
+ Family = Key[0]
+ Target, Tag, Arch, Tool, Attr = Key[1].split("_")
+ # if tool chain family doesn't match, skip it
+ if Tool not in ToolDef or Family == "":
+ continue
+ # skip options whose family doesn't match the tool's FAMILY definition
+ if Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
+ continue
+
+ # expand any wildcard
+ if Target == TAB_STAR or Target == self.BuildTarget:
+ if Tag == TAB_STAR or Tag == self.ToolChain:
+ if Arch == TAB_STAR or Arch == self.Arch:
+ if Tool not in BuildOptions:
+ BuildOptions[Tool] = {}
+ if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
+ BuildOptions[Tool][Attr] = Options[Key]
+ else:
+ # append options for the same tool except PATH
+ if Attr != 'PATH':
+ BuildOptions[Tool][Attr] += " " + Options[Key]
+ else:
+ BuildOptions[Tool][Attr] = Options[Key]
+ return BuildOptions
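The wildcard expansion at the heart of both passes reduces to one predicate: a key field applies when it equals the current build-context value or is '*' (TAB_STAR in the real code). A small sketch of that predicate with illustrative names:

```python
def option_applies(key, target, toolchain, arch):
    """True when the TARGET/TOOLCHAIN/ARCH fields of a build-option key
    match the current context, treating '*' as a wildcard."""
    k_target, k_tag, k_arch, tool, attr = key.split('_')

    def match(field, value):
        return field == '*' or field == value

    return (match(k_target, target) and
            match(k_tag, toolchain) and
            match(k_arch, arch))
```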
+#
+# This class is the pruned WorkspaceAutoGen used by ModuleAutoGen in multiple-process AutoGen
+#
+class WorkSpaceInfo(AutoGenInfo):
+ def __init__(self,Workspace, MetaFile, Target, ToolChain, Arch):
+ self._SrcTimeStamp = 0
+ self.Db = BuildDB
+ self.BuildDatabase = self.Db.BuildObject
+ self.Target = Target
+ self.ToolChain = ToolChain
+ self.WorkspaceDir = Workspace
+ self.ActivePlatform = MetaFile
+ self.ArchList = Arch
+
+
+class PlatformInfo(AutoGenInfo):
+ def __init__(self, Workspace, MetaFile, Target, ToolChain, Arch,DataPipe):
+ self.Wa = Workspace
+ self.WorkspaceDir = self.Wa.WorkspaceDir
+ self.MetaFile = MetaFile
+ self.Arch = Arch
+ self.Target = Target
+ self.BuildTarget = Target
+ self.ToolChain = ToolChain
+ self.Platform = self.Wa.BuildDatabase[self.MetaFile, self.Arch, self.Target, self.ToolChain]
+
+ self.SourceDir = MetaFile.SubDir
+ self.DataPipe = DataPipe
+ @cached_property
+ def _AsBuildModuleList(self):
+ retVal = self.DataPipe.Get("AsBuildModuleList")
+ if retVal is None:
+ retVal = {}
+ return retVal
+
+ ## Test if a module is supported by the platform
+ #
+ # @retval True if the module is one of the platform's modules, library
+ # instances or as-built modules; False otherwise
+ #
+ def ValidModule(self, Module):
+ return Module in self.Platform.Modules or Module in self.Platform.LibraryInstances \
+ or Module in self._AsBuildModuleList
+
+ @cached_property
+ def ToolChainFamily(self):
+ retVal = self.DataPipe.Get("ToolChainFamily")
+ if retVal is None:
+ retVal = {}
+ return retVal
+
+ @cached_property
+ def BuildRuleFamily(self):
+ retVal = self.DataPipe.Get("BuildRuleFamily")
+ if retVal is None:
+ retVal = {}
+ return retVal
+
+ @cached_property
+ def _MbList(self):
+ return [self.Wa.BuildDatabase[m, self.Arch, self.BuildTarget, self.ToolChain] for m in self.Platform.Modules]
+
+ @cached_property
+ def PackageList(self):
+ RetVal = set()
+ for dec_file,Arch in self.DataPipe.Get("PackageList"):
+ RetVal.add(self.Wa.BuildDatabase[dec_file,Arch,self.BuildTarget, self.ToolChain])
+ return list(RetVal)
+
+ ## Return the directory to store all intermediate and final files built
+ @cached_property
+ def BuildDir(self):
+ if os.path.isabs(self.OutputDir):
+ RetVal = os.path.join(
+ os.path.abspath(self.OutputDir),
+ self.Target + "_" + self.ToolChain,
+ )
+ else:
+ RetVal = os.path.join(
+ self.WorkspaceDir,
+ self.OutputDir,
+ self.Target + "_" + self.ToolChain,
+ )
+ return RetVal
+
+ ## Return the build output directory platform specifies
+ @cached_property
+ def OutputDir(self):
+ return self.Platform.OutputDirectory
+
+ ## Return platform name
+ @cached_property
+ def Name(self):
+ return self.Platform.PlatformName
+
+ ## Return meta-file GUID
+ @cached_property
+ def Guid(self):
+ return self.Platform.Guid
+
+ ## Return platform version
+ @cached_property
+ def Version(self):
+ return self.Platform.Version
+
+ ## Return paths of tools
+ @cached_property
+ def ToolDefinition(self):
+ retVal = self.DataPipe.Get("TOOLDEF")
+ if retVal is None:
+ retVal = {}
+ return retVal
+
+ ## Return build command string
+ #
+ # @retval string Build command string
+ #
+ @cached_property
+ def BuildCommand(self):
+ retVal = self.DataPipe.Get("BuildCommand")
+ if retVal is None:
+ retVal = []
+ return retVal
+
+ @cached_property
+ def PcdTokenNumber(self):
+ retVal = self.DataPipe.Get("PCD_TNUM")
+ if retVal is None:
+ retVal = {}
+ return retVal
+
+ ## Override PCD setting (type, value, ...)
+ #
+ # @param ToPcd The PCD to be overridden
+ # @param FromPcd The PCD to override ToPcd from
+ #
+ def _OverridePcd(self, ToPcd, FromPcd, Module="", Msg="", Library=""):
+ #
+ # In case there are PCDs coming from the FDF file which have no type given,
+ # ToPcd.Type at this point holds the type found in the dependent package
+ #
+ TokenCName = ToPcd.TokenCName
+ for PcdItem in self.MixedPcd:
+ if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in self.MixedPcd[PcdItem]:
+ TokenCName = PcdItem[0]
+ break
+ if FromPcd is not None:
+ if ToPcd.Pending and FromPcd.Type:
+ ToPcd.Type = FromPcd.Type
+ elif ToPcd.Type and FromPcd.Type\
+ and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
+ if ToPcd.Type.strip() == TAB_PCDS_DYNAMIC_EX:
+ ToPcd.Type = FromPcd.Type
+ elif ToPcd.Type and FromPcd.Type \
+ and ToPcd.Type != FromPcd.Type:
+ if Library:
+ Module = str(Module) + " 's library file (" + str(Library) + ")"
+ EdkLogger.error("build", OPTION_CONFLICT, "Mismatched PCD type",
+ ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."\
+ % (ToPcd.TokenSpaceGuidCName, TokenCName,
+ ToPcd.Type, Module, FromPcd.Type, Msg),
+ File=self.MetaFile)
+
+ if FromPcd.MaxDatumSize:
+ ToPcd.MaxDatumSize = FromPcd.MaxDatumSize
+ ToPcd.MaxSizeUserSet = FromPcd.MaxDatumSize
+ if FromPcd.DefaultValue:
+ ToPcd.DefaultValue = FromPcd.DefaultValue
+ if FromPcd.TokenValue:
+ ToPcd.TokenValue = FromPcd.TokenValue
+ if FromPcd.DatumType:
+ ToPcd.DatumType = FromPcd.DatumType
+ if FromPcd.SkuInfoList:
+ ToPcd.SkuInfoList = FromPcd.SkuInfoList
+ if FromPcd.UserDefinedDefaultStoresFlag:
+ ToPcd.UserDefinedDefaultStoresFlag = FromPcd.UserDefinedDefaultStoresFlag
+ # Add Flexible PCD format parse
+ if ToPcd.DefaultValue:
+ try:
+ ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self._GuidDict)(True)
+ except BadExpression as Value:
+ EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
+ File=self.MetaFile)
+
+ # check the validity of the datum
+ IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
+ if not IsValid:
+ EdkLogger.error('build', FORMAT_INVALID, Cause, File=self.MetaFile,
+ ExtraData="%s.%s" % (ToPcd.TokenSpaceGuidCName, TokenCName))
+ ToPcd.validateranges = FromPcd.validateranges
+ ToPcd.validlists = FromPcd.validlists
+ ToPcd.expressions = FromPcd.expressions
+ ToPcd.CustomAttribute = FromPcd.CustomAttribute
+
+ if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
+ % (ToPcd.TokenSpaceGuidCName, TokenCName))
+ Value = ToPcd.DefaultValue
+ if not Value:
+ ToPcd.MaxDatumSize = '1'
+ elif Value[0] == 'L':
+ ToPcd.MaxDatumSize = str((len(Value) - 2) * 2)
+ elif Value[0] == '{':
+ ToPcd.MaxDatumSize = str(len(Value.split(',')))
+ else:
+ ToPcd.MaxDatumSize = str(len(Value) - 1)
+
+ # apply default SKU for dynamic PCDs if the specified one is not available
+ if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
+ and not ToPcd.SkuInfoList:
+ if self.Platform.SkuName in self.Platform.SkuIds:
+ SkuName = self.Platform.SkuName
+ else:
+ SkuName = TAB_DEFAULT
+ ToPcd.SkuInfoList = {
+ SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
+ }
+
+ def ApplyPcdSetting(self, Module, Pcds, Library=""):
+ # for each PCD in module
+ for Name, Guid in Pcds:
+ PcdInModule = Pcds[Name, Guid]
+ # find out the PCD setting in platform
+ if (Name, Guid) in self.Pcds:
+ PcdInPlatform = self.Pcds[Name, Guid]
+ else:
+ PcdInPlatform = None
+ # then override the settings if any
+ self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
+ # resolve the VariableGuid value
+ for SkuId in PcdInModule.SkuInfoList:
+ Sku = PcdInModule.SkuInfoList[SkuId]
+ if Sku.VariableGuid == '': continue
+ Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
+ if Sku.VariableGuidValue is None:
+ PackageList = "\n\t".join(str(P) for P in self.PackageList)
+ EdkLogger.error(
+ 'build',
+ RESOURCE_NOT_AVAILABLE,
+ "Value of GUID [%s] is not found in" % Sku.VariableGuid,
+ ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
+ % (Guid, Name, str(Module)),
+ File=self.MetaFile
+ )
+
+ # override PCD settings with module specific setting
+ if Module in self.Platform.Modules:
+ PlatformModule = self.Platform.Modules[str(Module)]
+ for Key in PlatformModule.Pcds:
+ if self.BuildOptionPcd:
+ for pcd in self.BuildOptionPcd:
+ (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
+ if (TokenCName, TokenSpaceGuidCName) == Key and FieldName =="":
+ PlatformModule.Pcds[Key].DefaultValue = pcdvalue
+ PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
+ break
+ Flag = False
+ if Key in Pcds:
+ ToPcd = Pcds[Key]
+ Flag = True
+ elif Key in self.MixedPcd:
+ for PcdItem in self.MixedPcd[Key]:
+ if PcdItem in Pcds:
+ ToPcd = Pcds[PcdItem]
+ Flag = True
+ break
+ if Flag:
+ self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
+ # use PCD value to calculate the MaxDatumSize when it is not specified
+ for Name, Guid in Pcds:
+ Pcd = Pcds[Name, Guid]
+ if Pcd.DatumType == TAB_VOID and not Pcd.MaxDatumSize:
+ Pcd.MaxSizeUserSet = None
+ Value = Pcd.DefaultValue
+ if not Value:
+ Pcd.MaxDatumSize = '1'
+ elif Value[0] == 'L':
+ Pcd.MaxDatumSize = str((len(Value) - 2) * 2)
+ elif Value[0] == '{':
+ Pcd.MaxDatumSize = str(len(Value.split(',')))
+ else:
+ Pcd.MaxDatumSize = str(len(Value) - 1)
+ return list(Pcds.values())
+
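As a reviewer-facing aside (not part of the patch): the MaxDatumSize defaulting rule that both `_OverridePcd` and `ApplyPcdSetting` apply for VOID* PCDs without an explicit size can be summarized by this standalone sketch. The function name is mine, purely illustrative.

```python
# Illustrative sketch of the VOID* MaxDatumSize default used above:
# unicode strings take two bytes per character plus a null terminator,
# byte arrays take one byte per comma-separated element, and ASCII
# strings take one byte per character plus a null terminator.
def default_max_datum_size(value):
    if not value:
        return '1'
    if value[0] == 'L':                # L"abc": len - 2 counts chars + null, 2 bytes each
        return str((len(value) - 2) * 2)
    if value[0] == '{':                # {0x01, 0x02, ...}: one byte per element
        return str(len(value.split(',')))
    return str(len(value) - 1)         # "abc": chars plus terminating null
```

For example, `L"abc"` yields 8 (three UCS-2 characters plus a null), while `"abc"` yields 4.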
+ @cached_property
+ def Pcds(self):
+ PlatformPcdData = self.DataPipe.Get("PLA_PCD")
+# for pcd in PlatformPcdData:
+# for skuid in pcd.SkuInfoList:
+# pcd.SkuInfoList[skuid] = self.CreateSkuInfoFromDict(pcd.SkuInfoList[skuid])
+ return {(pcddata.TokenCName,pcddata.TokenSpaceGuidCName):pcddata for pcddata in PlatformPcdData}
+
+ def CreateSkuInfoFromDict(self,SkuInfoDict):
+ return SkuInfoClass(
+ SkuInfoDict.get("SkuIdName"),
+ SkuInfoDict.get("SkuId"),
+ SkuInfoDict.get("VariableName"),
+ SkuInfoDict.get("VariableGuid"),
+ SkuInfoDict.get("VariableOffset"),
+ SkuInfoDict.get("HiiDefaultValue"),
+ SkuInfoDict.get("VpdOffset"),
+ SkuInfoDict.get("DefaultValue"),
+ SkuInfoDict.get("VariableGuidValue"),
+ SkuInfoDict.get("VariableAttribute",""),
+ SkuInfoDict.get("DefaultStore",None)
+ )
+ @cached_property
+ def MixedPcd(self):
+ return self.DataPipe.Get("MixedPcd")
+ @cached_property
+ def _GuidDict(self):
+ RetVal = self.DataPipe.Get("GuidDict")
+ if RetVal is None:
+ RetVal = {}
+ return RetVal
+ @cached_property
+ def BuildOptionPcd(self):
+ return self.DataPipe.Get("BuildOptPcd")
+ def ApplyBuildOption(self,module):
+ PlatformOptions = self.DataPipe.Get("PLA_BO")
+ ModuleBuildOptions = self.DataPipe.Get("MOL_BO")
+ ModuleOptionFromDsc = ModuleBuildOptions.get((module.MetaFile.File,module.MetaFile.Root))
+ if ModuleOptionFromDsc:
+ ModuleTypeOptions, PlatformModuleOptions = ModuleOptionFromDsc["ModuleTypeOptions"],ModuleOptionFromDsc["PlatformModuleOptions"]
+ else:
+ ModuleTypeOptions, PlatformModuleOptions = {}, {}
+ ToolDefinition = self.DataPipe.Get("TOOLDEF")
+ ModuleOptions = self._ExpandBuildOption(module.BuildOptions)
+ BuildRuleOrder = None
+ for Options in [ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
+ for Tool in Options:
+ for Attr in Options[Tool]:
+ if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
+ BuildRuleOrder = Options[Tool][Attr]
+
+ AllTools = set(list(ModuleOptions.keys()) + list(PlatformOptions.keys()) +
+ list(PlatformModuleOptions.keys()) + list(ModuleTypeOptions.keys()) +
+ list(ToolDefinition.keys()))
+ BuildOptions = defaultdict(lambda: defaultdict(str))
+ for Tool in AllTools:
+ for Options in [ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
+ if Tool not in Options:
+ continue
+ for Attr in Options[Tool]:
+ #
+ # Do not generate it in Makefile
+ #
+ if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
+ continue
+ Value = Options[Tool][Attr]
+ # check if override is indicated
+ if Value.startswith('='):
+ BuildOptions[Tool][Attr] = mws.handleWsMacro(Value[1:])
+ else:
+ if Attr != 'PATH':
+ BuildOptions[Tool][Attr] += " " + mws.handleWsMacro(Value)
+ else:
+ BuildOptions[Tool][Attr] = mws.handleWsMacro(Value)
+
+ return BuildOptions, BuildRuleOrder
+
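The layered merge performed by `ApplyBuildOption` above has a simple core rule worth spelling out for review. The sketch below is not the BaseTools code (it omits the workspace-macro expansion done by `mws.handleWsMacro`); names are illustrative.

```python
# Hedged sketch of the ApplyBuildOption merge rule: later option layers
# append to a tool attribute's value, unless the value starts with '='
# (an explicit override) or the attribute is PATH (always replaced).
from collections import defaultdict

def merge_options(option_layers):
    merged = defaultdict(lambda: defaultdict(str))
    for layer in option_layers:
        for tool, attrs in layer.items():
            for attr, value in attrs.items():
                if value.startswith('='):
                    merged[tool][attr] = value[1:]      # '=' discards earlier layers
                elif attr != 'PATH':
                    merged[tool][attr] += " " + value   # accumulate flags
                else:
                    merged[tool][attr] = value          # PATH is never concatenated
    return merged
```

So a module-scoped `=/Od` wins over platform-level `/W4 /O2`, while plain values accumulate left to right.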
+ def ApplyLibraryInstance(self,module):
+ alldeps = self.DataPipe.Get("DEPS")
+ if alldeps is None:
+ alldeps = {}
+ mod_libs = alldeps.get((module.MetaFile.File,module.MetaFile.Root,module.Arch),[])
+ retVal = []
+ for (file_path,root,arch) in mod_libs:
+ retVal.append(self.Wa.BuildDatabase[PathClass(file_path,root), arch, self.Target,self.ToolChain])
+ return retVal
+
+ ## Parse build_rule.txt in Conf Directory.
+ #
+ # @retval BuildRule object
+ #
+ @cached_property
+ def BuildRule(self):
+ WInfo = self.DataPipe.Get("P_Info")
+ RetVal = WInfo.get("BuildRuleFile")
+ if RetVal._FileVersion == "":
+ RetVal._FileVersion = AutoGenReqBuildRuleVerNum
+ return RetVal
diff --git a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
new file mode 100644
index 000000000000..6360d4cbd86b
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
@@ -0,0 +1,1493 @@
+## @file
+# Create makefile for MS nmake and GNU make
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+
+## Import Modules
+#
+from __future__ import print_function
+from __future__ import absolute_import
+import os.path as path
+import copy
+from collections import defaultdict
+
+from .BuildEngine import BuildRule,gDefaultBuildRuleFile,AutoGenReqBuildRuleVerNum
+from .GenVar import VariableMgr, var_info
+from . import GenMake
+from AutoGen.DataPipe import MemoryDataPipe
+from AutoGen.ModuleAutoGen import ModuleAutoGen
+from AutoGen.AutoGen import AutoGen
+from AutoGen.AutoGen import CalculatePriorityValue
+from Workspace.WorkspaceCommon import GetModuleLibInstances
+from CommonDataClass.CommonClass import SkuInfoClass
+from Common.caching import cached_class_function
+from Common.Expression import ValueExpressionEx
+from Common.StringUtils import StringToArray,NormPath
+from Common.BuildToolError import *
+from Common.DataType import *
+from Common.Misc import *
+import Common.VpdInfoFile as VpdInfoFile
+
+## Split a command line option string into a list
+#
+# subprocess.Popen needs the args to be a sequence; otherwise there are
+# problems launching commands on non-Windows platforms.
+#
+def _SplitOption(OptionString):
+ OptionList = []
+ LastChar = " "
+ OptionStart = 0
+ QuotationMark = ""
+ for Index in range(0, len(OptionString)):
+ CurrentChar = OptionString[Index]
+ if CurrentChar in ['"', "'"]:
+ if QuotationMark == CurrentChar:
+ QuotationMark = ""
+ elif QuotationMark == "":
+ QuotationMark = CurrentChar
+ continue
+ elif QuotationMark:
+ continue
+
+ if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r", "\n"]:
+ if Index > OptionStart:
+ OptionList.append(OptionString[OptionStart:Index - 1])
+ OptionStart = Index
+ LastChar = CurrentChar
+ OptionList.append(OptionString[OptionStart:])
+ return OptionList
+
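For reviewers, the behavior of `_SplitOption` can be exercised with this self-contained sketch (a standalone re-statement of the same logic, not the patch code itself): a new option starts at a `/` or `-` that follows whitespace, and quoted regions are never split.

```python
# Standalone sketch mirroring _SplitOption: split an option string at
# '/' or '-' characters that follow whitespace, keeping quoted spans intact.
def split_option(option_string):
    options = []
    last_char = " "
    start = 0
    quote = ""
    for index, current in enumerate(option_string):
        if current in ('"', "'"):
            if quote == current:
                quote = ""            # closing quote
            elif quote == "":
                quote = current       # opening quote
            continue
        elif quote:
            continue                  # inside quotes: never split
        if current in ("/", "-") and last_char in (" ", "\t", "\r", "\n"):
            if index > start:
                options.append(option_string[start:index - 1])
            start = index
        last_char = current
    options.append(option_string[start:])
    return options
```

For example, `/nologo /W4` splits into two options, while a quoted define such as `-DFOO="a b"` stays a single element.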
+## AutoGen class for platform
+#
+# The PlatformAutoGen class processes the original information in the platform
+# file in order to generate the makefile for the platform.
+#
+class PlatformAutoGen(AutoGen):
+ # call super().__init__ then call the worker function with different parameter count
+ def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
+ if not hasattr(self, "_Init"):
+ self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch)
+ self._Init = True
+ #
+ # Used to store all PCDs for both PEI and DXE phase, in order to generate
+ # correct PCD database
+ #
+ _DynaPcdList_ = []
+ _NonDynaPcdList_ = []
+ _PlatformPcds = {}
+
+
+
+ ## Initialize PlatformAutoGen
+ #
+ #
+ # @param Workspace WorkspaceAutoGen object
+ # @param PlatformFile Platform file (DSC file)
+ # @param Target Build target (DEBUG, RELEASE)
+ # @param Toolchain Name of tool chain
+ # @param Arch Arch that the platform supports
+ #
+ def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain, Arch):
+ EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s]" % (PlatformFile, Arch))
+ GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (PlatformFile, Arch, Toolchain, Target)
+
+ self.MetaFile = PlatformFile
+ self.Workspace = Workspace
+ self.WorkspaceDir = Workspace.WorkspaceDir
+ self.ToolChain = Toolchain
+ self.BuildTarget = Target
+ self.Arch = Arch
+ self.SourceDir = PlatformFile.SubDir
+ self.FdTargetList = self.Workspace.FdTargetList
+ self.FvTargetList = self.Workspace.FvTargetList
+ # get the original module/package/platform objects
+ self.BuildDatabase = Workspace.BuildDatabase
+ self.DscBuildDataObj = Workspace.Platform
+
+ # flag indicating if the makefile/C-code file has been created or not
+ self.IsMakeFileCreated = False
+
+ self._DynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
+ self._NonDynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
+
+ self._AsBuildInfList = []
+ self._AsBuildModuleList = []
+
+ self.VariableInfo = None
+
+ if GlobalData.gFdfParser is not None:
+ self._AsBuildInfList = GlobalData.gFdfParser.Profile.InfList
+ for Inf in self._AsBuildInfList:
+ InfClass = PathClass(NormPath(Inf), GlobalData.gWorkspace, self.Arch)
+ M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
+ if not M.IsBinaryModule:
+ continue
+ self._AsBuildModuleList.append(InfClass)
+ # get library/modules for build
+ self.LibraryBuildDirectoryList = []
+ self.ModuleBuildDirectoryList = []
+
+ self.DataPipe = MemoryDataPipe(self.BuildDir)
+ self.DataPipe.FillData(self)
+
+ return True
+ ## hash() operator of PlatformAutoGen
+ #
+ # The platform file path and arch string will be used to represent
+ # hash value of this object
+ #
+ # @retval int Hash value of the platform file path and arch
+ #
+ @cached_class_function
+ def __hash__(self):
+ return hash((self.MetaFile, self.Arch))
+ @cached_class_function
+ def __repr__(self):
+ return "%s [%s]" % (self.MetaFile, self.Arch)
+
+ ## Create autogen code for platform and modules
+ #
+ # Since there's no autogen code for platform, this method will do nothing
+ # if CreateModuleCodeFile is set to False.
+ #
+ # @param CreateModuleCodeFile Flag indicating if creating module's
+ # autogen code file or not
+ #
+ @cached_class_function
+ def CreateCodeFile(self, CreateModuleCodeFile=False):
+ # only modules have code to be created, so do nothing if CreateModuleCodeFile is False
+ if not CreateModuleCodeFile:
+ return
+
+ for Ma in self.ModuleAutoGenList:
+ Ma.CreateCodeFile(True)
+
+ ## Generate Fds Command
+ @cached_property
+ def GenFdsCommand(self):
+ return self.Workspace.GenFdsCommand
+
+ ## Create makefile for the platform and modules in it
+ #
+ # @param CreateModuleMakeFile Flag indicating if the makefile for
+ # modules will be created as well
+ #
+ def CreateMakeFile(self, CreateModuleMakeFile=False, FfsCommand = {}):
+ if CreateModuleMakeFile:
+ for Ma in self._MaList:
+ key = (Ma.MetaFile.File, self.Arch)
+ if key in FfsCommand:
+ Ma.CreateMakeFile(True, FfsCommand[key])
+ else:
+ Ma.CreateMakeFile(True)
+
+ # no need to create makefile for the platform more than once
+ if self.IsMakeFileCreated:
+ return
+
+ # create library/module build dirs for platform
+ Makefile = GenMake.PlatformMakefile(self)
+ self.LibraryBuildDirectoryList = Makefile.GetLibraryBuildDirectoryList()
+ self.ModuleBuildDirectoryList = Makefile.GetModuleBuildDirectoryList()
+
+ self.IsMakeFileCreated = True
+
+ @property
+ def AllPcdList(self):
+ return self.DynamicPcdList + self.NonDynamicPcdList
+ ## Deal with Shared FixedAtBuild Pcds
+ #
+ def CollectFixedAtBuildPcds(self):
+ for LibAuto in self.LibraryAutoGenList:
+ FixedAtBuildPcds = {}
+ ShareFixedAtBuildPcdsSameValue = {}
+ for Module in LibAuto.ReferenceModules:
+ for Pcd in set(Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds):
+ DefaultValue = Pcd.DefaultValue
+ # Cover the case: a DSC component overrides the Pcd value and the Pcd is only used in one library
+ if Pcd in Module.LibraryPcdList:
+ Index = Module.LibraryPcdList.index(Pcd)
+ DefaultValue = Module.LibraryPcdList[Index].DefaultValue
+ key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+ if key not in FixedAtBuildPcds:
+ ShareFixedAtBuildPcdsSameValue[key] = True
+ FixedAtBuildPcds[key] = DefaultValue
+ else:
+ if FixedAtBuildPcds[key] != DefaultValue:
+ ShareFixedAtBuildPcdsSameValue[key] = False
+ for Pcd in LibAuto.FixedAtBuildPcds:
+ key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+ if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
+ continue
+ else:
+ DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
+ if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
+ continue
+ if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
+ LibAuto.ConstPcd[key] = FixedAtBuildPcds[key]
+
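The invariant that `CollectFixedAtBuildPcds` above enforces is easy to lose in the nested loops, so here is a simplified standalone restatement (not part of the patch; names are illustrative): a library's FixedAtBuild PCD may be folded into a constant only when every referencing module resolves it to the same value.

```python
# Simplified sketch of the shared-value test in CollectFixedAtBuildPcds:
# keep only the PCDs whose value is identical across every referencing module.
def shared_pcd_values(values_per_module):
    first_seen = {}
    same_value = {}
    for values in values_per_module:
        for key, value in values.items():
            if key not in first_seen:
                first_seen[key] = value
                same_value[key] = True
            elif first_seen[key] != value:
                same_value[key] = False    # a DSC override diverged this module
    return {k: first_seen[k] for k, ok in same_value.items() if ok}
```

A PCD overridden for one component (as patch 11/11's fix for shared fixed PCDs between module and lib addresses) drops out of the constant set.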
+ def CollectVariables(self, DynamicPcdSet):
+ VpdRegionSize = 0
+ VpdRegionBase = 0
+ if self.Workspace.FdfFile:
+ FdDict = self.Workspace.FdfProfile.FdDict[GlobalData.gFdfParser.CurrentFdName]
+ for FdRegion in FdDict.RegionList:
+ for item in FdRegion.RegionDataList:
+ if self.Platform.VpdToolGuid.strip() and self.Platform.VpdToolGuid in item:
+ VpdRegionSize = FdRegion.Size
+ VpdRegionBase = FdRegion.Offset
+ break
+
+ VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj.SkuIds)
+ VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
+ VariableInfo.SetVpdRegionOffset(VpdRegionBase)
+ Index = 0
+ for Pcd in DynamicPcdSet:
+ pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+ for SkuName in Pcd.SkuInfoList:
+ Sku = Pcd.SkuInfoList[SkuName]
+ SkuId = Sku.SkuId
+ if SkuId is None or SkuId == '':
+ continue
+ if len(Sku.VariableName) > 0:
+ if Sku.VariableAttribute and 'NV' not in Sku.VariableAttribute:
+ continue
+ VariableGuidStructure = Sku.VariableGuidValue
+ VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
+ for StorageName in Sku.DefaultStoreDict:
+ VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else StringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAttribute['DscPosition'], Pcd.CustomAttribute.get('IsStru',False)))
+ Index += 1
+ return VariableInfo
+
+ def UpdateNVStoreMaxSize(self, OrgVpdFile):
+ if self.VariableInfo:
+ VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
+ PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
+
+ if PcdNvStoreDfBuffer:
+ if os.path.exists(VpdMapFilePath):
+ OrgVpdFile.Read(VpdMapFilePath)
+ PcdItems = OrgVpdFile.GetOffset(PcdNvStoreDfBuffer[0])
+ NvStoreOffset = list(PcdItems.values())[0].strip() if PcdItems else '0'
+ else:
+ EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
+
+ NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
+ default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
+ maxsize = self.VariableInfo.VpdRegionSize - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
+ var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
+
+ if var_data and default_skuobj:
+ default_skuobj.DefaultValue = var_data
+ PcdNvStoreDfBuffer[0].DefaultValue = var_data
+ PcdNvStoreDfBuffer[0].SkuInfoList.clear()
+ PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = default_skuobj
+ PcdNvStoreDfBuffer[0].MaxDatumSize = str(len(default_skuobj.DefaultValue.split(",")))
+
+ return OrgVpdFile
+
+ ## Collect dynamic PCDs
+ #
+ # Gather the dynamic PCD list from each module and their settings from the platform.
+ # This interface should be invoked explicitly when platform action is created.
+ #
+ def CollectPlatformDynamicPcds(self):
+ self.CategoryPcds()
+ self.SortDynamicPcd()
+
+ def CategoryPcds(self):
+ # Categorize Pcds into DynamicPcds and NonDynamicPcds
+ # for gathering error information
+ NoDatumTypePcdList = set()
+ FdfModuleList = []
+ for InfName in self._AsBuildInfList:
+ InfName = mws.join(self.WorkspaceDir, InfName)
+ FdfModuleList.append(os.path.normpath(InfName))
+ for M in self._MbList:
+ # M is the module build data object
+ ModPcdList = self.ApplyPcdSetting(M, M.ModulePcdList)
+ LibPcdList = []
+ for lib in M.LibraryPcdList:
+ LibPcdList.extend(self.ApplyPcdSetting(M, M.LibraryPcdList[lib], lib))
+ for PcdFromModule in ModPcdList + LibPcdList:
+
+ # make sure that the "VOID*" kind of datum has MaxDatumSize set
+ if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
+ NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile))
+
+ # Check the PCD from Binary INF or Source INF
+ if M.IsBinaryModule == True:
+ PcdFromModule.IsFromBinaryInf = True
+
+ # Check the PCD from DSC or not
+ PcdFromModule.IsFromDsc = (PcdFromModule.TokenCName, PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds
+
+ if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ if M.MetaFile.Path not in FdfModuleList:
+ # If one of the source-built modules listed in the DSC is not listed
+ # in the FDF modules, and the INF lists a PCD that can only use the
+ # PcdsDynamic access method (it is only listed in the DEC file that
+ # declares the PCD as PcdsDynamic), then the build tool reports a
+ # warning message to notify the platform integrator that they are
+ # attempting to build a module that must be included in a flash image
+ # in order to be functional. These dynamic PCDs will not be added into
+ # the database unless they are used by other modules that are included
+ # in the FDF file.
+ if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET and \
+ PcdFromModule.IsFromBinaryInf == False:
+ # Print a warning message to let the developer make a determination.
+ continue
+ # If one of the source-built modules listed in the DSC is not listed in
+ # the FDF modules, and the INF lists a PCD that can only use the PcdsDynamicEx
+ # access method (it is only listed in the DEC file that declares the
+ # PCD as PcdsDynamicEx), then DO NOT break the build; DO NOT add the
+ # PCD to the Platform's PCD Database.
+ if PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ continue
+ #
+ # If a dynamic PCD is used by both a PEIM/PEI module and a DXE module,
+ # it should be stored in the PEI PCD database; if a dynamic PCD is only
+ # used by DXE modules, it should be stored in the DXE PCD database.
+ # The default Phase is DXE.
+ #
+ if M.ModuleType in SUP_MODULE_SET_PEI:
+ PcdFromModule.Phase = "PEI"
+ if PcdFromModule not in self._DynaPcdList_:
+ self._DynaPcdList_.append(PcdFromModule)
+ elif PcdFromModule.Phase == 'PEI':
+ # overwrite any existing identical PCD, if the Phase is PEI
+ Index = self._DynaPcdList_.index(PcdFromModule)
+ self._DynaPcdList_[Index] = PcdFromModule
+ elif PcdFromModule not in self._NonDynaPcdList_:
+ self._NonDynaPcdList_.append(PcdFromModule)
+ elif PcdFromModule in self._NonDynaPcdList_ and PcdFromModule.IsFromBinaryInf == True:
+ Index = self._NonDynaPcdList_.index(PcdFromModule)
+ if self._NonDynaPcdList_[Index].IsFromBinaryInf == False:
+ #The PCD from Binary INF will override the same one from source INF
+ self._NonDynaPcdList_.remove (self._NonDynaPcdList_[Index])
+ PcdFromModule.Pending = False
+ self._NonDynaPcdList_.append (PcdFromModule)
+ DscModuleSet = {os.path.normpath(ModuleInf.Path) for ModuleInf in self.Platform.Modules}
+ # add the PCD from modules that listed in FDF but not in DSC to Database
+ for InfName in FdfModuleList:
+ if InfName not in DscModuleSet:
+ InfClass = PathClass(InfName)
+ M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
+ # If a module INF is in the FDF but not in the current arch's DSC module list, it must be a module (either binary or source)
+ # for a different Arch. PCDs in a source module for a different Arch were already added before, so skip the source module here.
+ # For a binary module in the current arch, we need to list its PCDs in the database.
+ if not M.IsBinaryModule:
+ continue
+ # Override the module PCD setting by platform setting
+ ModulePcdList = self.ApplyPcdSetting(M, M.Pcds)
+ for PcdFromModule in ModulePcdList:
+ PcdFromModule.IsFromBinaryInf = True
+ PcdFromModule.IsFromDsc = False
+ # Only DynamicEx and Patchable PCDs are allowed in an AsBuild INF
+ if PcdFromModule.Type not in PCD_DYNAMIC_EX_TYPE_SET and PcdFromModule.Type not in TAB_PCDS_PATCHABLE_IN_MODULE:
+ EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
+ File=self.MetaFile,
+ ExtraData="\n\tExisted %s PCD %s in:\n\t\t%s\n"
+ % (PcdFromModule.Type, PcdFromModule.TokenCName, InfName))
+ # make sure that the "VOID*" kind of datum has MaxDatumSize set
+ if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
+ NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName))
+ if M.ModuleType in SUP_MODULE_SET_PEI:
+ PcdFromModule.Phase = "PEI"
+ if PcdFromModule not in self._DynaPcdList_ and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ self._DynaPcdList_.append(PcdFromModule)
+ elif PcdFromModule not in self._NonDynaPcdList_ and PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE:
+ self._NonDynaPcdList_.append(PcdFromModule)
+ if PcdFromModule in self._DynaPcdList_ and PcdFromModule.Phase == 'PEI' and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ # Overwrite the phase of any existing identical PCD, if the Phase is PEI.
+ # This solves the case of a dynamic PCD used by a PEIM/PEI module
+ # and a DXE module at the same time.
+ # Overwrite the type of the PCDs in the source INF with the type from
+ # the AsBuild INF file, i.e. DynamicEx.
+ Index = self._DynaPcdList_.index(PcdFromModule)
+ self._DynaPcdList_[Index].Phase = PcdFromModule.Phase
+ self._DynaPcdList_[Index].Type = PcdFromModule.Type
+ for PcdFromModule in self._NonDynaPcdList_:
+ # If a PCD is not listed in the DSC file, but all binary INF files used
+ # by this platform that use this PCD list it in a [PatchPcds] section,
+ # AND all source INF files used by this platform that use the PCD list
+ # it in either a [Pcds] or [PatchPcds] section, then the tools must NOT
+ # add the PCD to the Platform's PCD Database; the build must assign the
+ # access method for this PCD as PcdsPatchableInModule.
+ if PcdFromModule not in self._DynaPcdList_:
+ continue
+ Index = self._DynaPcdList_.index(PcdFromModule)
+ if PcdFromModule.IsFromDsc == False and \
+ PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE and \
+ PcdFromModule.IsFromBinaryInf == True and \
+ self._DynaPcdList_[Index].IsFromBinaryInf == False:
+ Index = self._DynaPcdList_.index(PcdFromModule)
+ self._DynaPcdList_.remove (self._DynaPcdList_[Index])
+
+ # print out error information and break the build, if error found
+ if len(NoDatumTypePcdList) > 0:
+ NoDatumTypePcdListString = "\n\t\t".join(NoDatumTypePcdList)
+ EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
+ File=self.MetaFile,
+ ExtraData="\n\tPCD(s) without MaxDatumSize:\n\t\t%s\n"
+ % NoDatumTypePcdListString)
+ self._NonDynamicPcdList = self._NonDynaPcdList_
+ self._DynamicPcdList = self._DynaPcdList_
+
+ def SortDynamicPcd(self):
+ #
+ # Sort the dynamic PCD list so that:
+ # 1) If a PCD's datum type is VOID* and its value is a unicode string starting with L, the PCD item is
+ # placed at the head of the dynamic list
+ # 2) If a PCD is of HII type, the PCD item is placed after the unicode type PCDs
+ #
+ # The reason for sorting is to make sure the unicode strings are double-byte aligned in the string table.
+ #
+ UnicodePcdArray = set()
+ HiiPcdArray = set()
+ OtherPcdArray = set()
+ VpdPcdDict = {}
+ VpdFile = VpdInfoFile.VpdInfoFile()
+ NeedProcessVpdMapFile = False
+
+ for pcd in self.Platform.Pcds:
+ if pcd not in self._PlatformPcds:
+ self._PlatformPcds[pcd] = self.Platform.Pcds[pcd]
+
+ for item in self._PlatformPcds:
+ if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+ self._PlatformPcds[item].DatumType = TAB_VOID
+
+ if (self.Workspace.ArchList[-1] == self.Arch):
+ for Pcd in self._DynamicPcdList:
+ # just pick a value to determine whether it is a unicode string type
+ Sku = Pcd.SkuInfoList.get(TAB_DEFAULT)
+ Sku.VpdOffset = Sku.VpdOffset.strip()
+
+ if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+ Pcd.DatumType = TAB_VOID
+
+ # if a PCD whose datum value is a unicode string is found, insert it to the left of UnicodeIndex
+ # if an HII type PCD is found, insert it to the right of UnicodeIndex
+ if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+ VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
+
+ # Collect DynamicHii PCD values and assign them to the DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
+ PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
+ if PcdNvStoreDfBuffer:
+ self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
+ vardump = self.VariableInfo.dump()
+ if vardump:
+ #
+ #According to PCD_DATABASE_INIT in edk2\MdeModulePkg\Include\Guid\PcdDataBaseSignatureGuid.h,
+ #the max size for string PCD should not exceed USHRT_MAX 65535(0xffff).
+ #typedef UINT16 SIZE_INFO;
+ #//SIZE_INFO SizeTable[];
+ if len(vardump.split(",")) > 0xffff:
+ EdkLogger.error("build", RESOURCE_OVERFLOW, 'The current length of PCD %s value is %d, it exceeds to the max size of String PCD.' %(".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName,PcdNvStoreDfBuffer.TokenCName]) ,len(vardump.split(","))))
+ PcdNvStoreDfBuffer.DefaultValue = vardump
+ for skuname in PcdNvStoreDfBuffer.SkuInfoList:
+ PcdNvStoreDfBuffer.SkuInfoList[skuname].DefaultValue = vardump
+ PcdNvStoreDfBuffer.MaxDatumSize = str(len(vardump.split(",")))
+ else:
+ # If the end user defines [DefaultStores] and [XXX.Manufacturing] in the DSC, but forgets to configure PcdNvStoreDefaultValueBuffer as PcdsDynamicVpd
+ if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDefinedDefaultStoresFlag]:
+ EdkLogger.warn("build", "PcdNvStoreDefaultValueBuffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultStores is enabled for this platform.\n%s" %self.Platform.MetaFile.Path)
+ PlatformPcds = sorted(self._PlatformPcds.keys())
+ #
+ # Add VPD type PCDs into VpdFile and determine whether the VPD PCDs need to be fixed up.
+ #
+ VpdSkuMap = {}
+ for PcdKey in PlatformPcds:
+ Pcd = self._PlatformPcds[PcdKey]
+ if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD] and \
+ PcdKey in VpdPcdDict:
+ Pcd = VpdPcdDict[PcdKey]
+ SkuValueMap = {}
+ DefaultSku = Pcd.SkuInfoList.get(TAB_DEFAULT)
+ if DefaultSku:
+ PcdValue = DefaultSku.DefaultValue
+ if PcdValue not in SkuValueMap:
+ SkuValueMap[PcdValue] = []
+ VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.VpdOffset)
+ SkuValueMap[PcdValue].append(DefaultSku)
+
+ for (SkuName, Sku) in Pcd.SkuInfoList.items():
+ Sku.VpdOffset = Sku.VpdOffset.strip()
+ PcdValue = Sku.DefaultValue
+ if PcdValue == "":
+ PcdValue = Pcd.DefaultValue
+ if Sku.VpdOffset != TAB_STAR:
+ if PcdValue.startswith("{"):
+ Alignment = 8
+ elif PcdValue.startswith("L"):
+ Alignment = 2
+ else:
+ Alignment = 1
+ try:
+ VpdOffset = int(Sku.VpdOffset)
+ except:
+ try:
+ VpdOffset = int(Sku.VpdOffset, 16)
+ except:
+ EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+ if VpdOffset % Alignment != 0:
+ if PcdValue.startswith("{"):
+ EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName), File=self.MetaFile)
+ else:
+ EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
+ if PcdValue not in SkuValueMap:
+ SkuValueMap[PcdValue] = []
+ VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
+ SkuValueMap[PcdValue].append(Sku)
+ # if the offset of a VPD is *, then it needs to be fixed up by a third-party tool.
+ if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
+ NeedProcessVpdMapFile = True
+ if self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == '':
+ EdkLogger.error("Build", FILE_NOT_FOUND, \
+ "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
+
+ VpdSkuMap[PcdKey] = SkuValueMap
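The alignment and offset-parsing rules applied in the loop above can be sketched standalone. This is only an illustration of the logic, not the BaseTools API; both function names are hypothetical:

```python
def vpd_alignment(pcd_value: str) -> int:
    """Required offset alignment for a VPD PCD value: byte arrays
    ("{...}") need 8-byte alignment, unicode strings (L"...") need
    2-byte alignment, and everything else 1-byte."""
    if pcd_value.startswith("{"):
        return 8
    if pcd_value.startswith("L"):
        return 2
    return 1

def parse_vpd_offset(text: str) -> int:
    """Parse an offset that may be decimal or hex, mirroring the
    nested try/except above (plain int() first, then base 16)."""
    try:
        return int(text)
    except ValueError:
        return int(text, 16)
```

An offset of `TAB_STAR` ("*") skips both checks entirely and is later resolved by the BPDG tool.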
+ #
+ # Fix up PCDs defined in the VPD PCD section but never referenced by any module.
+ # An example is a PCD used for signature storage.
+ #
+ for DscPcd in PlatformPcds:
+ DscPcdEntry = self._PlatformPcds[DscPcd]
+ if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+ if not (self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == ''):
+ FoundFlag = False
+ for VpdPcd in VpdFile._VpdArray:
+ # This PCD has been referenced by module
+ if (VpdPcd.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
+ (VpdPcd.TokenCName == DscPcdEntry.TokenCName):
+ FoundFlag = True
+
+ # Not found; it should be a signature PCD.
+ if not FoundFlag:
+ # Just pick a value to determine whether it is a unicode string type.
+ SkuValueMap = {}
+ SkuObjList = list(DscPcdEntry.SkuInfoList.items())
+ DefaultSku = DscPcdEntry.SkuInfoList.get(TAB_DEFAULT)
+ if DefaultSku:
+ defaultindex = SkuObjList.index((TAB_DEFAULT, DefaultSku))
+ SkuObjList[0], SkuObjList[defaultindex] = SkuObjList[defaultindex], SkuObjList[0]
+ for (SkuName, Sku) in SkuObjList:
+ Sku.VpdOffset = Sku.VpdOffset.strip()
+
+ # Iterate the DEC PCD information to get the value and datum type.
+ for eachDec in self.PackageList:
+ for DecPcd in eachDec.Pcds:
+ DecPcdEntry = eachDec.Pcds[DecPcd]
+ if (DecPcdEntry.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
+ (DecPcdEntry.TokenCName == DscPcdEntry.TokenCName):
+ # Print a warning message so the developer can make a determination.
+ EdkLogger.warn("build", "Unreferenced vpd pcd used!",
+ File=self.MetaFile, \
+ ExtraData = "PCD: %s.%s used in the DSC file %s is unreferenced." \
+ %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
+
+ DscPcdEntry.DatumType = DecPcdEntry.DatumType
+ DscPcdEntry.DefaultValue = DecPcdEntry.DefaultValue
+ DscPcdEntry.TokenValue = DecPcdEntry.TokenValue
+ DscPcdEntry.TokenSpaceGuidValue = eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName]
+ # Only fix up the value when no value is provided in the DSC file.
+ if not Sku.DefaultValue:
+ DscPcdEntry.SkuInfoList[list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue = DecPcdEntry.DefaultValue
+
+ if DscPcdEntry not in self._DynamicPcdList:
+ self._DynamicPcdList.append(DscPcdEntry)
+ Sku.VpdOffset = Sku.VpdOffset.strip()
+ PcdValue = Sku.DefaultValue
+ if PcdValue == "":
+ PcdValue = DscPcdEntry.DefaultValue
+ if Sku.VpdOffset != TAB_STAR:
+ if PcdValue.startswith("{"):
+ Alignment = 8
+ elif PcdValue.startswith("L"):
+ Alignment = 2
+ else:
+ Alignment = 1
+ try:
+ VpdOffset = int(Sku.VpdOffset)
+ except:
+ try:
+ VpdOffset = int(Sku.VpdOffset, 16)
+ except:
+ EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName))
+ if VpdOffset % Alignment != 0:
+ if PcdValue.startswith("{"):
+ EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName), File=self.MetaFile)
+ else:
+ EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
+ if PcdValue not in SkuValueMap:
+ SkuValueMap[PcdValue] = []
+ VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
+ SkuValueMap[PcdValue].append(Sku)
+ if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
+ NeedProcessVpdMapFile = True
+ if DscPcdEntry.DatumType == TAB_VOID and PcdValue.startswith("L"):
+ UnicodePcdArray.add(DscPcdEntry)
+ elif len(Sku.VariableName) > 0:
+ HiiPcdArray.add(DscPcdEntry)
+ else:
+ OtherPcdArray.add(DscPcdEntry)
+
+ VpdSkuMap[DscPcd] = SkuValueMap
+ if (self.Platform.FlashDefinition is None or self.Platform.FlashDefinition == '') and \
+ VpdFile.GetCount() != 0:
+ EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
+ "Fail to get FLASH_DEFINITION definition in DSC file %s which is required when DSC contains VPD PCD." % str(self.Platform.MetaFile))
+
+ if VpdFile.GetCount() != 0:
+
+ self.FixVpdOffset(VpdFile)
+
+ self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile))
+ PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
+ if PcdNvStoreDfBuffer:
+ PcdName,PcdGuid = PcdNvStoreDfBuffer[0].TokenCName, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName
+ if (PcdName,PcdGuid) in VpdSkuMap:
+ DefaultSku = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
+ VpdSkuMap[(PcdName,PcdGuid)] = {DefaultSku.DefaultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values() ]}
+
+ # Process VPD map file generated by third party BPDG tool
+ if NeedProcessVpdMapFile:
+ VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
+ if os.path.exists(VpdMapFilePath):
+ VpdFile.Read(VpdMapFilePath)
+
+ # Fixup TAB_STAR offset
+ for pcd in VpdSkuMap:
+ vpdinfo = VpdFile.GetVpdInfo(pcd)
+ if vpdinfo is None:
+ # This PCD has no VPD info (it was not placed in the VPD file); skip it.
+ continue
+ for pcdvalue in VpdSkuMap[pcd]:
+ for sku in VpdSkuMap[pcd][pcdvalue]:
+ for item in vpdinfo:
+ if item[2] == pcdvalue:
+ sku.VpdOffset = item[1]
+ else:
+ EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
+
+ # Delete and rebuild the DynamicPcdList the last time this function is entered.
+ for Pcd in self._DynamicPcdList:
+ # Just pick a value to determine whether it is a unicode string type.
+ Sku = Pcd.SkuInfoList.get(TAB_DEFAULT)
+ Sku.VpdOffset = Sku.VpdOffset.strip()
+
+ if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+ Pcd.DatumType = TAB_VOID
+
+ PcdValue = Sku.DefaultValue
+ if Pcd.DatumType == TAB_VOID and PcdValue.startswith("L"):
+ # If the PCD's datum value is a unicode string, insert it to the left of UnicodeIndex.
+ UnicodePcdArray.add(Pcd)
+ elif len(Sku.VariableName) > 0:
+ # If it is an HII type PCD, insert it to the right of UnicodeIndex.
+ HiiPcdArray.add(Pcd)
+ else:
+ OtherPcdArray.add(Pcd)
+ del self._DynamicPcdList[:]
+ self._DynamicPcdList.extend(list(UnicodePcdArray))
+ self._DynamicPcdList.extend(list(HiiPcdArray))
+ self._DynamicPcdList.extend(list(OtherPcdArray))
+ allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
+ for pcd in self._DynamicPcdList:
+ if len(pcd.SkuInfoList) == 1:
+ for (SkuName, SkuId) in allskuset:
+ if isinstance(SkuId, str) and eval(SkuId) == 0 or SkuId == 0:
+ continue
+ pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
+ pcd.SkuInfoList[SkuName].SkuId = SkuId
+ pcd.SkuInfoList[SkuName].SkuIdName = SkuName
+
+ def FixVpdOffset(self, VpdFile ):
+ FvPath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
+ if not os.path.exists(FvPath):
+ try:
+ os.makedirs(FvPath)
+ except:
+ EdkLogger.error("build", FILE_WRITE_FAILURE, "Fail to create FV folder under %s" % self.BuildDir)
+
+ VpdFilePath = os.path.join(FvPath, "%s.txt" % self.Platform.VpdToolGuid)
+
+ if VpdFile.Write(VpdFilePath):
+ # Retrieve the BPDG tool's path from tools_def.txt according to the VPD_TOOL_GUID defined in the DSC file.
+ BPDGToolName = None
+ for ToolDef in self.ToolDefinition.values():
+ if TAB_GUID in ToolDef and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
+ if "PATH" not in ToolDef:
+ EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
+ BPDGToolName = ToolDef["PATH"]
+ break
+ # Call the third-party BPDG GUID tool.
+ if BPDGToolName is not None:
+ VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFilePath)
+ else:
+ EdkLogger.error("Build", FILE_NOT_FOUND, "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
+
+ ## Return the platform build data object
+ @cached_property
+ def Platform(self):
+ return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
+
+ ## Return platform name
+ @cached_property
+ def Name(self):
+ return self.Platform.PlatformName
+
+ ## Return the meta file GUID
+ @cached_property
+ def Guid(self):
+ return self.Platform.Guid
+
+ ## Return the platform version
+ @cached_property
+ def Version(self):
+ return self.Platform.Version
+
+ ## Return the FDF file name
+ @cached_property
+ def FdfFile(self):
+ if self.Workspace.FdfFile:
+ RetVal= mws.join(self.WorkspaceDir, self.Workspace.FdfFile)
+ else:
+ RetVal = ''
+ return RetVal
+
+ ## Return the build output directory the platform specifies
+ @cached_property
+ def OutputDir(self):
+ return self.Platform.OutputDirectory
+
+ ## Return the directory that stores all intermediate and final build files
+ @cached_property
+ def BuildDir(self):
+ if os.path.isabs(self.OutputDir):
+ GlobalData.gBuildDirectory = RetVal = path.join(
+ path.abspath(self.OutputDir),
+ self.BuildTarget + "_" + self.ToolChain,
+ )
+ else:
+ GlobalData.gBuildDirectory = RetVal = path.join(
+ self.WorkspaceDir,
+ self.OutputDir,
+ self.BuildTarget + "_" + self.ToolChain,
+ )
+ return RetVal
+
+ ## Return directory of platform makefile
+ #
+ # @retval string Makefile directory
+ #
+ @cached_property
+ def MakeFileDir(self):
+ return path.join(self.BuildDir, self.Arch)
+
+ ## Return build command string
+ #
+ # @retval string Build command string
+ #
+ @cached_property
+ def BuildCommand(self):
+ RetVal = []
+ if "MAKE" in self.ToolDefinition and "PATH" in self.ToolDefinition["MAKE"]:
+ RetVal += _SplitOption(self.ToolDefinition["MAKE"]["PATH"])
+ if "FLAGS" in self.ToolDefinition["MAKE"]:
+ NewOption = self.ToolDefinition["MAKE"]["FLAGS"].strip()
+ if NewOption != '':
+ RetVal += _SplitOption(NewOption)
+ if "MAKE" in self.EdkIIBuildOption:
+ if "FLAGS" in self.EdkIIBuildOption["MAKE"]:
+ Flags = self.EdkIIBuildOption["MAKE"]["FLAGS"]
+ if Flags.startswith('='):
+ RetVal = [RetVal[0]] + [Flags[1:]]
+ else:
+ RetVal.append(Flags)
+ return RetVal
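The MAKE command assembly above follows a simple rule: the FLAGS override from the DSC's BuildOptions either replaces all tools_def flags (when prefixed with '=', keeping only the make path) or is appended. A minimal sketch of that rule (hypothetical helper, not the BaseTools API):

```python
def assemble_build_command(make_path, tooldef_flags, override=None):
    """Build the make invocation: start with the MAKE path and its
    tools_def FLAGS; an override starting with '=' replaces every
    flag (keeping only the make path), otherwise it is appended."""
    cmd = [make_path] + tooldef_flags.split()
    if override:
        if override.startswith('='):
            cmd = [cmd[0], override[1:]]  # hard override: path + new flags only
        else:
            cmd.append(override)          # soft override: append
    return cmd
```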
+
+ ## Get tool chain definition
+ #
+ # Get each tool definition for given tool chain from tools_def.txt and platform
+ #
+ @cached_property
+ def ToolDefinition(self):
+ ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDictionary
+ if TAB_TOD_DEFINES_COMMAND_TYPE not in self.Workspace.ToolDef.ToolsDefTxtDatabase:
+ EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No tools found in configuration",
+ ExtraData="[%s]" % self.MetaFile)
+ RetVal = OrderedDict()
+ DllPathList = set()
+ for Def in ToolDefinition:
+ Target, Tag, Arch, Tool, Attr = Def.split("_")
+ if Target != self.BuildTarget or Tag != self.ToolChain or Arch != self.Arch:
+ continue
+
+ Value = ToolDefinition[Def]
+ # don't record the DLL
+ if Attr == "DLL":
+ DllPathList.add(Value)
+ continue
+
+ if Tool not in RetVal:
+ RetVal[Tool] = OrderedDict()
+ RetVal[Tool][Attr] = Value
+
+ ToolsDef = ''
+ if GlobalData.gOptions.SilentMode and "MAKE" in RetVal:
+ if "FLAGS" not in RetVal["MAKE"]:
+ RetVal["MAKE"]["FLAGS"] = ""
+ RetVal["MAKE"]["FLAGS"] += " -s"
+ MakeFlags = ''
+ for Tool in RetVal:
+ for Attr in RetVal[Tool]:
+ Value = RetVal[Tool][Attr]
+ if Tool in self._BuildOptionWithToolDef(RetVal) and Attr in self._BuildOptionWithToolDef(RetVal)[Tool]:
+ # check if override is indicated
+ if self._BuildOptionWithToolDef(RetVal)[Tool][Attr].startswith('='):
+ Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr][1:]
+ else:
+ if Attr != 'PATH':
+ Value += " " + self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
+ else:
+ Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
+
+ if Attr == "PATH":
+ # Don't put MAKE definition in the file
+ if Tool != "MAKE":
+ ToolsDef += "%s = %s\n" % (Tool, Value)
+ elif Attr != "DLL":
+ # Don't put MAKE definition in the file
+ if Tool == "MAKE":
+ if Attr == "FLAGS":
+ MakeFlags = Value
+ else:
+ ToolsDef += "%s_%s = %s\n" % (Tool, Attr, Value)
+ ToolsDef += "\n"
+
+ tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
+ SaveFileOnChange(tool_def_file, ToolsDef, False)
+ for DllPath in DllPathList:
+ os.environ["PATH"] = DllPath + os.pathsep + os.environ["PATH"]
+ os.environ["MAKE_FLAGS"] = MakeFlags
+
+ return RetVal
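The TOOLS_DEF.&lt;Arch&gt; emission above writes each tool's PATH as `TOOL = value` and every other attribute as `TOOL_ATTR = value`, while keeping MAKE definitions and DLL attributes out of the file. A simplified sketch of that rendering (hypothetical function; the real code also applies build-option overrides first):

```python
def render_tools_def(tools):
    """tools: {tool: {attr: value}}. PATH lines become 'TOOL = value',
    other attributes become 'TOOL_ATTR = value'; MAKE definitions and
    DLL attributes are excluded from the file."""
    lines = []
    for tool, attrs in tools.items():
        for attr, value in attrs.items():
            if tool == "MAKE" or attr == "DLL":
                continue  # MAKE goes to MAKE_FLAGS env; DLL only extends PATH
            if attr == "PATH":
                lines.append("%s = %s" % (tool, value))
            else:
                lines.append("%s_%s = %s" % (tool, attr, value))
    return "\n".join(lines) + "\n"
```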
+
+ ## Return the paths of tools
+ @cached_property
+ def ToolDefinitionFile(self):
+ tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
+ if not os.path.exists(tool_def_file):
+ # Evaluating ToolDefinition writes the file as a side effect.
+ self.ToolDefinition
+ return tool_def_file
+
+ ## Retrieve the toolchain family of the given toolchain tag. Defaults to 'MSFT'.
+ @cached_property
+ def ToolChainFamily(self):
+ ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
+ if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \
+ or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
+ or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]:
+ EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
+ % self.ToolChain)
+ RetVal = TAB_COMPILER_MSFT
+ else:
+ RetVal = ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]
+ return RetVal
+
+ @cached_property
+ def BuildRuleFamily(self):
+ ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
+ if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \
+ or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY] \
+ or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]:
+ EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
+ % self.ToolChain)
+ return TAB_COMPILER_MSFT
+
+ return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]
+
+ ## Return the build options specific for all modules in this platform
+ @cached_property
+ def BuildOption(self):
+ return self._ExpandBuildOption(self.Platform.BuildOptions)
+
+ def _BuildOptionWithToolDef(self, ToolDef):
+ return self._ExpandBuildOption(self.Platform.BuildOptions, ToolDef=ToolDef)
+
+ ## Return the build options specific for EDK modules in this platform
+ @cached_property
+ def EdkBuildOption(self):
+ return self._ExpandBuildOption(self.Platform.BuildOptions, EDK_NAME)
+
+ ## Return the build options specific for EDKII modules in this platform
+ @cached_property
+ def EdkIIBuildOption(self):
+ return self._ExpandBuildOption(self.Platform.BuildOptions, EDKII_NAME)
+
+ ## Parse build_rule.txt in Conf Directory.
+ #
+ # @retval BuildRule object
+ #
+ @cached_property
+ def BuildRule(self):
+ BuildRuleFile = None
+ if TAB_TAT_DEFINES_BUILD_RULE_CONF in self.Workspace.TargetTxt.TargetTxtDictionary:
+ BuildRuleFile = self.Workspace.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_BUILD_RULE_CONF]
+ if not BuildRuleFile:
+ BuildRuleFile = gDefaultBuildRuleFile
+ RetVal = BuildRule(BuildRuleFile)
+ if RetVal._FileVersion == "":
+ RetVal._FileVersion = AutoGenReqBuildRuleVerNum
+ else:
+ if RetVal._FileVersion < AutoGenReqBuildRuleVerNum:
+ # If the build rule's version is less than the version required by the tools, halt the build.
+ EdkLogger.error("build", AUTOGEN_ERROR,
+ ExtraData="The version number [%s] of build_rule.txt is less than the version number required by AutoGen (the minimum required version is [%s])"\
+ % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
+ return RetVal
+
+ ## Summarize the packages used by modules in this platform
+ @cached_property
+ def PackageList(self):
+ RetVal = set()
+ for Mb in self._MbList:
+ RetVal.update(Mb.Packages)
+ for lb in Mb.LibInstances:
+ RetVal.update(lb.Packages)
+ # Collect package set information from INFs listed in the FDF.
+ for ModuleFile in self._AsBuildModuleList:
+ if ModuleFile in self.Platform.Modules:
+ continue
+ ModuleData = self.BuildDatabase[ModuleFile, self.Arch, self.BuildTarget, self.ToolChain]
+ RetVal.update(ModuleData.Packages)
+ return list(RetVal)
+
+ @cached_property
+ def NonDynamicPcdDict(self):
+ return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd in self.NonDynamicPcdList}
+
+ ## Get list of non-dynamic PCDs
+ @property
+ def NonDynamicPcdList(self):
+ if not self._NonDynamicPcdList:
+ self.CollectPlatformDynamicPcds()
+ return self._NonDynamicPcdList
+
+ ## Get list of dynamic PCDs
+ @property
+ def DynamicPcdList(self):
+ if not self._DynamicPcdList:
+ self.CollectPlatformDynamicPcds()
+ return self._DynamicPcdList
+
+ ## Generate a token number for every PCD
+ @cached_property
+ def PcdTokenNumber(self):
+ RetVal = OrderedDict()
+ TokenNumber = 1
+ #
+ # Make Dynamic and DynamicEx PCDs use different TokenNumber ranges.
+ # For example:
+ #
+ # Dynamic PCD:
+ # TokenNumber 0 ~ 10
+ # DynamicEx PCD:
+ # TokenNumber 11 ~ 20
+ #
+ for Pcd in self.DynamicPcdList:
+ if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
+ EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+ RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
+ TokenNumber += 1
+
+ for Pcd in self.DynamicPcdList:
+ if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+ RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
+ TokenNumber += 1
+
+ for Pcd in self.DynamicPcdList:
+ if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
+ EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+ RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
+ TokenNumber += 1
+
+ for Pcd in self.DynamicPcdList:
+ if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
+ EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+ RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
+ TokenNumber += 1
+
+ for Pcd in self.NonDynamicPcdList:
+ RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
+ TokenNumber += 1
+ return RetVal
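The four passes above give each category (PEI Dynamic, PEI DynamicEx, DXE Dynamic, DXE DynamicEx, then non-dynamic) a contiguous, disjoint token-number range. A condensed sketch of the same scheme, using simplified tuples instead of BaseTools PCD objects:

```python
def assign_token_numbers(pcds):
    """pcds: list of (name, phase, is_ex) with phase 'PEI' or 'DXE'.
    Token numbers start at 1 and are handed out category by category,
    so Dynamic and DynamicEx PCDs never share a range."""
    order = [("PEI", False), ("PEI", True), ("DXE", False), ("DXE", True)]
    token, result = 1, {}
    for phase, is_ex in order:
        for name, p, ex in pcds:
            if (p, ex) == (phase, is_ex):
                result[name] = token
                token += 1
    return result
```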
+
+ @cached_property
+ def _MbList(self):
+ return [self.BuildDatabase[m, self.Arch, self.BuildTarget, self.ToolChain] for m in self.Platform.Modules]
+
+ @cached_property
+ def _MaList(self):
+ for ModuleFile in self.Platform.Modules:
+ Ma = ModuleAutoGen(
+ self.Workspace,
+ ModuleFile,
+ self.BuildTarget,
+ self.ToolChain,
+ self.Arch,
+ self.MetaFile,
+ self.DataPipe
+ )
+ self.Platform.Modules[ModuleFile].M = Ma
+ return [x.M for x in self.Platform.Modules.values()]
+
+ ## Summarize ModuleAutoGen objects of all modules to be built for this platform
+ @cached_property
+ def ModuleAutoGenList(self):
+ RetVal = []
+ for Ma in self._MaList:
+ if Ma not in RetVal:
+ RetVal.append(Ma)
+ return RetVal
+
+ ## Summarize ModuleAutoGen objects of all libraries to be built for this platform
+ @cached_property
+ def LibraryAutoGenList(self):
+ RetVal = []
+ for Ma in self._MaList:
+ for La in Ma.LibraryAutoGenList:
+ if La not in RetVal:
+ RetVal.append(La)
+ if Ma not in La.ReferenceModules:
+ La.ReferenceModules.append(Ma)
+ return RetVal
+
+ ## Test if a module is supported by the platform
+ #
+ # An error will be raised directly if the module or its arch is not supported
+ # by the platform or current configuration
+ #
+ def ValidModule(self, Module):
+ return Module in self.Platform.Modules or Module in self.Platform.LibraryInstances \
+ or Module in self._AsBuildModuleList
+ @cached_property
+ def GetAllModuleInfo(self,WithoutPcd=True):
+ ModuleLibs = set()
+ for m in self.Platform.Modules:
+ module_obj = self.BuildDatabase[m,self.Arch,self.BuildTarget,self.ToolChain]
+ Libs = GetModuleLibInstances(module_obj, self.Platform, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain)
+ ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,l.Arch,True) for l in Libs]))
+ if WithoutPcd and module_obj.PcdIsDriver:
+ continue
+ ModuleLibs.add((m.File,m.Root,module_obj.Arch,False))
+
+ return ModuleLibs
+
+ ## Resolve the library classes in a module to library instances
+ #
+ # This method not only resolves library classes but also sorts the library
+ # instances according to their dependency relationships.
+ #
+ # @param Module The module from which the library classes will be resolved
+ #
+ # @retval library_list Sorted list of library instances
+ #
+ def ApplyLibraryInstance(self, Module):
+ # Cover the case where a binary INF file is listed in the FDF file but not in the DSC file; return an empty list directly.
+ if str(Module) not in self.Platform.Modules:
+ return []
+
+ return GetModuleLibInstances(Module,
+ self.Platform,
+ self.BuildDatabase,
+ self.Arch,
+ self.BuildTarget,
+ self.ToolChain,
+ self.MetaFile,
+ EdkLogger)
+
+ ## Override PCD setting (type, value, ...)
+ #
+ # @param ToPcd The PCD to be overridden
+ # @param FromPcd The PCD overriding from
+ #
+ def _OverridePcd(self, ToPcd, FromPcd, Module="", Msg="", Library=""):
+ #
+ # In case PCDs come from the FDF file, they have no type given; at this
+ # point ToPcd.Type has the type found from the dependent package.
+ #
+ TokenCName = ToPcd.TokenCName
+ for PcdItem in GlobalData.MixedPcd:
+ if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
+ TokenCName = PcdItem[0]
+ break
+ if FromPcd is not None:
+ if ToPcd.Pending and FromPcd.Type:
+ ToPcd.Type = FromPcd.Type
+ elif ToPcd.Type and FromPcd.Type\
+ and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
+ if ToPcd.Type.strip() == TAB_PCDS_DYNAMIC_EX:
+ ToPcd.Type = FromPcd.Type
+ elif ToPcd.Type and FromPcd.Type \
+ and ToPcd.Type != FromPcd.Type:
+ if Library:
+ Module = str(Module) + " 's library file (" + str(Library) + ")"
+ EdkLogger.error("build", OPTION_CONFLICT, "Mismatched PCD type",
+ ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."\
+ % (ToPcd.TokenSpaceGuidCName, TokenCName,
+ ToPcd.Type, Module, FromPcd.Type, Msg),
+ File=self.MetaFile)
+
+ if FromPcd.MaxDatumSize:
+ ToPcd.MaxDatumSize = FromPcd.MaxDatumSize
+ ToPcd.MaxSizeUserSet = FromPcd.MaxDatumSize
+ if FromPcd.DefaultValue:
+ ToPcd.DefaultValue = FromPcd.DefaultValue
+ if FromPcd.TokenValue:
+ ToPcd.TokenValue = FromPcd.TokenValue
+ if FromPcd.DatumType:
+ ToPcd.DatumType = FromPcd.DatumType
+ if FromPcd.SkuInfoList:
+ ToPcd.SkuInfoList = FromPcd.SkuInfoList
+ if FromPcd.UserDefinedDefaultStoresFlag:
+ ToPcd.UserDefinedDefaultStoresFlag = FromPcd.UserDefinedDefaultStoresFlag
+ # Parse the flexible PCD value format.
+ if ToPcd.DefaultValue:
+ try:
+ ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self.Platform._GuidDict)(True)
+ except BadExpression as Value:
+ EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
+ File=self.MetaFile)
+
+ # Check the validity of the datum.
+ IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
+ if not IsValid:
+ EdkLogger.error('build', FORMAT_INVALID, Cause, File=self.MetaFile,
+ ExtraData="%s.%s" % (ToPcd.TokenSpaceGuidCName, TokenCName))
+ ToPcd.validateranges = FromPcd.validateranges
+ ToPcd.validlists = FromPcd.validlists
+ ToPcd.expressions = FromPcd.expressions
+ ToPcd.CustomAttribute = FromPcd.CustomAttribute
+
+ if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
+ EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
+ % (ToPcd.TokenSpaceGuidCName, TokenCName))
+ Value = ToPcd.DefaultValue
+ if not Value:
+ ToPcd.MaxDatumSize = '1'
+ elif Value[0] == 'L':
+ ToPcd.MaxDatumSize = str((len(Value) - 2) * 2)
+ elif Value[0] == '{':
+ ToPcd.MaxDatumSize = str(len(Value.split(',')))
+ else:
+ ToPcd.MaxDatumSize = str(len(Value) - 1)
+
+ # Apply the default SKU for dynamic PCDs if the specified one is not available.
+ if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
+ and not ToPcd.SkuInfoList:
+ if self.Platform.SkuName in self.Platform.SkuIds:
+ SkuName = self.Platform.SkuName
+ else:
+ SkuName = TAB_DEFAULT
+ ToPcd.SkuInfoList = {
+ SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
+ }
+
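The MaxDatumSize fallback above (repeated later in ApplyPcdSetting) derives a size from the default value of a VOID* PCD. A standalone sketch of the three cases (hypothetical helper, illustration only):

```python
def default_max_datum_size(value: str) -> str:
    """Infer MaxDatumSize for a VOID* PCD from its default value:
    empty -> 1; L"..." -> two bytes per character including the
    terminator; {..,..} -> one byte per element; "..." -> character
    count plus the NUL terminator."""
    if not value:
        return '1'
    if value[0] == 'L':
        # L"ab" has len 5; (5 - 2) * 2 = 6 bytes (2 UCS-2 chars + NUL)
        return str((len(value) - 2) * 2)
    if value[0] == '{':
        # Byte array: one byte per comma-separated element.
        return str(len(value.split(',')))
    # "ab" has len 4; 4 - 1 = 3 bytes (2 chars + NUL)
    return str(len(value) - 1)
```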
+ ## Apply PCD settings defined in the platform to a module
+ #
+ # @param Module The module to which the PCD settings will be applied
+ #
+ # @retval PCD_list The list of PCDs with settings from the platform
+ #
+ def ApplyPcdSetting(self, Module, Pcds, Library=""):
+ # for each PCD in module
+ for Name, Guid in Pcds:
+ PcdInModule = Pcds[Name, Guid]
+ # find out the PCD setting in platform
+ if (Name, Guid) in self.Platform.Pcds:
+ PcdInPlatform = self.Platform.Pcds[Name, Guid]
+ else:
+ PcdInPlatform = None
+ # then override the settings if any
+ self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
+ # resolve the VariableGuid value
+ for SkuId in PcdInModule.SkuInfoList:
+ Sku = PcdInModule.SkuInfoList[SkuId]
+ if Sku.VariableGuid == '': continue
+ Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
+ if Sku.VariableGuidValue is None:
+ PackageList = "\n\t".join(str(P) for P in self.PackageList)
+ EdkLogger.error(
+ 'build',
+ RESOURCE_NOT_AVAILABLE,
+ "Value of GUID [%s] is not found in" % Sku.VariableGuid,
+ ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
+ % (Guid, Name, str(Module)),
+ File=self.MetaFile
+ )
+
+ # override PCD settings with module specific setting
+ if Module in self.Platform.Modules:
+ PlatformModule = self.Platform.Modules[str(Module)]
+ for Key in PlatformModule.Pcds:
+ if GlobalData.BuildOptionPcd:
+ for pcd in GlobalData.BuildOptionPcd:
+ (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
+ if (TokenCName, TokenSpaceGuidCName) == Key and FieldName =="":
+ PlatformModule.Pcds[Key].DefaultValue = pcdvalue
+ PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
+ break
+ Flag = False
+ if Key in Pcds:
+ ToPcd = Pcds[Key]
+ Flag = True
+ elif Key in GlobalData.MixedPcd:
+ for PcdItem in GlobalData.MixedPcd[Key]:
+ if PcdItem in Pcds:
+ ToPcd = Pcds[PcdItem]
+ Flag = True
+ break
+ if Flag:
+ self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
+ # use PCD value to calculate the MaxDatumSize when it is not specified
+ for Name, Guid in Pcds:
+ Pcd = Pcds[Name, Guid]
+ if Pcd.DatumType == TAB_VOID and not Pcd.MaxDatumSize:
+ Pcd.MaxSizeUserSet = None
+ Value = Pcd.DefaultValue
+ if not Value:
+ Pcd.MaxDatumSize = '1'
+ elif Value[0] == 'L':
+ Pcd.MaxDatumSize = str((len(Value) - 2) * 2)
+ elif Value[0] == '{':
+ Pcd.MaxDatumSize = str(len(Value.split(',')))
+ else:
+ Pcd.MaxDatumSize = str(len(Value) - 1)
+ return list(Pcds.values())
+
+ ## Append build options in platform to a module
+ #
+ # @param Module The module to which the build options will be appended
+ #
+ # @retval options The options with platform build options appended
+ #
+ def ApplyBuildOption(self, Module):
+ # Get the options for the different module styles.
+ PlatformOptions = self.EdkIIBuildOption
+ ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
+ ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
+ ModuleOptions = self._ExpandBuildOption(Module.BuildOptions)
+ if Module in self.Platform.Modules:
+ PlatformModule = self.Platform.Modules[str(Module)]
+ PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
+ else:
+ PlatformModuleOptions = {}
+
+ BuildRuleOrder = None
+ for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
+ for Tool in Options:
+ for Attr in Options[Tool]:
+ if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
+ BuildRuleOrder = Options[Tool][Attr]
+
+ AllTools = set(list(ModuleOptions.keys()) + list(PlatformOptions.keys()) +
+ list(PlatformModuleOptions.keys()) + list(ModuleTypeOptions.keys()) +
+ list(self.ToolDefinition.keys()))
+ BuildOptions = defaultdict(lambda: defaultdict(str))
+ for Tool in AllTools:
+ for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
+ if Tool not in Options:
+ continue
+ for Attr in Options[Tool]:
+ #
+ # Do not generate it in Makefile
+ #
+ if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
+ continue
+ Value = Options[Tool][Attr]
+ # check if override is indicated
+ if Value.startswith('='):
+ BuildOptions[Tool][Attr] = mws.handleWsMacro(Value[1:])
+ else:
+ if Attr != 'PATH':
+ BuildOptions[Tool][Attr] += " " + mws.handleWsMacro(Value)
+ else:
+ BuildOptions[Tool][Attr] = mws.handleWsMacro(Value)
+
+ return BuildOptions, BuildRuleOrder
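The merge above walks the option layers in increasing priority (tools_def, module, platform, module-type, platform-module): a value starting with '=' replaces what was accumulated, anything else is appended, except PATH, which is always replaced. A simplified sketch of that merge (it skips the defaultdict, BuildRuleOrder, and workspace-macro handling):

```python
def merge_build_options(layers):
    """layers: list of {tool: {attr: value}} dicts, lowest priority first."""
    merged = {}
    for layer in layers:
        for tool, attrs in layer.items():
            merged.setdefault(tool, {})
            for attr, value in attrs.items():
                if value.startswith('='):
                    merged[tool][attr] = value[1:]       # hard override
                elif attr != 'PATH' and attr in merged[tool]:
                    merged[tool][attr] += " " + value    # append flags
                else:
                    merged[tool][attr] = value           # first value, or PATH
    return merged
```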
+
+
+ def GetGlobalBuildOptions(self,Module):
+ ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
+ ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
+
+ if Module in self.Platform.Modules:
+ PlatformModule = self.Platform.Modules[str(Module)]
+ PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
+ else:
+ PlatformModuleOptions = {}
+
+ return ModuleTypeOptions,PlatformModuleOptions
+
+ @cached_property
+ def UniqueBaseName(self):
+ retVal ={}
+ name_path_map = {}
+ for Module in self._MbList:
+ name_path_map[Module.BaseName] = set()
+ for Module in self._MbList:
+ name_path_map[Module.BaseName].add(Module.MetaFile)
+ for name in name_path_map:
+ if len(name_path_map[name]) > 1:
+ guidset = set()
+ for metafile in name_path_map[name]:
+ m = self.BuildDatabase[metafile, self.Arch, self.BuildTarget, self.ToolChain]
+ retVal[name] = '%s_%s' % (name, m.Guid)
+ guidset.add(m.Guid)
+ samemodules = list(name_path_map[name])
+ if len(guidset) > 1:
+ EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
+ ' %s\n %s' % (samemodules[0], samemodules[1]))
+ return retVal
+ ## Expand * in build option key
+ #
+ # @param Options Options to be expanded
+ # @param ToolDef Use specified ToolDef instead of full version.
+ # This is needed during initialization to prevent
+ # infinite recursion between BuildOptions,
+ # ToolDefinition, and this function.
+ #
+ # @retval options Options expanded
+ #
+ def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
+ if not ToolDef:
+ ToolDef = self.ToolDefinition
+ BuildOptions = {}
+ FamilyMatch = False
+ FamilyIsNull = True
+
+ OverrideList = {}
+ #
+ # Construct a list containing the build options that need to be overridden.
+ #
+ for Key in Options:
+ #
+ # Key[0] -- tool family
+ # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+ #
+ if (Key[0] == self.BuildRuleFamily and
+ (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
+ Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
+ if (Target == self.BuildTarget or Target == TAB_STAR) and\
+ (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
+ (Arch == self.Arch or Arch == TAB_STAR) and\
+ Options[Key].startswith("="):
+
+ if OverrideList.get(Key[1]) is not None:
+ OverrideList.pop(Key[1])
+ OverrideList[Key[1]] = Options[Key]
+
+ #
+ # Use the highest priority value.
+ #
+ if (len(OverrideList) >= 2):
+ KeyList = list(OverrideList.keys())
+ for Index in range(len(KeyList)):
+ NowKey = KeyList[Index]
+ Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
+ for Index1 in range(len(KeyList) - Index - 1):
+ NextKey = KeyList[Index1 + Index + 1]
+ #
+ # Compare two keys; if one is included by the other, choose the higher-priority one.
+ #
+ Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
+ if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
+ (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
+ (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
+ (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
+ (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
+
+ if CalculatePriorityValue(NowKey) > CalculatePriorityValue(NextKey):
+ if Options.get((self.BuildRuleFamily, NextKey)) is not None:
+ Options.pop((self.BuildRuleFamily, NextKey))
+ else:
+ if Options.get((self.BuildRuleFamily, NowKey)) is not None:
+ Options.pop((self.BuildRuleFamily, NowKey))
+
+ for Key in Options:
+ if ModuleStyle is not None and len (Key) > 2:
+ # Check whether the module style is EDK or EDKII.
+ # Only append build options for modules of the matching style.
+ if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
+ continue
+ elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
+ continue
+ Family = Key[0]
+ Target, Tag, Arch, Tool, Attr = Key[1].split("_")
+ # if tool chain family doesn't match, skip it
+ if Tool in ToolDef and Family != "":
+ FamilyIsNull = False
+ if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY, "") != "":
+ if Family != ToolDef[Tool][TAB_TOD_DEFINES_BUILDRULEFAMILY]:
+ continue
+ elif Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
+ continue
+ FamilyMatch = True
+ # expand any wildcard
+ if Target == TAB_STAR or Target == self.BuildTarget:
+ if Tag == TAB_STAR or Tag == self.ToolChain:
+ if Arch == TAB_STAR or Arch == self.Arch:
+ if Tool not in BuildOptions:
+ BuildOptions[Tool] = {}
+ if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
+ BuildOptions[Tool][Attr] = Options[Key]
+ else:
+ # append options for the same tool except PATH
+ if Attr != 'PATH':
+ BuildOptions[Tool][Attr] += " " + Options[Key]
+ else:
+ BuildOptions[Tool][Attr] = Options[Key]
+ # The build option family has already been checked; it needn't be checked again.
+ if FamilyMatch or FamilyIsNull:
+ return BuildOptions
+
+ for Key in Options:
+ if ModuleStyle is not None and len (Key) > 2:
+ # Check whether the module style is EDK or EDKII.
+ # Only append build options for modules of the matching style.
+ if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
+ continue
+ elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
+ continue
+ Family = Key[0]
+ Target, Tag, Arch, Tool, Attr = Key[1].split("_")
+ # if tool chain family doesn't match, skip it
+ if Tool not in ToolDef or Family == "":
+ continue
+ # option has been added before
+ if Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
+ continue
+
+ # expand any wildcard
+ if Target == TAB_STAR or Target == self.BuildTarget:
+ if Tag == TAB_STAR or Tag == self.ToolChain:
+ if Arch == TAB_STAR or Arch == self.Arch:
+ if Tool not in BuildOptions:
+ BuildOptions[Tool] = {}
+ if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
+ BuildOptions[Tool][Attr] = Options[Key]
+ else:
+ # append options for the same tool except PATH
+ if Attr != 'PATH':
+ BuildOptions[Tool][Attr] += " " + Options[Key]
+ else:
+ BuildOptions[Tool][Attr] = Options[Key]
+ return BuildOptions
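The wildcard-priority logic in `_ExpandBuildOption` above can be hard to follow in place. Here is a rough standalone sketch of the idea, assuming toy helper names (not BaseTools APIs); the real `CalculatePriorityValue` assigns positional weights per field, whereas this sketch simply counts concrete fields:

```python
# Simplified model of how build-option keys of the form
# TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE are resolved: "*" matches
# anything, and a key with more concrete fields wins over one that
# wildcards those fields. (Illustrative sketch only.)

def key_matches(key, target, toolchain, arch):
    """True if the key applies to the given build context."""
    k_target, k_toolchain, k_arch, _cmd, _attr = key.split("_")
    return all(k in ("*", v) for k, v in
               ((k_target, target), (k_toolchain, toolchain), (k_arch, arch)))

def priority(key):
    """More concrete (non-"*") fields => higher priority."""
    return sum(field != "*" for field in key.split("_"))

def pick(options, target, toolchain, arch):
    """Among matching keys, keep the value of the most specific one."""
    matching = [k for k in options if key_matches(k, target, toolchain, arch)]
    return options[max(matching, key=priority)] if matching else None
```

For example, with both `*_*_*_CC_FLAGS` and `DEBUG_*_*_CC_FLAGS` defined, a DEBUG build picks the latter because its TARGET field is concrete.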
diff --git a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
new file mode 100644
index 000000000000..ab58b21772c3
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
@@ -0,0 +1,905 @@
+## @file
+# Workspace-level AutoGen: controls the whole-platform build and generates the top-level makefile
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+
+## Import Modules
+#
+from __future__ import print_function
+from __future__ import absolute_import
+import os.path as path
+import hashlib
+from collections import defaultdict
+from GenFds.FdfParser import FdfParser
+from Workspace.WorkspaceCommon import GetModuleLibInstances
+from AutoGen import GenMake
+from AutoGen.AutoGen import AutoGen
+from AutoGen.PlatformAutoGen import PlatformAutoGen
+from AutoGen.BuildEngine import gDefaultBuildRuleFile
+from Common.ToolDefClassObject import gDefaultToolsDefFile
+from Common.StringUtils import NormPath
+from Common.BuildToolError import *
+from Common.DataType import *
+from Common.Misc import *
+
+## Regular expression for splitting Dependency Expression string into tokens
+gDepexTokenPattern = re.compile(r"(\(|\)|\w+| \S+\.inf)")
+
+## Regular expression for match: PCD(xxxx.yyy)
+gPCDAsGuidPattern = re.compile(r"^PCD\(.+\..+\)$")
+
+## Workspace AutoGen class
+#
+# This class is used mainly to control the whole platform build for different
+# architecture. This class will generate top level makefile.
+#
+class WorkspaceAutoGen(AutoGen):
+ # The AutoGen cache may return an existing instance; run _InitWorker only once per object
+ def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
+ if not hasattr(self, "_Init"):
+ self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
+ self._Init = True
+
+ ## Initialize WorkspaceAutoGen
+ #
+ # @param WorkspaceDir Root directory of workspace
+ # @param ActivePlatform Meta-file of active platform
+ # @param Target Build target
+ # @param Toolchain Tool chain name
+ # @param ArchList List of architecture of current build
+ # @param MetaFileDb Database containing meta-files
+ # @param BuildConfig Configuration of build
+ # @param ToolDefinition Tool chain definitions
+ # @param FlashDefinitionFile File of flash definition
+ # @param Fds FD list to be generated
+ # @param Fvs FV list to be generated
+ # @param Caps Capsule list to be generated
+ # @param SkuId SKU id from command line
+ #
+ def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolchain, ArchList, MetaFileDb,
+ BuildConfig, ToolDefinition, FlashDefinitionFile='', Fds=None, Fvs=None, Caps=None, SkuId='', UniFlag=None,
+ Progress=None, BuildModule=None):
+ self.BuildDatabase = MetaFileDb
+ self.MetaFile = ActivePlatform
+ self.WorkspaceDir = WorkspaceDir
+ self.Platform = self.BuildDatabase[self.MetaFile, TAB_ARCH_COMMON, Target, Toolchain]
+ GlobalData.gActivePlatform = self.Platform
+ self.BuildTarget = Target
+ self.ToolChain = Toolchain
+ self.ArchList = ArchList
+ self.SkuId = SkuId
+ self.UniFlag = UniFlag
+
+ self.TargetTxt = BuildConfig
+ self.ToolDef = ToolDefinition
+ self.FdfFile = FlashDefinitionFile
+ self.FdTargetList = Fds if Fds else []
+ self.FvTargetList = Fvs if Fvs else []
+ self.CapTargetList = Caps if Caps else []
+ self.AutoGenObjectList = []
+ self._GuidDict = {}
+
+ # There are many relative directory operations, so switch to the workspace directory first
+ os.chdir(self.WorkspaceDir)
+
+ self.MergeArch()
+ self.ValidateBuildTarget()
+
+ EdkLogger.info("")
+ if self.ArchList:
+ EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
+ EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
+ EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
+
+ EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
+ if BuildModule:
+ EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
+
+ if self.FdfFile:
+ EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
+
+ EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
+
+ if Progress:
+ Progress.Start("\nProcessing meta-data")
+ #
+ # Mark now build in AutoGen Phase
+ #
+ GlobalData.gAutoGenPhase = True
+ self.ProcessModuleFromPdf()
+ self.ProcessPcdType()
+ self.ProcessMixedPcd()
+ self.VerifyPcdsFromFDF()
+ self.CollectAllPcds()
+ self.GeneratePkgLevelHash()
+ #
+ # Check PCDs token value conflict in each DEC file.
+ #
+ self._CheckAllPcdsTokenValueConflict()
+ #
+ # Check PCD type and definition between DSC and DEC
+ #
+ self._CheckPcdDefineAndType()
+
+ self.CreateBuildOptionsFile()
+ self.CreatePcdTokenNumberFile()
+ self.CreateModuleHashInfo()
+ GlobalData.gAutoGenPhase = False
+
+ #
+ # Merge Arch
+ #
+ def MergeArch(self):
+ if not self.ArchList:
+ ArchList = set(self.Platform.SupArchList)
+ else:
+ ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
+ if not ArchList:
+ EdkLogger.error("build", PARAMETER_INVALID,
+ ExtraData = "Invalid ARCH specified. [Valid ARCH: %s]" % (" ".join(self.Platform.SupArchList)))
+ elif self.ArchList and len(ArchList) != len(self.ArchList):
+ SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
+ EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
+ % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
+ self.ArchList = tuple(ArchList)
+
+ # Validate build target
+ def ValidateBuildTarget(self):
+ if self.BuildTarget not in self.Platform.BuildTargets:
+ EdkLogger.error("build", PARAMETER_INVALID,
+ ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
+ % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
+ @cached_property
+ def FdfProfile(self):
+ if not self.FdfFile:
+ self.FdfFile = self.Platform.FlashDefinition
+
+ FdfProfile = None
+ if self.FdfFile:
+ Fdf = FdfParser(self.FdfFile.Path)
+ Fdf.ParseFile()
+ GlobalData.gFdfParser = Fdf
+ if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
+ FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
+ for FdRegion in FdDict.RegionList:
+ if str(FdRegion.RegionType) == 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
+ if int(FdRegion.Offset) % 8 != 0:
+ EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
+ FdfProfile = Fdf.Profile
+ else:
+ if self.FdTargetList:
+ EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
+ self.FdTargetList = []
+ if self.FvTargetList:
+ EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
+ self.FvTargetList = []
+ if self.CapTargetList:
+ EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
+ self.CapTargetList = []
+
+ return FdfProfile
+
+ def ProcessModuleFromPdf(self):
+
+ if self.FdfProfile:
+ for fvname in self.FvTargetList:
+ if fvname.upper() not in self.FdfProfile.FvDict:
+ EdkLogger.error("build", OPTION_VALUE_INVALID,
+ "No such an FV in FDF file: %s" % fvname)
+
+ # A DSC file may use FILE_GUID to override a module; Platform.Modules then uses
+ # <FILE_GUID>module.inf as the key, but the path (self.MetaFile.Path) is the real path
+ for key in self.FdfProfile.InfDict:
+ if key == 'ArchTBD':
+ MetaFile_cache = defaultdict(set)
+ for Arch in self.ArchList:
+ Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ for Pkey in Current_Platform_cache.Modules:
+ MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
+ for Inf in self.FdfProfile.InfDict[key]:
+ ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
+ for Arch in self.ArchList:
+ if ModuleFile in MetaFile_cache[Arch]:
+ break
+ else:
+ ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
+ if not ModuleData.IsBinaryModule:
+ EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; is it really a binary module?" % ModuleFile)
+
+ else:
+ for Arch in self.ArchList:
+ if Arch == key:
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ MetaFileList = set()
+ for Pkey in Platform.Modules:
+ MetaFileList.add(Platform.Modules[Pkey].MetaFile)
+ for Inf in self.FdfProfile.InfDict[key]:
+ ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
+ if ModuleFile in MetaFileList:
+ continue
+ ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
+ if not ModuleData.IsBinaryModule:
+ EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; is it really a binary module?" % ModuleFile)
+
+
+
+ # parse FDF file to get PCDs in it, if any
+ def VerifyPcdsFromFDF(self):
+
+ if self.FdfProfile:
+ PcdSet = self.FdfProfile.PcdDict
+ self.VerifyPcdDeclearation(PcdSet)
+
+ def ProcessPcdType(self):
+ for Arch in self.ArchList:
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ Platform.Pcds # touch the cached property so the platform PCDs are populated
+ # collect the library instances used by this platform's modules
+ Libs = []
+ for BuildData in list(self.BuildDatabase._CACHE_.values()):
+ if BuildData.Arch != Arch:
+ continue
+ if BuildData.MetaFile.Ext == '.inf' and str(BuildData) in Platform.Modules :
+ Libs.extend(GetModuleLibInstances(BuildData, Platform,
+ self.BuildDatabase,
+ Arch,
+ self.BuildTarget,
+ self.ToolChain
+ ))
+ for BuildData in list(self.BuildDatabase._CACHE_.values()):
+ if BuildData.Arch != Arch:
+ continue
+ if BuildData.MetaFile.Ext == '.inf':
+ for key in BuildData.Pcds:
+ if BuildData.Pcds[key].Pending:
+ if key in Platform.Pcds:
+ PcdInPlatform = Platform.Pcds[key]
+ if PcdInPlatform.Type:
+ BuildData.Pcds[key].Type = PcdInPlatform.Type
+ BuildData.Pcds[key].Pending = False
+
+ if BuildData.MetaFile in Platform.Modules:
+ PlatformModule = Platform.Modules[str(BuildData.MetaFile)]
+ if key in PlatformModule.Pcds:
+ PcdInPlatform = PlatformModule.Pcds[key]
+ if PcdInPlatform.Type:
+ BuildData.Pcds[key].Type = PcdInPlatform.Type
+ BuildData.Pcds[key].Pending = False
+ else:
+ # For a PCD used in a library, take the PCD type from the referencing module while the type is pending
+ if BuildData.Pcds[key].Pending:
+ if bool(BuildData.LibraryClass):
+ if BuildData in set(Libs):
+ ReferenceModules = BuildData.ReferenceModules
+ for ReferenceModule in ReferenceModules:
+ if ReferenceModule.MetaFile in Platform.Modules:
+ RefPlatformModule = Platform.Modules[str(ReferenceModule.MetaFile)]
+ if key in RefPlatformModule.Pcds:
+ PcdInReferenceModule = RefPlatformModule.Pcds[key]
+ if PcdInReferenceModule.Type:
+ BuildData.Pcds[key].Type = PcdInReferenceModule.Type
+ BuildData.Pcds[key].Pending = False
+ break
+
+ def ProcessMixedPcd(self):
+ for Arch in self.ArchList:
+ SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
+ BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
+ SourcePcdDict_Keys = SourcePcdDict.keys()
+ BinaryPcdDict_Keys = BinaryPcdDict.keys()
+
+ # generate the SourcePcdDict and BinaryPcdDict
+
+ for BuildData in list(self.BuildDatabase._CACHE_.values()):
+ if BuildData.Arch != Arch:
+ continue
+ if BuildData.MetaFile.Ext == '.inf':
+ for key in BuildData.Pcds:
+ if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
+ if BuildData.IsBinaryModule:
+ BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+ else:
+ SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+
+ elif TAB_PCDS_PATCHABLE_IN_MODULE in BuildData.Pcds[key].Type:
+ if BuildData.MetaFile.Ext == '.inf':
+ if BuildData.IsBinaryModule:
+ BinaryPcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+ else:
+ SourcePcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+
+ elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
+ SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+ elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
+ SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+
+ #
+ # A PCD can only use one type for all source modules
+ #
+ for i in SourcePcdDict_Keys:
+ for j in SourcePcdDict_Keys:
+ if i != j:
+ Intersections = SourcePcdDict[i].intersection(SourcePcdDict[j])
+ if len(Intersections) > 0:
+ EdkLogger.error(
+ 'build',
+ FORMAT_INVALID,
+ "Building modules from source INFs, following PCD use %s and %s access method. It must be corrected to use only one access method." % (i, j),
+ ExtraData='\n\t'.join(str(P[1]+'.'+P[0]) for P in Intersections)
+ )
+
+ #
+ # Intersect the binary PCD sets to find mixed PCDs
+ #
+ for i in BinaryPcdDict_Keys:
+ for j in BinaryPcdDict_Keys:
+ if i != j:
+ Intersections = BinaryPcdDict[i].intersection(BinaryPcdDict[j])
+ for item in Intersections:
+ NewPcd1 = (item[0] + '_' + i, item[1])
+ NewPcd2 = (item[0] + '_' + j, item[1])
+ if item not in GlobalData.MixedPcd:
+ GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
+ else:
+ if NewPcd1 not in GlobalData.MixedPcd[item]:
+ GlobalData.MixedPcd[item].append(NewPcd1)
+ if NewPcd2 not in GlobalData.MixedPcd[item]:
+ GlobalData.MixedPcd[item].append(NewPcd2)
+
+ #
+ # Intersect the source and binary PCD sets to find mixed PCDs
+ #
+ for i in SourcePcdDict_Keys:
+ for j in BinaryPcdDict_Keys:
+ if i != j:
+ Intersections = SourcePcdDict[i].intersection(BinaryPcdDict[j])
+ for item in Intersections:
+ NewPcd1 = (item[0] + '_' + i, item[1])
+ NewPcd2 = (item[0] + '_' + j, item[1])
+ if item not in GlobalData.MixedPcd:
+ GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
+ else:
+ if NewPcd1 not in GlobalData.MixedPcd[item]:
+ GlobalData.MixedPcd[item].append(NewPcd1)
+ if NewPcd2 not in GlobalData.MixedPcd[item]:
+ GlobalData.MixedPcd[item].append(NewPcd2)
+
+ BuildData = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ for key in BuildData.Pcds:
+ for SinglePcd in GlobalData.MixedPcd:
+ if (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
+ for item in GlobalData.MixedPcd[SinglePcd]:
+ Pcd_Type = item[0].split('_')[-1]
+ if (Pcd_Type == BuildData.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
+ (Pcd_Type == TAB_PCDS_DYNAMIC and BuildData.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET):
+ Value = BuildData.Pcds[key]
+ Value.TokenCName = BuildData.Pcds[key].TokenCName + '_' + Pcd_Type
+ if len(key) == 2:
+ newkey = (Value.TokenCName, key[1])
+ elif len(key) == 3:
+ newkey = (Value.TokenCName, key[1], key[2])
+ del BuildData.Pcds[key]
+ BuildData.Pcds[newkey] = Value
+ break
+ break
+
+ if self.FdfProfile:
+ PcdSet = self.FdfProfile.PcdDict
+ # handle the mixed pcd in FDF file
+ for key in list(PcdSet.keys()): # iterate over a copy; entries are deleted during the loop
+ if key in GlobalData.MixedPcd:
+ Value = PcdSet[key]
+ del PcdSet[key]
+ for item in GlobalData.MixedPcd[key]:
+ PcdSet[item] = Value
+
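The mixed-PCD detection above reduces to pairwise set intersections between per-access-method PCD sets, with matches renamed to method-suffixed aliases. A minimal sketch with illustrative data (plain tuples standing in for real PCD objects):

```python
from itertools import combinations

# Each access method maps to the (TokenCName, TokenSpaceGuidCName) pairs
# seen with that method; a pair that appears under two different methods
# is "mixed" and receives one method-suffixed alias per method.
binary_pcds = {
    "DynamicEx": {("PcdFoo", "gTokenSpaceGuid")},
    "PatchableInModule": {("PcdFoo", "gTokenSpaceGuid"),
                          ("PcdBar", "gTokenSpaceGuid")},
}

mixed = {}
for i, j in combinations(binary_pcds, 2):
    for name, guid in binary_pcds[i] & binary_pcds[j]:
        aliases = mixed.setdefault((name, guid), [])
        for method in (i, j):
            alias = (name + "_" + method, guid)
            if alias not in aliases:
                aliases.append(alias)
```

Here only `PcdFoo` is mixed, so only it gets `_DynamicEx` / `_PatchableInModule` aliases, mirroring how `GlobalData.MixedPcd` is populated above.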
+ # Collect the package set information from the INFs referenced in the FDF
+ @cached_property
+ def PkgSet(self):
+ if not self.FdfFile:
+ self.FdfFile = self.Platform.FlashDefinition
+
+ if self.FdfFile:
+ ModuleList = self.FdfProfile.InfList
+ else:
+ ModuleList = []
+ Pkgs = {}
+ for Arch in self.ArchList:
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ PkgSet = set()
+ for mb in [self.BuildDatabase[m, Arch, self.BuildTarget, self.ToolChain] for m in Platform.Modules]:
+ PkgSet.update(mb.Packages)
+ for Inf in ModuleList:
+ ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
+ if ModuleFile in Platform.Modules:
+ continue
+ ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
+ PkgSet.update(ModuleData.Packages)
+ Pkgs[Arch] = list(PkgSet)
+ return Pkgs
+
+ def VerifyPcdDeclearation(self, PcdSet):
+ for Arch in self.ArchList:
+ Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+ Pkgs = self.PkgSet[Arch]
+ DecPcds = set()
+ DecPcdsKey = set()
+ for Pkg in Pkgs:
+ for Pcd in Pkg.Pcds:
+ DecPcds.add((Pcd[0], Pcd[1]))
+ DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2]))
+
+ Platform.SkuName = self.SkuId
+ for Name, Guid, Fileds in PcdSet:
+ if (Name, Guid) not in DecPcds:
+ EdkLogger.error(
+ 'build',
+ PARSER_ERROR,
+ "PCD (%s.%s) used in FDF is not declared in DEC files." % (Guid, Name),
+ File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
+ Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
+ )
+ else:
+ # Check whether a Dynamic or DynamicEx PCD is used in the FDF file; if so, break the build with an error message.
+ if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPcdsKey \
+ or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE) in DecPcdsKey \
+ or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in DecPcdsKey:
+ continue
+ elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey:
+ EdkLogger.error(
+ 'build',
+ PARSER_ERROR,
+ "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
+ File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
+ Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
+ )
+ def CollectAllPcds(self):
+
+ for Arch in self.ArchList:
+ Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
+ #
+ # Explicitly collect platform's dynamic PCDs
+ #
+ Pa.CollectPlatformDynamicPcds()
+ Pa.CollectFixedAtBuildPcds()
+ self.AutoGenObjectList.append(Pa)
+ # We need to calculate the PcdTokenNumber after all Arch Pcds are collected.
+ for Arch in self.ArchList:
+ #Pcd TokenNumber
+ Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
+ self.UpdateModuleDataPipe(Arch, {"PCD_TNUM":Pa.PcdTokenNumber})
+
+ def UpdateModuleDataPipe(self,arch, attr_dict):
+ for (Target, Toolchain, Arch, MetaFile) in AutoGen.Cache():
+ if Arch != arch:
+ continue
+ try:
+ AutoGen.Cache()[(Target, Toolchain, Arch, MetaFile)].DataPipe.DataContainer = attr_dict
+ except Exception:
+ # some cached objects may not have a DataPipe yet; ignore them
+ pass
+ #
+ # Generate Package level hash value
+ #
+ def GeneratePkgLevelHash(self):
+ for Arch in self.ArchList:
+ GlobalData.gPackageHash = {}
+ if GlobalData.gUseHashCache:
+ for Pkg in self.PkgSet[Arch]:
+ self._GenPkgLevelHash(Pkg)
+
+
+ def CreateBuildOptionsFile(self):
+ #
+ # Create the BuildOptions metafile recording macros and PCDs, plus the active platform and FDF file.
+ #
+ content = 'gCommandLineDefines: '
+ content += str(GlobalData.gCommandLineDefines)
+ content += TAB_LINE_BREAK
+ content += 'BuildOptionPcd: '
+ content += str(GlobalData.BuildOptionPcd)
+ content += TAB_LINE_BREAK
+ content += 'Active Platform: '
+ content += str(self.Platform)
+ content += TAB_LINE_BREAK
+ if self.FdfFile:
+ content += 'Flash Image Definition: '
+ content += str(self.FdfFile)
+ content += TAB_LINE_BREAK
+ SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
+
+ def CreatePcdTokenNumberFile(self):
+ #
+ # Create PcdToken Number file for Dynamic/DynamicEx Pcd.
+ #
+ PcdTokenNumber = 'PcdTokenNumber: '
+ for Arch in self.ArchList:
+ Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
+ if Pa.PcdTokenNumber:
+ if Pa.DynamicPcdList:
+ for Pcd in Pa.DynamicPcdList:
+ PcdTokenNumber += TAB_LINE_BREAK
+ PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
+ PcdTokenNumber += ' : '
+ PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
+ SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
+
+ def CreateModuleHashInfo(self):
+ #
+ # Get set of workspace metafiles
+ #
+ AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
+
+ #
+ # Retrieve latest modified time of all metafiles
+ #
+ SrcTimeStamp = 0
+ for f in AllWorkSpaceMetaFiles:
+ mtime = os.stat(f).st_mtime # named field instead of os.stat(f)[8]
+ if mtime > SrcTimeStamp:
+ SrcTimeStamp = mtime
+ self._SrcTimeStamp = SrcTimeStamp
+
+ if GlobalData.gUseHashCache:
+ m = hashlib.md5()
+ for files in AllWorkSpaceMetaFiles:
+ if files.endswith('.dec'):
+ continue
+ f = open(files, 'rb')
+ Content = f.read()
+ f.close()
+ m.update(Content)
+ SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), False)
+ GlobalData.gPlatformHash = m.hexdigest()
+
+ #
+ # Write metafile list to build directory
+ #
+ AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
+ if os.path.exists (AutoGenFilePath):
+ os.remove(AutoGenFilePath)
+ if not os.path.exists(self.BuildDir):
+ os.makedirs(self.BuildDir)
+ with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
+ for f in AllWorkSpaceMetaFiles:
+ print(f, file=file)
+ return True
+
+ def _GenPkgLevelHash(self, Pkg):
+ if Pkg.PackageName in GlobalData.gPackageHash:
+ return
+
+ PkgDir = os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageName)
+ CreateDirectory(PkgDir)
+ HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
+ m = hashlib.md5()
+ # Get .dec file's hash value
+ f = open(Pkg.MetaFile.Path, 'rb')
+ Content = f.read()
+ f.close()
+ m.update(Content)
+ # Get include files hash value
+ if Pkg.Includes:
+ for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
+ for Root, Dirs, Files in os.walk(str(inc)):
+ for File in sorted(Files):
+ File_Path = os.path.join(Root, File)
+ f = open(File_Path, 'rb')
+ Content = f.read()
+ f.close()
+ m.update(Content)
+ SaveFileOnChange(HashFile, m.hexdigest(), False)
+ GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
+
+ def _GetMetaFiles(self, Target, Toolchain):
+ AllWorkSpaceMetaFiles = set()
+ #
+ # add fdf
+ #
+ if self.FdfFile:
+ AllWorkSpaceMetaFiles.add (self.FdfFile.Path)
+ for f in GlobalData.gFdfParser.GetAllIncludedFile():
+ AllWorkSpaceMetaFiles.add (f.FileName)
+ #
+ # add dsc
+ #
+ AllWorkSpaceMetaFiles.add(self.MetaFile.Path)
+
+ #
+ # add build_rule.txt & tools_def.txt
+ #
+ AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultBuildRuleFile))
+ AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultToolsDefFile))
+
+ # add BuildOption metafile
+ #
+ AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'BuildOptions'))
+
+ # add PcdToken Number file for Dynamic/DynamicEx Pcd
+ #
+ AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTokenNumber'))
+
+ for Pa in self.AutoGenObjectList:
+ AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile)
+
+ for Arch in self.ArchList:
+ #
+ # add dec
+ #
+ for Package in PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch).PackageList:
+ AllWorkSpaceMetaFiles.add(Package.MetaFile.Path)
+
+ #
+ # add included dsc
+ #
+ for filePath in self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]._RawData.IncludedFiles:
+ AllWorkSpaceMetaFiles.add(filePath.Path)
+
+ return AllWorkSpaceMetaFiles
+
+ def _CheckPcdDefineAndType(self):
+ PcdTypeSet = {TAB_PCDS_FIXED_AT_BUILD,
+ TAB_PCDS_PATCHABLE_IN_MODULE,
+ TAB_PCDS_FEATURE_FLAG,
+ TAB_PCDS_DYNAMIC,
+ TAB_PCDS_DYNAMIC_EX}
+
+ # This dict stores PCDs that are not used by any module for the specified arches
+ UnusedPcd = OrderedDict()
+ for Pa in self.AutoGenObjectList:
+ # Key of DSC's Pcds dictionary is PcdCName, TokenSpaceGuid
+ for Pcd in Pa.Platform.Pcds:
+ PcdType = Pa.Platform.Pcds[Pcd].Type
+
+ # If no PCD type, this PCD comes from FDF
+ if not PcdType:
+ continue
+
+ # Try to remove Hii and Vpd suffix
+ if PcdType.startswith(TAB_PCDS_DYNAMIC_EX):
+ PcdType = TAB_PCDS_DYNAMIC_EX
+ elif PcdType.startswith(TAB_PCDS_DYNAMIC):
+ PcdType = TAB_PCDS_DYNAMIC
+
+ for Package in Pa.PackageList:
+ # Key of DEC's Pcds dictionary is PcdCName, TokenSpaceGuid, PcdType
+ if (Pcd[0], Pcd[1], PcdType) in Package.Pcds:
+ break
+ for Type in PcdTypeSet:
+ if (Pcd[0], Pcd[1], Type) in Package.Pcds:
+ EdkLogger.error(
+ 'build',
+ FORMAT_INVALID,
+ "Type [%s] of PCD [%s.%s] in DSC file doesn't match the type [%s] defined in DEC file." \
+ % (Pa.Platform.Pcds[Pcd].Type, Pcd[1], Pcd[0], Type),
+ ExtraData=None
+ )
+ return
+ else:
+ UnusedPcd.setdefault(Pcd, []).append(Pa.Arch)
+
+ for Pcd in UnusedPcd:
+ EdkLogger.warn(
+ 'build',
+ "The PCD was not specified by any INF module in the platform for the given architecture.\n"
+ "\tPCD: [%s.%s]\n\tPlatform: [%s]\n\tArch: %s"
+ % (Pcd[1], Pcd[0], os.path.basename(str(self.MetaFile)), str(UnusedPcd[Pcd])),
+ ExtraData=None
+ )
+
+ def __repr__(self):
+ return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList))
+
+ ## Return the directory to store FV files
+ @cached_property
+ def FvDir(self):
+ return path.join(self.BuildDir, TAB_FV_DIRECTORY)
+
+ ## Return the directory to store all intermediate and final files built
+ @cached_property
+ def BuildDir(self):
+ return self.AutoGenObjectList[0].BuildDir
+
+ ## Return the build output directory platform specifies
+ @cached_property
+ def OutputDir(self):
+ return self.Platform.OutputDirectory
+
+ ## Return platform name
+ @cached_property
+ def Name(self):
+ return self.Platform.PlatformName
+
+ ## Return meta-file GUID
+ @cached_property
+ def Guid(self):
+ return self.Platform.Guid
+
+ ## Return platform version
+ @cached_property
+ def Version(self):
+ return self.Platform.Version
+
+ ## Return paths of tools
+ @cached_property
+ def ToolDefinition(self):
+ return self.AutoGenObjectList[0].ToolDefinition
+
+ ## Return directory of platform makefile
+ #
+ # @retval string Makefile directory
+ #
+ @cached_property
+ def MakeFileDir(self):
+ return self.BuildDir
+
+ ## Return build command string
+ #
+ # @retval string Build command string
+ #
+ @cached_property
+ def BuildCommand(self):
+ # BuildCommand is the same for every platform AutoGen object, so take the first one
+ return self.AutoGenObjectList[0].BuildCommand
+
+ ## Check the PCDs token value conflict in each DEC file.
+ #
+ # Breaks the build with an error message when two PCDs conflict.
+ #
+ # @return None
+ #
+ def _CheckAllPcdsTokenValueConflict(self):
+ for Pa in self.AutoGenObjectList:
+ for Package in Pa.PackageList:
+ PcdList = list(Package.Pcds.values())
+ PcdList.sort(key=lambda x: int(x.TokenValue, 0))
+ Count = 0
+ while (Count < len(PcdList) - 1) :
+ Item = PcdList[Count]
+ ItemNext = PcdList[Count + 1]
+ #
+ # Within the same token space, the TokenValue must be unique
+ #
+ if (int(Item.TokenValue, 0) == int(ItemNext.TokenValue, 0)):
+ SameTokenValuePcdList = []
+ SameTokenValuePcdList.append(Item)
+ SameTokenValuePcdList.append(ItemNext)
+ RemainPcdListLength = len(PcdList) - Count - 2
+ for ValueSameCount in range(RemainPcdListLength):
+ if int(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount].TokenValue, 0) == int(Item.TokenValue, 0):
+ SameTokenValuePcdList.append(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount])
+ else:
+ break
+ #
+ # Sort same token value PCD list with TokenGuid and TokenCName
+ #
+ SameTokenValuePcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
+ SameTokenValuePcdListCount = 0
+ while (SameTokenValuePcdListCount < len(SameTokenValuePcdList) - 1):
+ Flag = False
+ TemListItem = SameTokenValuePcdList[SameTokenValuePcdListCount]
+ TemListItemNext = SameTokenValuePcdList[SameTokenValuePcdListCount + 1]
+
+ if (TemListItem.TokenSpaceGuidCName == TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName != TemListItemNext.TokenCName):
+ for PcdItem in GlobalData.MixedPcd:
+ if (TemListItem.TokenCName, TemListItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \
+ (TemListItemNext.TokenCName, TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
+ Flag = True
+ if not Flag:
+ EdkLogger.error(
+ 'build',
+ FORMAT_INVALID,
+ "The TokenValue [%s] of PCD [%s.%s] is conflict with: [%s.%s] in %s"\
+ % (TemListItem.TokenValue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemNext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package),
+ ExtraData=None
+ )
+ SameTokenValuePcdListCount += 1
+ Count += SameTokenValuePcdListCount
+ Count += 1
+
+ PcdList = list(Package.Pcds.values())
+ PcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
+ Count = 0
+ while (Count < len(PcdList) - 1) :
+ Item = PcdList[Count]
+ ItemNext = PcdList[Count + 1]
+ #
+ # Check that PCDs with the same TokenSpaceGuidCName.TokenCName also have the same token value.
+ #
+ if (Item.TokenSpaceGuidCName == ItemNext.TokenSpaceGuidCName) and (Item.TokenCName == ItemNext.TokenCName) and (int(Item.TokenValue, 0) != int(ItemNext.TokenValue, 0)):
+ EdkLogger.error(
+ 'build',
+ FORMAT_INVALID,
+ "The TokenValue [%s] of PCD [%s.%s] in %s defined in two places should be same as well."\
+ % (Item.TokenValue, Item.TokenSpaceGuidCName, Item.TokenCName, Package),
+ ExtraData=None
+ )
+ Count += 1
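The duplicate-token scan above can be condensed: group PCDs by (token space, integer token value) and flag any group holding more than one distinct name. A simplified sketch over plain tuples (not real PCD objects):

```python
from collections import defaultdict

def find_token_conflicts(pcds):
    """pcds: iterable of (token_space_guid, token_cname, token_value_str).
    Returns {(guid, value): names} for token values claimed by more than
    one distinct PCD name in the same token space."""
    by_value = defaultdict(set)
    for guid, cname, value in pcds:
        # TokenValue may be hex ("0x10") or decimal ("16"); int(x, 0)
        # normalizes both, matching the int(TokenValue, 0) calls above.
        by_value[(guid, int(value, 0))].add(cname)
    return {k: names for k, names in by_value.items() if len(names) > 1}
```

The `int(x, 0)` normalization matters: `"0x10"` and `"16"` are the same token value even though the strings differ, which is exactly why the code above compares `int(Item.TokenValue, 0)` rather than raw strings.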
+ ## Generate fds command
+ @property
+ def GenFdsCommand(self):
+ return (GenMake.TopLevelMakefile(self)._TEMPLATE_.Replace(GenMake.TopLevelMakefile(self)._TemplateDict)).strip()
+
+ @property
+ def GenFdsCommandDict(self):
+ FdsCommandDict = {}
+ LogLevel = EdkLogger.GetLevel()
+ if LogLevel == EdkLogger.VERBOSE:
+ FdsCommandDict["verbose"] = True
+ elif LogLevel <= EdkLogger.DEBUG_9:
+ FdsCommandDict["debug"] = LogLevel - 1
+ elif LogLevel == EdkLogger.QUIET:
+ FdsCommandDict["quiet"] = True
+
+ if GlobalData.gEnableGenfdsMultiThread:
+ FdsCommandDict["GenfdsMultiThread"] = True
+ if GlobalData.gIgnoreSource:
+ FdsCommandDict["IgnoreSources"] = True
+
+ FdsCommandDict["OptionPcd"] = []
+ for pcd in GlobalData.BuildOptionPcd:
+ if pcd[2]:
+ pcdname = '.'.join(pcd[0:3])
+ else:
+ pcdname = '.'.join(pcd[0:2])
+ if pcd[3].startswith('{'):
+ FdsCommandDict["OptionPcd"].append(pcdname + '=' + 'H' + '"' + pcd[3] + '"')
+ else:
+ FdsCommandDict["OptionPcd"].append(pcdname + '=' + pcd[3])
+
+ MacroList = []
+ # macros passed to GenFds
+ MacroDict = {}
+ MacroDict.update(GlobalData.gGlobalDefines)
+ MacroDict.update(GlobalData.gCommandLineDefines)
+ for MacroName in MacroDict:
+ if MacroDict[MacroName] != "":
+ MacroList.append('"%s=%s"' % (MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
+ else:
+ MacroList.append('"%s"' % MacroName)
+ FdsCommandDict["macro"] = MacroList
+
+ FdsCommandDict["fdf_file"] = [self.FdfFile]
+ FdsCommandDict["build_target"] = self.BuildTarget
+ FdsCommandDict["toolchain_tag"] = self.ToolChain
+ FdsCommandDict["active_platform"] = str(self)
+
+ FdsCommandDict["conf_directory"] = GlobalData.gConfDirectory
+ FdsCommandDict["build_architecture_list"] = ','.join(self.ArchList)
+ FdsCommandDict["platform_build_directory"] = self.BuildDir
+
+ FdsCommandDict["fd"] = self.FdTargetList
+ FdsCommandDict["fv"] = self.FvTargetList
+ FdsCommandDict["cap"] = self.CapTargetList
+ return FdsCommandDict
+
+ ## Create makefile for the platform and modules in it
+ #
+ # @param CreateDepsMakeFile Flag indicating if the makefile for
+ # modules will be created as well
+ #
+ def CreateMakeFile(self, CreateDepsMakeFile=False):
+ if not CreateDepsMakeFile:
+ return
+ for Pa in self.AutoGenObjectList:
+ Pa.CreateMakeFile(True)
+
+ ## Create autogen code for platform and modules
+ #
+ # Since there's no autogen code for platform, this method will do nothing
+ # if CreateModuleCodeFile is set to False.
+ #
+ # @param CreateDepsCodeFile Flag indicating if creating module's
+ # autogen code file or not
+ #
+ def CreateCodeFile(self, CreateDepsCodeFile=False):
+ if not CreateDepsCodeFile:
+ return
+ for Pa in self.AutoGenObjectList:
+ Pa.CreateCodeFile(True)
+
+ ## Create AsBuilt INF file the platform
+ #
+ def CreateAsBuiltInf(self):
+ return
+
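The duplicate-definition check above relies on sorting the PCD list by `TokenSpaceGuidCName.TokenCName` so that conflicting definitions become adjacent and a single linear pass over neighbors finds them. The technique in isolation, as a hypothetical helper working on plain tuples rather than the patch's PCD objects:

```python
def find_conflicts(pcds):
    """pcds: list of (guid_cname, token_cname, token_value) tuples.
    Return neighbor pairs whose GUID.Name key repeats with differing values.
    int(v, 0) mirrors the patch's comparison, so "0x1" and "1" compare equal."""
    pcds = sorted(pcds, key=lambda x: "%s.%s" % (x[0], x[1]))
    conflicts = []
    # After sorting, all entries sharing a key are contiguous, so only
    # adjacent elements need to be compared.
    for a, b in zip(pcds, pcds[1:]):
        if a[:2] == b[:2] and int(a[2], 0) != int(b[2], 0):
            conflicts.append((a, b))
    return conflicts
```

This sort-then-scan shape is why the patch can report each conflict with a single `Count`-driven while loop instead of an O(n²) all-pairs comparison.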
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 27dbdace4252..8150279094e6 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -647,11 +647,10 @@ def GuidValue(CName, PackageList, Inffile = None):
if not Inffile.startswith(P.MetaFile.Dir):
GuidKeys = [x for x in P.Guids if x not in P._PrivateGuids]
if CName in GuidKeys:
return P.Guids[CName]
return None
- return None
## A string template class
#
# This class implements a template for string replacement. A string template
# looks like following
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index 02735e165ca1..d35cd792704c 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -9,11 +9,10 @@
# Import Modules
#
import Common.LongFilePathOs as os
from Common.LongFilePathSupport import OpenLongFilePath as open
import sys
-import re
from optparse import OptionParser
from optparse import make_option
from Common.BuildToolError import *
import Common.EdkLogger as EdkLogger
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index dd5c3c2bd1f2..37976d067ed9 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1371,15 +1371,15 @@ class DscBuildData(PlatformBuildClassObject):
if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG],
self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC],
self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
- self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
- self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
+ self._Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
+ self._Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC],
self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
- self.Pcds[Name, Guid].SkuInfoList = {TAB_DEFAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '', '', NoFiledValues[( Guid, Name)][0])}
+ self._Pcds[Name, Guid].SkuInfoList = {TAB_DEFAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '', '', NoFiledValues[( Guid, Name)][0])}
return AllPcds
def OverrideByFdfOverAll(self,AllPcds):
if GlobalData.gFdfParser is None:
@@ -1417,12 +1417,12 @@ class DscBuildData(PlatformBuildClassObject):
if PcdInDec:
PcdInDec.PcdValueFromFdf = Value
if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
- self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
- self.Pcds[Name, Guid].DefaultValue = Value
+ self._Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
+ self._Pcds[Name, Guid].DefaultValue = Value
return AllPcds
def ParsePcdNameStruct(self,NamePart1,NamePart2):
TokenSpaceCName = PcdCName = DimensionAttr = Field = ""
if "." in NamePart1:
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index da35391d3aff..e63246b03b6e 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -152,10 +152,17 @@ class InfBuildData(ModuleBuildClassObject):
self._GuidsUsedByPcd = OrderedDict()
self._GuidComments = None
self._PcdComments = None
self._BuildOptions = None
self._DependencyFileList = None
+ self.LibInstances = []
+ self.ReferenceModules = set()
+ self.Guids
+ self.Pcds
+ def SetReferenceModule(self,Module):
+ self.ReferenceModules.add(Module)
+ return self
## XXX[key] = value
def __setitem__(self, key, value):
self.__dict__[self._PROPERTY_[key]] = value
@@ -703,10 +710,29 @@ class InfBuildData(ModuleBuildClassObject):
RetVal.update(self._GetPcd(MODEL_PCD_DYNAMIC))
RetVal.update(self._GetPcd(MODEL_PCD_DYNAMIC_EX))
return RetVal
@cached_property
+ def ModulePcdList(self):
+ RetVal = self.Pcds
+ return RetVal
+ @cached_property
+ def LibraryPcdList(self):
+ if bool(self.LibraryClass):
+ return []
+ RetVal = {}
+ Pcds = set()
+ for Library in self.LibInstances:
+ PcdsInLibrary = OrderedDict()
+ for Key in Library.Pcds:
+ if Key in self.Pcds or Key in Pcds:
+ continue
+ Pcds.add(Key)
+ PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
+ RetVal[Library] = PcdsInLibrary
+ return RetVal
+ @cached_property
def PcdsName(self):
PcdsName = set()
for Type in (MODEL_PCD_FIXED_AT_BUILD,MODEL_PCD_PATCHABLE_IN_MODULE,MODEL_PCD_FEATURE_FLAG,MODEL_PCD_DYNAMIC,MODEL_PCD_DYNAMIC_EX):
RecordList = self._RawData[Type, self._Arch, self._Platform]
for TokenSpaceGuid, PcdCName, _, _, _, _, _ in RecordList:
@@ -1028,5 +1054,8 @@ class InfBuildData(ModuleBuildClassObject):
@property
def IsBinaryModule(self):
if (self.Binaries and not self.Sources) or GlobalData.gIgnoreSource:
return True
return False
+def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
+ for Key in CopyFromDict:
+ CopyToDict[Key].extend(CopyFromDict[Key])
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index 41ae684d3ee9..76583f46e500 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -86,10 +86,12 @@ def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalP
#
def GetLiabraryInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain):
return GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain)
def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain, FileName = '', EdkLogger = None):
+ if Module.LibInstances:
+ return Module.LibInstances
ModuleType = Module.ModuleType
# add forced library instances (specified under LibraryClasses sections)
#
# If a module has a MODULE_TYPE of USER_DEFINED,
@@ -244,6 +246,8 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
#
# Build the list of constructor and destructor names
# The DAG Topo sort produces the destructor order, so the list of constructors must generated in the reverse order
#
SortedLibraryList.reverse()
+ Module.LibInstances = SortedLibraryList
+ SortedLibraryList = [lib.SetReferenceModule(Module) for lib in SortedLibraryList]
return SortedLibraryList
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
index 28a975f54e51..ab7b4506c1c1 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
@@ -60,10 +60,12 @@ class WorkspaceDatabase(object):
MODEL_FILE_DEC : DecBuildData,
MODEL_FILE_DSC : DscBuildData,
}
_CACHE_ = {} # (FilePath, Arch) : <object>
+ def GetCache(self):
+ return self._CACHE_
# constructor
def __init__(self, WorkspaceDb):
self.WorkspaceDb = WorkspaceDb
@@ -201,10 +203,11 @@ class WorkspaceDatabase(object):
Platform = self.BuildObject[PathClass(Dscfile), TAB_COMMON]
if Platform is None:
EdkLogger.error('build', PARSER_ERROR, "Failed to parser DSC file: %s" % Dscfile)
return Platform
+BuildDB = WorkspaceDatabase()
##
#
# This acts like the main() function for the script, unless it is 'import'ed into another
# script.
#
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index a3eb3b2383e4..a54c7f4ca547 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -32,11 +32,11 @@ from Common.BuildToolError import CODE_ERROR
from Common.BuildToolError import COMMAND_FAILURE
from Common.BuildToolError import FORMAT_INVALID
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
import Common.GlobalData as GlobalData
-from AutoGen.AutoGen import ModuleAutoGen
+from AutoGen.ModuleAutoGen import ModuleAutoGen
from Common.Misc import PathClass
from Common.StringUtils import NormPath
from Common.DataType import *
import collections
from Common.Expression import *
@@ -2138,11 +2138,11 @@ class PlatformReport(object):
if GlobalData.gFdfParser is not None:
if Pa.Arch in GlobalData.gFdfParser.Profile.InfDict:
INFList = GlobalData.gFdfParser.Profile.InfDict[Pa.Arch]
for InfName in INFList:
InfClass = PathClass(NormPath(InfName), Wa.WorkspaceDir, Pa.Arch)
- Ma = ModuleAutoGen(Wa, InfClass, Pa.BuildTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile)
+ Ma = ModuleAutoGen(Wa, InfClass, Pa.BuildTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile,Pa.DataPile)
if Ma is None:
continue
if Ma not in ModuleAutoGenList:
ModuleAutoGenList.append(Ma)
for MGen in ModuleAutoGenList:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 07693b97359e..3d083f4eaade 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -10,46 +10,49 @@
##
# Import Modules
#
from __future__ import print_function
-import Common.LongFilePathOs as os
-import re
+from __future__ import absolute_import
+import os.path as path
import sys
+import os
+import re
import glob
import time
import platform
import traceback
-import encodings.ascii
import multiprocessing
-
-from struct import *
-from threading import *
+from threading import Thread,Event,BoundedSemaphore
import threading
+from subprocess import Popen,PIPE
+from collections import OrderedDict, defaultdict
from optparse import OptionParser
-from subprocess import *
+from AutoGen.PlatformAutoGen import PlatformAutoGen
+from AutoGen.ModuleAutoGen import ModuleAutoGen
+from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen
+from AutoGen import GenMake
from Common import Misc as Utils
-from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.TargetTxtClassObject import TargetTxt
from Common.ToolDefClassObject import ToolDef
+from Common.Misc import PathClass,SaveFileOnChange,RemoveDirectory
+from Common.StringUtils import NormPath
+from Common.MultipleWorkspace import MultipleWorkspace as mws
+from Common.BuildToolError import *
from Common.DataType import *
+import Common.EdkLogger as EdkLogger
from Common.BuildVersion import gBUILD_VERSION
-from AutoGen.AutoGen import *
-from Common.BuildToolError import *
-from Workspace.WorkspaceDatabase import WorkspaceDatabase
-from Common.MultipleWorkspace import MultipleWorkspace as mws
+from Workspace.WorkspaceDatabase import BuildDB
from BuildReport import BuildReport
-from GenPatchPcdTable.GenPatchPcdTable import *
-from PatchPcdValue.PatchPcdValue import *
+from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdInfoFromMapFile
+from PatchPcdValue.PatchPcdValue import PatchBinaryFile
-import Common.EdkLogger
import Common.GlobalData as GlobalData
from GenFds.GenFds import GenFds, GenFdsApi
-from collections import OrderedDict, defaultdict
# Version and Copyright
VersionNumber = "0.60" + ' ' + gBUILD_VERSION
__version__ = "%prog Version " + VersionNumber
__copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation All rights reserved."
@@ -773,11 +776,11 @@ class Build():
ConfDirectoryPath = mws.join(self.WorkspaceDir, 'Conf')
GlobalData.gConfDirectory = ConfDirectoryPath
GlobalData.gDatabasePath = os.path.normpath(os.path.join(ConfDirectoryPath, GlobalData.gDatabasePath))
if not os.path.exists(os.path.join(GlobalData.gConfDirectory, '.cache')):
os.makedirs(os.path.join(GlobalData.gConfDirectory, '.cache'))
- self.Db = WorkspaceDatabase()
+ self.Db = BuildDB
self.BuildDatabase = self.Db.BuildObject
self.Platform = None
self.ToolChainFamily = None
self.LoadFixAddress = 0
self.UniFlag = BuildOptions.Flag
@@ -1698,17 +1701,21 @@ class Build():
CmdListDict = {}
if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
CmdListDict = self._GenFfsCmd(Wa.ArchList)
for Arch in Wa.ArchList:
+ PcdMaList = []
GlobalData.gGlobalDefines['ARCH'] = Arch
Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
for Module in Pa.Platform.Modules:
# Get ModuleAutoGen object to generate C code file and makefile
- Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
+ Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
if Ma is None:
continue
+ if Ma.PcdIsDriver:
+ Ma.PlatformInfo = Pa
+ PcdMaList.append(Ma)
self.BuildModules.append(Ma)
self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict)
# Create MAP file when Load Fix Address is enabled.
if self.Target in ["", "all", "fds"]:
@@ -1800,11 +1807,11 @@ class Build():
AutoGenStart = time.time()
GlobalData.gGlobalDefines['ARCH'] = Arch
Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
for Module in Pa.Platform.Modules:
if self.ModuleFile.Dir == Module.Dir and self.ModuleFile.Name == Module.Name:
- Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
+ Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
if Ma is None:
continue
MaList.append(Ma)
if Ma.CanSkipbyHash():
self.HashSkipModules.append(Ma)
@@ -1980,10 +1987,11 @@ class Build():
# multi-thread exit flag
ExitFlag = threading.Event()
ExitFlag.clear()
self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
for Arch in Wa.ArchList:
+ PcdMaList = []
AutoGenStart = time.time()
GlobalData.gGlobalDefines['ARCH'] = Arch
Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
if Pa is None:
continue
@@ -1997,14 +2005,17 @@ class Build():
if Inf in Pa.Platform.Modules:
continue
ModuleList.append(Inf)
for Module in ModuleList:
# Get ModuleAutoGen object to generate C code file and makefile
- Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
+ Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
if Ma is None:
continue
+ if Ma.PcdIsDriver:
+ Ma.PlatformInfo = Pa
+ PcdMaList.append(Ma)
if Ma.CanSkipbyHash():
self.HashSkipModules.append(Ma)
if GlobalData.gBinCacheSource:
EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
continue
--
2.20.1.windows.1
* [Patch 05/11] BaseTools: Enable Multiple Process AutoGen
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
` (3 preceding siblings ...)
2019-07-29 8:44 ` [Patch 04/11] BaseTools: Decouple AutoGen Objects Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 8:44 ` [Patch 06/11] BaseTools: Add shared data for processes Bob Feng
` (6 subsequent siblings)
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
Assign the Module AutoGen tasks to multiple
sub-processes.
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
.../Source/Python/AutoGen/AutoGenWorker.py | 162 ++++++++++++++++++
BaseTools/Source/Python/AutoGen/DataPipe.py | 6 +
BaseTools/Source/Python/AutoGen/GenC.py | 4 +-
.../Source/Python/AutoGen/ModuleAutoGen.py | 7 +-
.../Source/Python/AutoGen/PlatformAutoGen.py | 4 +-
BaseTools/Source/Python/build/build.py | 114 +++++++-----
6 files changed, 243 insertions(+), 54 deletions(-)
create mode 100644 BaseTools/Source/Python/AutoGen/AutoGenWorker.py
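Conceptually, the worker wiring this patch adds (AutoGenWorkerInProcess draining a module queue, with a feedback queue for errors) follows the standard multiprocessing task-queue pattern. A minimal sketch — every name below is illustrative, not the patch's actual API:

```python
import multiprocessing as mp
from queue import Empty

def worker(task_q, feedback_q):
    """Drain the shared task queue; each task stands in for one module's AutoGen."""
    try:
        while True:
            module = task_q.get_nowait()
            # ... CreateCodeFile()/CreateMakeFile() for `module` would run here ...
    except Empty:
        pass                      # queue drained: normal exit
    except Exception as e:
        feedback_q.put(str(e))    # report the failure instead of dying silently

def start_workers(count, task_q, feedback_q):
    """In the patch, each worker is an mp.Process fed from one shared queue."""
    procs = [mp.Process(target=worker, args=(task_q, feedback_q))
             for _ in range(count)]
    for p in procs:
        p.start()
    return procs
```

Because every worker pulls from the same queue, module AutoGen tasks are load-balanced automatically: a fast worker simply takes the next item rather than being assigned a fixed share up front.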
diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
new file mode 100644
index 000000000000..edec346abd06
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -0,0 +1,162 @@
+## @file
+# Create makefile for MS nmake and GNU make
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+from __future__ import absolute_import
+import multiprocessing as mp
+import threading
+from Common.Misc import PathClass
+from AutoGen.ModuleAutoGen import ModuleAutoGen
+from AutoGen.ModuleAutoGenHelper import WorkSpaceInfo,AutoGenInfo
+import Common.GlobalData as GlobalData
+import Common.EdkLogger as EdkLogger
+import os
+from Common.MultipleWorkspace import MultipleWorkspace as mws
+from AutoGen.AutoGen import AutoGen
+from Workspace.WorkspaceDatabase import BuildDB
+import time
+from queue import Empty
+import traceback
+import sys
+from AutoGen.DataPipe import MemoryDataPipe
+class AutoGenManager(threading.Thread):
+ def __init__(self,autogen_workers, feedback_q):
+ super(AutoGenManager,self).__init__()
+ self.autogen_workers = autogen_workers
+ self.feedback_q = feedback_q
+ self.terminate = False
+ self.Status = True
+ def run(self):
+ try:
+ while True:
+ if self.terminate:
+ break
+ if self.feedback_q.empty():
+ time.sleep(1)
+ continue
+ badnews = self.feedback_q.get(False)
+ if badnews:
+ print(badnews)
+ self.Status = False
+ self.TerminateWorkers()
+ break
+ except Exception:
+ return
+
+ def kill(self):
+ self.terminate = True
+
+ def TerminateWorkers(self):
+ for w in self.autogen_workers:
+ if w.is_alive():
+ w.terminate()
+
+class AutoGenWorkerInProcess(mp.Process):
+ def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock):
+ mp.Process.__init__(self)
+ self.module_queue = module_queue
+ self.data_pipe_file_path =data_pipe_file_path
+ self.data_pipe = None
+ self.feedback_q = feedback_q
+ self.PlatformMetaFileSet = {}
+ self.file_lock = file_lock
+ def GetPlatformMetaFile(self,filepath,root):
+ try:
+ return self.PlatformMetaFileSet[(filepath,root)]
+ except:
+ self.PlatformMetaFileSet[(filepath,root)] = filepath
+ return self.PlatformMetaFileSet[(filepath,root)]
+ def run(self):
+ try:
+ taskname = "Init"
+ with self.file_lock:
+ if not os.path.exists(self.data_pipe_file_path):
+ self.feedback_q.put(taskname + ":" + "load data pipe %s failed." % self.data_pipe_file_path)
+ self.data_pipe = MemoryDataPipe()
+ self.data_pipe.load(self.data_pipe_file_path)
+ EdkLogger.Initialize()
+ loglevel = self.data_pipe.Get("LogLevel")
+ if not loglevel:
+ loglevel = EdkLogger.INFO
+ EdkLogger.SetLevel(loglevel)
+ logfile = self.data_pipe.Get("LogFile")
+ if logfile:
+ EdkLogger.SetLogFile(logfile)
+ target = self.data_pipe.Get("P_Info").get("Target")
+ toolchain = self.data_pipe.Get("P_Info").get("ToolChain")
+ archlist = self.data_pipe.Get("P_Info").get("ArchList")
+
+ active_p = self.data_pipe.Get("P_Info").get("ActivePlatform")
+ workspacedir = self.data_pipe.Get("P_Info").get("WorkspaceDir")
+ PackagesPath = os.getenv("PACKAGES_PATH")
+ mws.setWs(workspacedir, PackagesPath)
+ self.Wa = WorkSpaceInfo(
+ workspacedir,active_p,target,toolchain,archlist
+ )
+ self.Wa._SrcTimeStamp = self.data_pipe.Get("Workspace_timestamp")
+ GlobalData.gGlobalDefines = self.data_pipe.Get("G_defines")
+ GlobalData.gCommandLineDefines = self.data_pipe.Get("CL_defines")
+ os.environ._data = self.data_pipe.Get("Env_Var")
+ GlobalData.gWorkspace = workspacedir
+ GlobalData.gDisableIncludePathCheck = False
+ GlobalData.gFdfParser = self.data_pipe.Get("FdfParser")
+ GlobalData.gDatabasePath = self.data_pipe.Get("DatabasePath")
+ module_count = 0
+ FfsCmd = self.data_pipe.Get("FfsCommand")
+ if FfsCmd is None:
+ FfsCmd = {}
+ PlatformMetaFile = self.GetPlatformMetaFile(self.data_pipe.Get("P_Info").get("ActivePlatform"),
+ self.data_pipe.Get("P_Info").get("WorkspaceDir"))
+ while not self.module_queue.empty():
+ module_count += 1
+ module_file,module_root,module_path,module_basename,module_originalpath,module_arch,IsLib = self.module_queue.get()
+ modulefullpath = os.path.join(module_root,module_file)
+ taskname = " : ".join((modulefullpath,module_arch))
+ module_metafile = PathClass(module_file,module_root)
+ if module_path:
+ module_metafile.Path = module_path
+ if module_basename:
+ module_metafile.BaseName = module_basename
+ if module_originalpath:
+ module_metafile.OriginalPath = PathClass(module_originalpath,module_root)
+ arch = module_arch
+ target = self.data_pipe.Get("P_Info").get("Target")
+ toolchain = self.data_pipe.Get("P_Info").get("ToolChain")
+ Ma = ModuleAutoGen(self.Wa,module_metafile,target,toolchain,arch,PlatformMetaFile,self.data_pipe)
+ Ma.IsLibrary = IsLib
+ Ma.CreateCodeFile()
+ Ma.CreateMakeFile(GenFfsList=FfsCmd.get((Ma.MetaFile.File, Ma.Arch),[]))
+ Ma.CreateAsBuiltInf()
+ except Empty:
+ pass
+ except:
+ traceback.print_exc(file=sys.stdout)
+ self.feedback_q.put(taskname)
+
+ def printStatus(self):
+ print("Processs ID: %d Run %d modules in AutoGen " % (os.getpid(),len(AutoGen.Cache())))
+ print("Processs ID: %d Run %d modules in AutoGenInfo " % (os.getpid(),len(AutoGenInfo.GetCache())))
+ groupobj = {}
+ for buildobj in BuildDB.BuildObject.GetCache().values():
+ if str(buildobj).lower().endswith("dec"):
+ try:
+ groupobj['dec'].append(str(buildobj))
+ except:
+ groupobj['dec'] = [str(buildobj)]
+ if str(buildobj).lower().endswith("dsc"):
+ try:
+ groupobj['dsc'].append(str(buildobj))
+ except:
+ groupobj['dsc'] = [str(buildobj)]
+
+ if str(buildobj).lower().endswith("inf"):
+ try:
+ groupobj['inf'].append(str(buildobj))
+ except:
+ groupobj['inf'] = [str(buildobj)]
+
+ print("Processs ID: %d Run %d pkg in WDB " % (os.getpid(),len(groupobj.get("dec",[]))))
+ print("Processs ID: %d Run %d pla in WDB " % (os.getpid(),len(groupobj.get("dsc",[]))))
+ print("Processs ID: %d Run %d inf in WDB " % (os.getpid(),len(groupobj.get("inf",[]))))
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
index 5bcc39bd380d..9478f41d481b 100644
--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -9,10 +9,11 @@ from Workspace.WorkspaceDatabase import BuildDB
from Workspace.WorkspaceCommon import GetModuleLibInstances
import Common.GlobalData as GlobalData
import os
import pickle
from pickle import HIGHEST_PROTOCOL
+from Common import EdkLogger
class PCD_DATA():
def __init__(self,TokenCName,TokenSpaceGuidCName,Type,DatumType,SkuInfoList,DefaultValue,
MaxDatumSize,UserDefinedDefaultStoresFlag,validateranges,
validlists,expressions,CustomAttribute,TokenValue):
@@ -32,17 +33,19 @@ class PCD_DATA():
class DataPipe(object):
def __init__(self, BuildDir=None):
self.data_container = {}
self.BuildDir = BuildDir
+ self.dump_file = ""
class MemoryDataPipe(DataPipe):
def Get(self,key):
return self.data_container.get(key)
def dump(self,file_path):
+ self.dump_file = file_path
with open(file_path,'wb') as fd:
pickle.dump(self.data_container,fd,pickle.HIGHEST_PROTOCOL)
def load(self,file_path):
with open(file_path,'rb') as fd:
@@ -141,7 +144,10 @@ class MemoryDataPipe(DataPipe):
self.DataContainer = {"PackageList": [(dec.MetaFile,dec.Arch) for dec in PlatformInfo.PackageList]}
self.DataContainer = {"GuidDict": PlatformInfo.Platform._GuidDict}
+ self.DataContainer = {"DatabasePath":GlobalData.gDatabasePath}
self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
+ self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
+ self.DataContainer = {"LogFile": GlobalData.gOptions.LogFile if GlobalData.gOptions.LogFile is not None else ""}
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 4c3f4e3e55ae..910c8fe3706c 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1470,12 +1470,12 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
'UefiSpecVersion': UefiSpecVersion + 'U'
}
if Info.ModuleType in [SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE]:
if Info.SourceFileList:
- if NumEntryPoints != 1:
- EdkLogger.error(
+ if NumEntryPoints != 1:
+ EdkLogger.error(
"build",
AUTOGEN_ERROR,
'%s must have exactly one entry point' % Info.ModuleType,
File=str(Info),
ExtraData= ", ".join(Info.Module.ModuleEntryPointList)
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index f0a4afc3a664..36bbaffa56d3 100644
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -1681,13 +1681,11 @@ class ModuleAutoGen(AutoGen):
if self.IsBinaryModule:
return
self.GenFfsList = GenFfsList
- if not self.IsLibrary and CreateLibraryMakeFile:
- for LibraryAutoGen in self.LibraryAutoGenList:
- LibraryAutoGen.CreateMakeFile()
+
# Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
if not GlobalData.gUseHashCache and self.CanSkip():
return
if len(self.CustomMakefile) == 0:
@@ -1724,13 +1722,10 @@ class ModuleAutoGen(AutoGen):
if self.IsBinaryModule:
if self.IsLibrary:
self.CopyBinaryFiles()
return
- if not self.IsLibrary and CreateLibraryCodeFile:
- for LibraryAutoGen in self.LibraryAutoGenList:
- LibraryAutoGen.CreateCodeFile()
# Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
if not GlobalData.gUseHashCache and self.CanSkip():
return
diff --git a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
index 6360d4cbd86b..9885c6a3a3bf 100644
--- a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
@@ -1081,14 +1081,14 @@ class PlatformAutoGen(AutoGen):
def GetAllModuleInfo(self,WithoutPcd=True):
ModuleLibs = set()
for m in self.Platform.Modules:
module_obj = self.BuildDatabase[m,self.Arch,self.BuildTarget,self.ToolChain]
Libs = GetModuleLibInstances(module_obj, self.Platform, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain)
- ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,l.Arch,True) for l in Libs]))
+ ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,l.MetaFile.Path,l.MetaFile.BaseName,l.MetaFile.OriginalPath,l.Arch,True) for l in Libs]))
if WithoutPcd and module_obj.PcdIsDriver:
continue
- ModuleLibs.add((m.File,m.Root,module_obj.Arch,False))
+ ModuleLibs.add((m.File,m.Root,m.Path,m.BaseName,m.OriginalPath,module_obj.Arch,bool(module_obj.LibraryClass)))
return ModuleLibs
## Resolve the library classes in a module to library instances
#
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 3d083f4eaade..590584b3c67f 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -28,10 +28,11 @@ from subprocess import Popen,PIPE
from collections import OrderedDict, defaultdict
from optparse import OptionParser
from AutoGen.PlatformAutoGen import PlatformAutoGen
from AutoGen.ModuleAutoGen import ModuleAutoGen
from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen
+from AutoGen.AutoGenWorker import AutoGenWorkerInProcess,AutoGenManager
from AutoGen import GenMake
from Common import Misc as Utils
from Common.TargetTxtClassObject import TargetTxt
from Common.ToolDefClassObject import ToolDef
@@ -48,11 +49,11 @@ from BuildReport import BuildReport
from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdInfoFromMapFile
from PatchPcdValue.PatchPcdValue import PatchBinaryFile
import Common.GlobalData as GlobalData
from GenFds.GenFds import GenFds, GenFdsApi
-
+import multiprocessing as mp
# Version and Copyright
VersionNumber = "0.60" + ' ' + gBUILD_VERSION
__version__ = "%prog Version " + VersionNumber
__copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation All rights reserved."
@@ -341,13 +342,13 @@ class ModuleMakeUnit(BuildUnit):
#
# @param self The object pointer
# @param Obj The ModuleAutoGen object the build is working on
# @param Target The build target name, one of gSupportedTarget
#
- def __init__(self, Obj, Target):
- Dependency = [ModuleMakeUnit(La, Target) for La in Obj.LibraryAutoGenList]
- BuildUnit.__init__(self, Obj, Obj.BuildCommand, Target, Dependency, Obj.MakeFileDir)
+ def __init__(self, Obj, BuildCommand,Target):
+ Dependency = [ModuleMakeUnit(La, BuildCommand,Target) for La in Obj.LibraryAutoGenList]
+ BuildUnit.__init__(self, Obj, BuildCommand, Target, Dependency, Obj.MakeFileDir)
if Target in [None, "", "all"]:
self.Target = "tbuild"
## The smallest platform unit that can be built by nmake/make command in multi-thread build mode
#
@@ -362,14 +363,14 @@ class PlatformMakeUnit(BuildUnit):
#
# @param self The object pointer
# @param Obj The PlatformAutoGen object the build is working on
# @param Target The build target name, one of gSupportedTarget
#
- def __init__(self, Obj, Target):
- Dependency = [ModuleMakeUnit(Lib, Target) for Lib in self.BuildObject.LibraryAutoGenList]
- Dependency.extend([ModuleMakeUnit(Mod, Target) for Mod in self.BuildObject.ModuleAutoGenList])
- BuildUnit.__init__(self, Obj, Obj.BuildCommand, Target, Dependency, Obj.MakeFileDir)
+ def __init__(self, Obj, BuildCommand, Target):
+ Dependency = [ModuleMakeUnit(Lib, BuildCommand, Target) for Lib in self.BuildObject.LibraryAutoGenList]
+ Dependency.extend([ModuleMakeUnit(Mod, BuildCommand,Target) for Mod in self.BuildObject.ModuleAutoGenList])
+ BuildUnit.__init__(self, Obj, BuildCommand, Target, Dependency, Obj.MakeFileDir)
## The class representing the task of a module build or platform build
#
# This class manages the build tasks in multi-thread build mode. Its jobs include
# scheduling thread running, catching thread error, monitor the thread status, etc.
@@ -822,12 +823,35 @@ class Build():
self.TargetTxt = TargetTxt
self.ToolDef = ToolDef
if not (self.LaunchPrebuildFlag and os.path.exists(self.PlatformBuildPath)):
self.InitBuild()
+ self.AutoGenMgr = None
EdkLogger.info("")
os.chdir(self.WorkspaceDir)
+ def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList):
+ if SkipAutoGen:
+ return
+ feedback_q = mp.Queue()
+ file_lock = mp.Lock()
+ auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock) for _ in range(self.ThreadNumber)]
+ self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q)
+ self.AutoGenMgr.start()
+ for w in auto_workers:
+ w.start()
+ if PcdMaList is not None:
+ for PcdMa in PcdMaList:
+ PcdMa.CreateCodeFile(True)
+ PcdMa.CreateMakeFile(GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.File, PcdMa.Arch),[]))
+ PcdMa.CreateAsBuiltInf()
+ for w in auto_workers:
+ w.join()
+ rt = self.AutoGenMgr.Status
+ self.AutoGenMgr.kill()
+ self.AutoGenMgr.join()
+ self.AutoGenMgr = None
+ return rt
## Load configuration
#
# This method will parse target.txt and get the build configurations.
#
@@ -1188,30 +1212,29 @@ class Build():
# @param CreateDepModuleCodeFile Flag used to indicate creating code
# for dependent modules/Libraries
# @param CreateDepModuleMakeFile Flag used to indicate creating makefile
# for dependent modules/Libraries
#
- def _BuildPa(self, Target, AutoGenObject, CreateDepsCodeFile=True, CreateDepsMakeFile=True, BuildModule=False, FfsCommand={}):
+ def _BuildPa(self, Target, AutoGenObject, CreateDepsCodeFile=True, CreateDepsMakeFile=True, BuildModule=False, FfsCommand=None, PcdMaList=None):
if AutoGenObject is None:
return False
-
+ if FfsCommand is None:
+ FfsCommand = {}
# skip file generation for cleanxxx targets, run and fds target
if Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
# for target which must generate AutoGen code and makefile
- if not self.SkipAutoGen or Target == 'genc':
- self.Progress.Start("Generating code")
- AutoGenObject.CreateCodeFile(CreateDepsCodeFile)
- self.Progress.Stop("done!")
- if Target == "genc":
- return True
+ mqueue = mp.Queue()
+ for m in AutoGenObject.GetAllModuleInfo:
+ mqueue.put(m)
- if not self.SkipAutoGen or Target == 'genmake':
- self.Progress.Start("Generating makefile")
- AutoGenObject.CreateMakeFile(CreateDepsMakeFile, FfsCommand)
- self.Progress.Stop("done!")
- if Target == "genmake":
- return True
+ AutoGenObject.DataPipe.DataContainer = {"FfsCommand":FfsCommand}
+ self.Progress.Start("Generating makefile and code")
+ data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
+ AutoGenObject.DataPipe.dump(data_pipe_file)
+ autogen_rt = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList)
+ self.Progress.Stop("done!")
+ return autogen_rt
else:
# always recreate top/platform makefile when clean, just in case of inconsistency
AutoGenObject.CreateCodeFile(False)
AutoGenObject.CreateMakeFile(False)
@@ -1234,11 +1257,10 @@ class Build():
# build modules
if BuildModule:
BuildCommand = BuildCommand + [Target]
LaunchCommand(BuildCommand, AutoGenObject.MakeFileDir)
- self.CreateAsBuiltInf()
if GlobalData.gBinCacheDest:
self.UpdateBuildCache()
self.BuildModules = []
return True
@@ -1255,11 +1277,10 @@ class Build():
NewBuildCommand = BuildCommand + ['-f', os.path.normpath(os.path.join(Lib, makefile)), 'pbuild']
LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir)
for Mod in AutoGenObject.ModuleBuildDirectoryList:
NewBuildCommand = BuildCommand + ['-f', os.path.normpath(os.path.join(Mod, makefile)), 'pbuild']
LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir)
- self.CreateAsBuiltInf()
if GlobalData.gBinCacheDest:
self.UpdateBuildCache()
self.BuildModules = []
return True
@@ -1713,11 +1734,11 @@ class Build():
continue
if Ma.PcdIsDriver:
Ma.PlatformInfo = Pa
PcdMaList.append(Ma)
self.BuildModules.append(Ma)
- self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict)
+ self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict,PcdMaList=PcdMaList)
# Create MAP file when Load Fix Address is enabled.
if self.Target in ["", "all", "fds"]:
for Arch in Wa.ArchList:
GlobalData.gGlobalDefines['ARCH'] = Arch
@@ -1848,11 +1869,11 @@ class Build():
GlobalData.gModuleBuildTracking[Ma.Arch][Ma] = 'FAIL'
self.AutoGenTime += int(round((time.time() - AutoGenStart)))
MakeStart = time.time()
for Ma in self.BuildModules:
if not Ma.IsBinaryModule:
- Bt = BuildTask.New(ModuleMakeUnit(Ma, self.Target))
+ Bt = BuildTask.New(ModuleMakeUnit(Ma, Pa.BuildCommand,self.Target))
# Break build if any build thread has error
if BuildTask.HasError():
# we need a full version of makefile for platform
ExitFlag.set()
BuildTask.WaitForComplete()
@@ -1978,18 +1999,19 @@ class Build():
self.LoadFixAddress = Wa.Platform.LoadFixAddress
self.BuildReport.AddPlatformReport(Wa)
Wa.CreateMakeFile(False)
# Add ffs build to makefile
- CmdListDict = None
+ CmdListDict = {}
if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
CmdListDict = self._GenFfsCmd(Wa.ArchList)
# multi-thread exit flag
ExitFlag = threading.Event()
ExitFlag.clear()
self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
+ BuildModules = []
for Arch in Wa.ArchList:
PcdMaList = []
AutoGenStart = time.time()
GlobalData.gGlobalDefines['ARCH'] = Arch
Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
@@ -2003,18 +2025,21 @@ class Build():
for InfName in GlobalData.gFdfParser.Profile.InfList:
Inf = PathClass(NormPath(InfName), self.WorkspaceDir, Arch)
if Inf in Pa.Platform.Modules:
continue
ModuleList.append(Inf)
+ Pa.DataPipe.DataContainer = {"FfsCommand":CmdListDict}
+ Pa.DataPipe.DataContainer = {"Workspace_timestamp": Wa._SrcTimeStamp}
for Module in ModuleList:
# Get ModuleAutoGen object to generate C code file and makefile
Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
if Ma is None:
continue
if Ma.PcdIsDriver:
Ma.PlatformInfo = Pa
+ Ma.Workspace = Wa
PcdMaList.append(Ma)
if Ma.CanSkipbyHash():
self.HashSkipModules.append(Ma)
if GlobalData.gBinCacheSource:
EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
@@ -2022,38 +2047,34 @@ class Build():
else:
if GlobalData.gBinCacheSource:
EdkLogger.quiet("cache miss: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
# Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
- if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
# for target which must generate AutoGen code and makefile
- if not self.SkipAutoGen or self.Target == 'genc':
- Ma.CreateCodeFile(True)
- if self.Target == "genc":
- continue
- if not self.SkipAutoGen or self.Target == 'genmake':
- if CmdListDict and self.Fdf and (Module.File, Arch) in CmdListDict:
- Ma.CreateMakeFile(True, CmdListDict[Module.File, Arch])
- del CmdListDict[Module.File, Arch]
- else:
- Ma.CreateMakeFile(True)
- if self.Target == "genmake":
- continue
- self.BuildModules.append(Ma)
+ BuildModules.append(Ma)
# Initialize all modules in tracking to 'FAIL'
if Ma.Arch not in GlobalData.gModuleBuildTracking:
GlobalData.gModuleBuildTracking[Ma.Arch] = dict()
if Ma not in GlobalData.gModuleBuildTracking[Ma.Arch]:
GlobalData.gModuleBuildTracking[Ma.Arch][Ma] = 'FAIL'
+ mqueue = mp.Queue()
+ for m in Pa.GetAllModuleInfo:
+ mqueue.put(m)
+ data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
+ Pa.DataPipe.dump(data_pipe_file)
+ autogen_rt = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList)
self.Progress.Stop("done!")
self.AutoGenTime += int(round((time.time() - AutoGenStart)))
+ if not autogen_rt:
+ return
+ for Arch in Wa.ArchList:
MakeStart = time.time()
- for Ma in self.BuildModules:
+ for Ma in BuildModules:
# Generate build task for the module
if not Ma.IsBinaryModule:
- Bt = BuildTask.New(ModuleMakeUnit(Ma, self.Target))
+ Bt = BuildTask.New(ModuleMakeUnit(Ma, Pa.BuildCommand,self.Target))
# Break build if any build thread has error
if BuildTask.HasError():
# we need a full version of makefile for platform
ExitFlag.set()
BuildTask.WaitForComplete()
@@ -2078,11 +2099,10 @@ class Build():
# All modules have been put in build tasks queue. Tell task scheduler
# to exit if all tasks are completed
#
ExitFlag.set()
BuildTask.WaitForComplete()
- self.CreateAsBuiltInf()
if GlobalData.gBinCacheDest:
self.UpdateBuildCache()
self.BuildModules = []
self.MakeTime += int(round((time.time() - MakeContiue)))
#
@@ -2496,10 +2516,16 @@ def Main():
EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
else:
EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
ReturnCode = FORMAT_INVALID
except KeyboardInterrupt:
+ if MyBuild is not None:
+ if MyBuild.AutoGenMgr:
+ MyBuild.AutoGenMgr.TerminateWorkers()
+ MyBuild.AutoGenMgr.kill()
+ # for multi-thread build exits safely
+ MyBuild.Relinquish()
ReturnCode = ABORT_ERROR
if Option is not None and Option.debug is not None:
EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
except:
if MyBuild is not None:
--
2.20.1.windows.1
* [Patch 06/11] BaseTools: Add shared data for processes
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
Add shared data for the AutoGen worker processes, implemented with a
multiprocessing Manager dictionary that is passed to each worker.
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
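
The shared-data mechanism this patch wires in can be sketched as follows. This is a minimal illustration of the `Manager().dict()` pattern, not the BaseTools code; the `worker` and `run_workers` names are illustrative:

```python
# Sketch: a Manager().dict() gives worker processes a shared, synchronized
# dictionary, the same primitive the patch passes to AutoGenWorkerInProcess
# as share_data.
import multiprocessing as mp
from multiprocessing import Manager

def worker(share_data, idx):
    # Each worker records its result under its own key.
    share_data[idx] = idx * idx

def run_workers(n):
    share_data = Manager().dict()
    procs = [mp.Process(target=worker, args=(share_data, i)) for i in range(n)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    # Copy into a plain dict before the Manager goes away.
    return dict(share_data)

if __name__ == "__main__":
    print(run_workers(3))
```

Unlike a plain dict, writes made in one process are visible to the others, because the Manager holds the dictionary in a server process and proxies access to it.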
---
BaseTools/Source/Python/AutoGen/AutoGenWorker.py | 3 ++-
BaseTools/Source/Python/build/build.py | 10 ++++++----
2 files changed, 8 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index edec346abd06..1419d92b5e39 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -52,18 +52,19 @@ class AutoGenManager(threading.Thread):
for w in self.autogen_workers:
if w.is_alive():
w.terminate()
class AutoGenWorkerInProcess(mp.Process):
- def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock):
+ def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock, share_data):
mp.Process.__init__(self)
self.module_queue = module_queue
self.data_pipe_file_path =data_pipe_file_path
self.data_pipe = None
self.feedback_q = feedback_q
self.PlatformMetaFileSet = {}
self.file_lock = file_lock
+ self.share_data = share_data
def GetPlatformMetaFile(self,filepath,root):
try:
return self.PlatformMetaFileSet[(filepath,root)]
except:
self.PlatformMetaFileSet[(filepath,root)] = filepath
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 590584b3c67f..d49554ec0282 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -50,10 +50,11 @@ from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdInfoFromMapFi
from PatchPcdValue.PatchPcdValue import PatchBinaryFile
import Common.GlobalData as GlobalData
from GenFds.GenFds import GenFds, GenFdsApi
import multiprocessing as mp
+from multiprocessing import Manager
# Version and Copyright
VersionNumber = "0.60" + ' ' + gBUILD_VERSION
__version__ = "%prog Version " + VersionNumber
__copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation All rights reserved."
@@ -826,16 +827,17 @@ class Build():
self.InitBuild()
self.AutoGenMgr = None
EdkLogger.info("")
os.chdir(self.WorkspaceDir)
- def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList):
+ self.share_data = Manager().dict()
+ def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,share_data):
if SkipAutoGen:
return
feedback_q = mp.Queue()
file_lock = mp.Lock()
- auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock) for _ in range(self.ThreadNumber)]
+ auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data) for _ in range(self.ThreadNumber)]
self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q)
self.AutoGenMgr.start()
for w in auto_workers:
w.start()
if PcdMaList is not None:
@@ -1228,11 +1230,11 @@ class Build():
AutoGenObject.DataPipe.DataContainer = {"FfsCommand":FfsCommand}
self.Progress.Start("Generating makefile and code")
data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
AutoGenObject.DataPipe.dump(data_pipe_file)
- autogen_rt = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList)
+ autogen_rt = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
self.Progress.Stop("done!")
return autogen_rt
else:
# always recreate top/platform makefile when clean, just in case of inconsistency
AutoGenObject.CreateCodeFile(False)
@@ -2060,11 +2062,11 @@ class Build():
mqueue = mp.Queue()
for m in Pa.GetAllModuleInfo:
mqueue.put(m)
data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
Pa.DataPipe.dump(data_pipe_file)
- autogen_rt = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList)
+ autogen_rt = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
self.Progress.Stop("done!")
self.AutoGenTime += int(round((time.time() - AutoGenStart)))
if not autogen_rt:
return
for Arch in Wa.ArchList:
--
2.20.1.windows.1
* [Patch 07/11] BaseTools: Add LogAgent to support multiple process Autogen
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
The AutoGen processes race for the log file. To resolve this issue,
this patch creates a LogAgent thread in the main process to write
the log content to the console or file; the other processes send
their log content to the LogAgent through a queue.
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
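
The single-writer pattern the patch implements can be sketched in a few lines. This is a hedged illustration of the `QueueHandler`-plus-agent idea, not the BaseTools code; `log_agent` and the `records` list are illustrative stand-ins for the real LogAgent's console and file handlers:

```python
# Sketch: producers attach a logging.handlers.QueueHandler so records go
# into a queue instead of a file; one agent thread drains the queue and is
# the only writer, so concurrent processes cannot corrupt the log.
import logging
import logging.handlers
import queue
import threading

def log_agent(log_q, records):
    # Drain records until the sentinel (None) arrives, like LogAgent.kill().
    while True:
        rec = log_q.get()
        if rec is None:
            break
        records.append(rec.getMessage())

log_q = queue.Queue()
logger = logging.getLogger("tool_info")
logger.setLevel(logging.INFO)
logger.addHandler(logging.handlers.QueueHandler(log_q))

records = []
agent = threading.Thread(target=log_agent, args=(log_q, records))
agent.start()
logger.info("AutoGen worker message")
log_q.put(None)   # sentinel shuts the agent down
agent.join()
```

In the patch itself the queue is a `multiprocessing.Queue` shared with the worker processes, and the agent dispatches on the record's logger name (`tool_debug`, `tool_info`, `tool_error`) to pick the right handler.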
---
.../Source/Python/AutoGen/AutoGenWorker.py | 86 +++++++++++++++----
BaseTools/Source/Python/AutoGen/DataPipe.py | 2 +-
BaseTools/Source/Python/Common/EdkLogger.py | 33 ++++++-
BaseTools/Source/Python/build/build.py | 27 ++++--
4 files changed, 120 insertions(+), 28 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index 1419d92b5e39..f5bc705fbd04 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -19,52 +19,111 @@ from Workspace.WorkspaceDatabase import BuildDB
import time
from queue import Empty
import traceback
import sys
from AutoGen.DataPipe import MemoryDataPipe
+import logging
+
+class LogAgent(threading.Thread):
+ def __init__(self,log_q,log_level,log_file=None):
+ super(LogAgent,self).__init__()
+ self.log_q = log_q
+ self.log_level = log_level
+ self.log_file = log_file
+ def InitLogger(self):
+ # For DEBUG level (All DEBUG_0~9 are applicable)
+ self._DebugLogger_agent = logging.getLogger("tool_debug_agent")
+ _DebugFormatter = logging.Formatter("[%(asctime)s.%(msecs)d]: %(message)s", datefmt="%H:%M:%S")
+ self._DebugLogger_agent.setLevel(self.log_level)
+ _DebugChannel = logging.StreamHandler(sys.stdout)
+ _DebugChannel.setFormatter(_DebugFormatter)
+ self._DebugLogger_agent.addHandler(_DebugChannel)
+
+ # For VERBOSE, INFO, WARN level
+ self._InfoLogger_agent = logging.getLogger("tool_info_agent")
+ _InfoFormatter = logging.Formatter("%(message)s")
+ self._InfoLogger_agent.setLevel(self.log_level)
+ _InfoChannel = logging.StreamHandler(sys.stdout)
+ _InfoChannel.setFormatter(_InfoFormatter)
+ self._InfoLogger_agent.addHandler(_InfoChannel)
+
+ # For ERROR level
+ self._ErrorLogger_agent = logging.getLogger("tool_error_agent")
+ _ErrorFormatter = logging.Formatter("%(message)s")
+ self._ErrorLogger_agent.setLevel(self.log_level)
+ _ErrorCh = logging.StreamHandler(sys.stderr)
+ _ErrorCh.setFormatter(_ErrorFormatter)
+ self._ErrorLogger_agent.addHandler(_ErrorCh)
+
+ if self.log_file:
+ if os.path.exists(self.log_file):
+ os.remove(self.log_file)
+ _Ch = logging.FileHandler(self.log_file)
+ _Ch.setFormatter(_DebugFormatter)
+ self._DebugLogger_agent.addHandler(_Ch)
+
+ _Ch= logging.FileHandler(self.log_file)
+ _Ch.setFormatter(_InfoFormatter)
+ self._InfoLogger_agent.addHandler(_Ch)
+
+ _Ch = logging.FileHandler(self.log_file)
+ _Ch.setFormatter(_ErrorFormatter)
+ self._ErrorLogger_agent.addHandler(_Ch)
+
+ def run(self):
+ self.InitLogger()
+ while True:
+ log_message = self.log_q.get()
+ if log_message is None:
+ break
+ if log_message.name == "tool_error":
+ self._ErrorLogger_agent.log(log_message.levelno,log_message.getMessage())
+ elif log_message.name == "tool_info":
+ self._InfoLogger_agent.log(log_message.levelno,log_message.getMessage())
+ elif log_message.name == "tool_debug":
+ self._DebugLogger_agent.log(log_message.levelno,log_message.getMessage())
+ else:
+ self._InfoLogger_agent.log(log_message.levelno,log_message.getMessage())
+
+ def kill(self):
+ self.log_q.put(None)
class AutoGenManager(threading.Thread):
def __init__(self,autogen_workers, feedback_q):
super(AutoGenManager,self).__init__()
self.autogen_workers = autogen_workers
self.feedback_q = feedback_q
- self.terminate = False
self.Status = True
def run(self):
try:
while True:
- if self.terminate:
- break
- if self.feedback_q.empty():
- time.sleep(1)
- continue
- badnews = self.feedback_q.get(False)
- if badnews:
- print(badnews)
+ badnews = self.feedback_q.get()
+ if badnews is None:
self.Status = False
self.TerminateWorkers()
break
except Exception:
return
def kill(self):
- self.terminate = True
+ self.feedback_q.put(None)
def TerminateWorkers(self):
for w in self.autogen_workers:
if w.is_alive():
w.terminate()
class AutoGenWorkerInProcess(mp.Process):
- def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock, share_data):
+ def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock, share_data,log_q):
mp.Process.__init__(self)
self.module_queue = module_queue
self.data_pipe_file_path =data_pipe_file_path
self.data_pipe = None
self.feedback_q = feedback_q
self.PlatformMetaFileSet = {}
self.file_lock = file_lock
self.share_data = share_data
+ self.log_q = log_q
def GetPlatformMetaFile(self,filepath,root):
try:
return self.PlatformMetaFileSet[(filepath,root)]
except:
self.PlatformMetaFileSet[(filepath,root)] = filepath
@@ -75,18 +134,15 @@ class AutoGenWorkerInProcess(mp.Process):
with self.file_lock:
if not os.path.exists(self.data_pipe_file_path):
self.feedback_q.put(taskname + ":" + "load data pipe %s failed." % self.data_pipe_file_path)
self.data_pipe = MemoryDataPipe()
self.data_pipe.load(self.data_pipe_file_path)
- EdkLogger.Initialize()
+ EdkLogger.LogClientInitialize(self.log_q)
loglevel = self.data_pipe.Get("LogLevel")
if not loglevel:
loglevel = EdkLogger.INFO
EdkLogger.SetLevel(loglevel)
- logfile = self.data_pipe.Get("LogFile")
- if logfile:
- EdkLogger.SetLogFile(logfile)
target = self.data_pipe.Get("P_Info").get("Target")
toolchain = self.data_pipe.Get("P_Info").get("ToolChain")
archlist = self.data_pipe.Get("P_Info").get("ArchList")
active_p = self.data_pipe.Get("P_Info").get("ActivePlatform")
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
index 9478f41d481b..33d2b14c9add 100644
--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -145,9 +145,9 @@ class MemoryDataPipe(DataPipe):
self.DataContainer = {"PackageList": [(dec.MetaFile,dec.Arch) for dec in PlatformInfo.PackageList]}
self.DataContainer = {"GuidDict": PlatformInfo.Platform._GuidDict}
self.DataContainer = {"DatabasePath":GlobalData.gDatabasePath}
+
self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
- self.DataContainer = {"LogFile": GlobalData.gOptions.LogFile if GlobalData.gOptions.LogFile is not None else ""}
diff --git a/BaseTools/Source/Python/Common/EdkLogger.py b/BaseTools/Source/Python/Common/EdkLogger.py
index ae2070bebba3..f6a5e3b4daf9 100644
--- a/BaseTools/Source/Python/Common/EdkLogger.py
+++ b/BaseTools/Source/Python/Common/EdkLogger.py
@@ -8,10 +8,11 @@
## Import modules
from __future__ import absolute_import
import Common.LongFilePathOs as os, sys, logging
import traceback
from .BuildToolError import *
+import logging.handlers
## Log level constants
DEBUG_0 = 1
DEBUG_1 = 2
DEBUG_2 = 3
@@ -198,30 +199,30 @@ def error(ToolName, ErrorCode, Message=None, File=None, Line=None, ExtraData=Non
# Log information which should be always put out
quiet = _ErrorLogger.error
## Initialize log system
-def Initialize():
+def LogClientInitialize(log_q):
#
# Since we use different format to log different levels of message into different
# place (stdout or stderr), we have to use different "Logger" objects to do this.
#
# For DEBUG level (All DEBUG_0~9 are applicable)
_DebugLogger.setLevel(INFO)
- _DebugChannel = logging.StreamHandler(sys.stdout)
+ _DebugChannel = logging.handlers.QueueHandler(log_q)
_DebugChannel.setFormatter(_DebugFormatter)
_DebugLogger.addHandler(_DebugChannel)
# For VERBOSE, INFO, WARN level
_InfoLogger.setLevel(INFO)
- _InfoChannel = logging.StreamHandler(sys.stdout)
+ _InfoChannel = logging.handlers.QueueHandler(log_q)
_InfoChannel.setFormatter(_InfoFormatter)
_InfoLogger.addHandler(_InfoChannel)
# For ERROR level
_ErrorLogger.setLevel(INFO)
- _ErrorCh = logging.StreamHandler(sys.stderr)
+ _ErrorCh = logging.handlers.QueueHandler(log_q)
_ErrorCh.setFormatter(_ErrorFormatter)
_ErrorLogger.addHandler(_ErrorCh)
## Set log level
#
@@ -232,10 +233,34 @@ def SetLevel(Level):
Level = INFO
_DebugLogger.setLevel(Level)
_InfoLogger.setLevel(Level)
_ErrorLogger.setLevel(Level)
+## Initialize log system
+def Initialize():
+ #
+ # Since we use different format to log different levels of message into different
+ # place (stdout or stderr), we have to use different "Logger" objects to do this.
+ #
+ # For DEBUG level (All DEBUG_0~9 are applicable)
+ _DebugLogger.setLevel(INFO)
+ _DebugChannel = logging.StreamHandler(sys.stdout)
+ _DebugChannel.setFormatter(_DebugFormatter)
+ _DebugLogger.addHandler(_DebugChannel)
+
+ # For VERBOSE, INFO, WARN level
+ _InfoLogger.setLevel(INFO)
+ _InfoChannel = logging.StreamHandler(sys.stdout)
+ _InfoChannel.setFormatter(_InfoFormatter)
+ _InfoLogger.addHandler(_InfoChannel)
+
+ # For ERROR level
+ _ErrorLogger.setLevel(INFO)
+ _ErrorCh = logging.StreamHandler(sys.stderr)
+ _ErrorCh.setFormatter(_ErrorFormatter)
+ _ErrorLogger.addHandler(_ErrorCh)
+
def InitializeForUnitTest():
Initialize()
SetLevel(SILENT)
## Get current log level
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index d49554ec0282..74df68350434 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -28,11 +28,12 @@ from subprocess import Popen,PIPE
from collections import OrderedDict, defaultdict
from optparse import OptionParser
from AutoGen.PlatformAutoGen import PlatformAutoGen
from AutoGen.ModuleAutoGen import ModuleAutoGen
from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen
-from AutoGen.AutoGenWorker import AutoGenWorkerInProcess,AutoGenManager
+from AutoGen.AutoGenWorker import AutoGenWorkerInProcess,AutoGenManager,\
+ LogAgent
from AutoGen import GenMake
from Common import Misc as Utils
from Common.TargetTxtClassObject import TargetTxt
from Common.ToolDefClassObject import ToolDef
@@ -697,11 +698,11 @@ class Build():
#
# @param Target The build command target, one of gSupportedTarget
# @param WorkspaceDir The directory of workspace
# @param BuildOptions Build options passed from command line
#
- def __init__(self, Target, WorkspaceDir, BuildOptions):
+ def __init__(self, Target, WorkspaceDir, BuildOptions,log_q):
self.WorkspaceDir = WorkspaceDir
self.Target = Target
self.PlatformFile = BuildOptions.PlatformFile
self.ModuleFile = BuildOptions.ModuleFile
self.ArchList = BuildOptions.TargetArch
@@ -828,16 +829,17 @@ class Build():
self.AutoGenMgr = None
EdkLogger.info("")
os.chdir(self.WorkspaceDir)
self.share_data = Manager().dict()
+ self.log_q = log_q
def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,share_data):
if SkipAutoGen:
return
feedback_q = mp.Queue()
file_lock = mp.Lock()
- auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data) for _ in range(self.ThreadNumber)]
+ auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data,self.log_q) for _ in range(self.ThreadNumber)]
self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q)
self.AutoGenMgr.start()
for w in auto_workers:
w.start()
if PcdMaList is not None:
@@ -2393,35 +2395,42 @@ def MyOptionParser():
# @retval 1 Tool failed
#
def Main():
StartTime = time.time()
+ #
+ # Create a log Queue
+ #
+ LogQ = mp.Queue()
# Initialize log system
- EdkLogger.Initialize()
+ EdkLogger.LogClientInitialize(LogQ)
GlobalData.gCommand = sys.argv[1:]
#
# Parse the options and args
#
(Option, Target) = MyOptionParser()
GlobalData.gOptions = Option
GlobalData.gCaseInsensitive = Option.CaseInsensitive
# Set log level
+ LogLevel = EdkLogger.INFO
if Option.verbose is not None:
EdkLogger.SetLevel(EdkLogger.VERBOSE)
+ LogLevel = EdkLogger.VERBOSE
elif Option.quiet is not None:
EdkLogger.SetLevel(EdkLogger.QUIET)
+ LogLevel = EdkLogger.QUIET
elif Option.debug is not None:
EdkLogger.SetLevel(Option.debug + 1)
+ LogLevel = Option.debug + 1
else:
EdkLogger.SetLevel(EdkLogger.INFO)
- if Option.LogFile is not None:
- EdkLogger.SetLogFile(Option.LogFile)
-
if Option.WarningAsError == True:
EdkLogger.SetWarningAsError()
+ Log_Agent = LogAgent(LogQ,LogLevel,Option.LogFile)
+ Log_Agent.start()
if platform.platform().find("Windows") >= 0:
GlobalData.gIsWindows = True
else:
GlobalData.gIsWindows = False
@@ -2491,11 +2500,11 @@ def Main():
EdkLogger.error("build", ErrorCode, ExtraData=ErrorInfo)
if Option.Flag is not None and Option.Flag not in ['-c', '-s']:
EdkLogger.error("build", OPTION_VALUE_INVALID, "UNI flag must be one of -c or -s")
- MyBuild = Build(Target, Workspace, Option)
+ MyBuild = Build(Target, Workspace, Option,LogQ)
GlobalData.gCommandLineDefines['ARCH'] = ' '.join(MyBuild.ArchList)
if not (MyBuild.LaunchPrebuildFlag and os.path.exists(MyBuild.PlatformBuildPath)):
MyBuild.Launch()
#
@@ -2577,10 +2586,12 @@ def Main():
EdkLogger.SetLevel(EdkLogger.QUIET)
EdkLogger.quiet("\n- %s -" % Conclusion)
EdkLogger.quiet(time.strftime("Build end time: %H:%M:%S, %b.%d %Y", time.localtime()))
EdkLogger.quiet("Build total time: %s\n" % BuildDurationStr)
+ Log_Agent.kill()
+ Log_Agent.join()
return ReturnCode
if __name__ == '__main__':
r = Main()
## 0-127 is a safe return range, and 1 is a standard default error
--
2.20.1.windows.1
* [Patch 08/11] BaseTools: Move BuildOption parser out of build.py
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Bob Feng, Liming Gao
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
The build tool lets the user specify the Conf folder. So that the
build options are evaluated at the beginning of the build, move the
build-option parsing out of build.py into a new .py file.
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
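
The Conf-directory precedence that `TargetTxtDict` gains in this patch can be summarized as: explicit `--conf` option first, then the `CONF_PATH` environment variable, then `WORKSPACE/Conf`. A minimal sketch of that resolution order (the `resolve_conf_dir` name is illustrative, and path normalization is simplified relative to the patch):

```python
# Sketch of the Conf-directory lookup order used by TargetTxtDict:
# --conf option > CONF_PATH environment variable > WORKSPACE/Conf.
import os

def resolve_conf_dir(conf_option, environ):
    if conf_option:
        path = os.path.normpath(conf_option)
        if not os.path.isabs(path):
            # A relative --conf value is resolved against the workspace.
            path = os.path.join(environ["WORKSPACE"], path)
        return path
    if "CONF_PATH" in environ:
        return os.path.normpath(environ["CONF_PATH"])
    # Fall back to the standard WORKSPACE/Conf location.
    return os.path.join(environ["WORKSPACE"], "Conf")
```

The real code additionally resolves relative paths through `MultipleWorkspace.join` so packages split across several workspace roots are searched in order.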
---
.../Python/Common/TargetTxtClassObject.py | 28 ++++-
BaseTools/Source/Python/build/build.py | 108 +-----------------
BaseTools/Source/Python/build/buildoptions.py | 92 +++++++++++++++
3 files changed, 121 insertions(+), 107 deletions(-)
create mode 100644 BaseTools/Source/Python/build/buildoptions.py
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 79a5acc01074..16cc75ccb7c8 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -8,16 +8,19 @@
##
# Import Modules
#
from __future__ import print_function
from __future__ import absolute_import
+from buildoptions import BuildOption,BuildTarget
+import Common.GlobalData as GlobalData
import Common.LongFilePathOs as os
from . import EdkLogger
from . import DataType
from .BuildToolError import *
-from . import GlobalData
+
from Common.LongFilePathSupport import OpenLongFilePath as open
+from Common.MultipleWorkspace import MultipleWorkspace as mws
gDefaultTargetTxtFile = "target.txt"
## TargetTxtClassObject
#
@@ -139,16 +142,33 @@ class TargetTxtClassObject(object):
#
# @param ConfDir: Conf dir
#
# @retval Target An instance of TargetTxtClassObject() with loaded target.txt
#
-def TargetTxtDict(ConfDir):
+def TargetTxtDict():
Target = TargetTxtClassObject()
- Target.LoadTargetTxtFile(os.path.normpath(os.path.join(ConfDir, gDefaultTargetTxtFile)))
+ if BuildOption.ConfDirectory:
+ # Get alternate Conf location, if it is absolute, then just use the absolute directory name
+ ConfDirectoryPath = os.path.normpath(BuildOption.ConfDirectory)
+
+ if not os.path.isabs(ConfDirectoryPath):
+ # Since alternate directory name is not absolute, the alternate directory is located within the WORKSPACE
+ # This also handles someone specifying the Conf directory in the workspace. Using --conf=Conf
+ ConfDirectoryPath = mws.join(os.environ["WORKSPACE"], ConfDirectoryPath)
+ else:
+ if "CONF_PATH" in os.environ:
+ ConfDirectoryPath = os.path.normcase(os.path.normpath(os.environ["CONF_PATH"]))
+ else:
+ # Get standard WORKSPACE/Conf use the absolute path to the WORKSPACE/Conf
+ ConfDirectoryPath = mws.join(os.environ["WORKSPACE"], 'Conf')
+ GlobalData.gConfDirectory = ConfDirectoryPath
+ targettxt = os.path.normpath(os.path.join(ConfDirectoryPath, gDefaultTargetTxtFile))
+ if os.path.exists(targettxt):
+ Target.LoadTargetTxtFile(targettxt)
return Target
-TargetTxt = TargetTxtDict(os.path.join(os.getenv("WORKSPACE"),"Conf"))
+TargetTxt = TargetTxtDict()
##
#
# This acts like the main() function for the script, unless it is 'import'ed into another
# script.
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 74df68350434..4125b2832946 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -24,11 +24,11 @@ import traceback
import multiprocessing
from threading import Thread,Event,BoundedSemaphore
import threading
from subprocess import Popen,PIPE
from collections import OrderedDict, defaultdict
-from optparse import OptionParser
+from buildoptions import BuildOption,BuildTarget
from AutoGen.PlatformAutoGen import PlatformAutoGen
from AutoGen.ModuleAutoGen import ModuleAutoGen
from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen
from AutoGen.AutoGenWorker import AutoGenWorkerInProcess,AutoGenManager,\
LogAgent
@@ -41,11 +41,11 @@ from Common.Misc import PathClass,SaveFileOnChange,RemoveDirectory
from Common.StringUtils import NormPath
from Common.MultipleWorkspace import MultipleWorkspace as mws
from Common.BuildToolError import *
from Common.DataType import *
import Common.EdkLogger as EdkLogger
-from Common.BuildVersion import gBUILD_VERSION
+
from Workspace.WorkspaceDatabase import BuildDB
from BuildReport import BuildReport
from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdInfoFromMapFile
from PatchPcdValue.PatchPcdValue import PatchBinaryFile
@@ -53,14 +53,10 @@ from PatchPcdValue.PatchPcdValue import PatchBinaryFile
import Common.GlobalData as GlobalData
from GenFds.GenFds import GenFds, GenFdsApi
import multiprocessing as mp
from multiprocessing import Manager
-# Version and Copyright
-VersionNumber = "0.60" + ' ' + gBUILD_VERSION
-__version__ = "%prog Version " + VersionNumber
-__copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation All rights reserved."
## standard targets of build command
gSupportedTarget = ['all', 'genc', 'genmake', 'modules', 'libraries', 'fds', 'clean', 'cleanall', 'cleanlib', 'run']
## build configuration file
@@ -761,26 +757,11 @@ class Build():
GlobalData.gBinCacheDest = BinCacheDest
else:
if GlobalData.gBinCacheDest is not None:
EdkLogger.error("build", OPTION_VALUE_INVALID, ExtraData="Invalid value of option --binary-destination.")
- if self.ConfDirectory:
- # Get alternate Conf location, if it is absolute, then just use the absolute directory name
- ConfDirectoryPath = os.path.normpath(self.ConfDirectory)
-
- if not os.path.isabs(ConfDirectoryPath):
- # Since alternate directory name is not absolute, the alternate directory is located within the WORKSPACE
- # This also handles someone specifying the Conf directory in the workspace. Using --conf=Conf
- ConfDirectoryPath = mws.join(self.WorkspaceDir, ConfDirectoryPath)
- else:
- if "CONF_PATH" in os.environ:
- ConfDirectoryPath = os.path.normcase(os.path.normpath(os.environ["CONF_PATH"]))
- else:
- # Get standard WORKSPACE/Conf use the absolute path to the WORKSPACE/Conf
- ConfDirectoryPath = mws.join(self.WorkspaceDir, 'Conf')
- GlobalData.gConfDirectory = ConfDirectoryPath
- GlobalData.gDatabasePath = os.path.normpath(os.path.join(ConfDirectoryPath, GlobalData.gDatabasePath))
+ GlobalData.gDatabasePath = os.path.normpath(os.path.join(GlobalData.gConfDirectory, GlobalData.gDatabasePath))
if not os.path.exists(os.path.join(GlobalData.gConfDirectory, '.cache')):
os.makedirs(os.path.join(GlobalData.gConfDirectory, '.cache'))
self.Db = BuildDB
self.BuildDatabase = self.Db.BuildObject
self.Platform = None
@@ -2290,17 +2271,11 @@ def ParseDefines(DefineList=[]):
DefineDict[DefineTokenList[0]] = "TRUE"
else:
DefineDict[DefineTokenList[0]] = DefineTokenList[1].strip()
return DefineDict
-gParamCheck = []
-def SingleCheckCallback(option, opt_str, value, parser):
- if option not in gParamCheck:
- setattr(parser.values, option.dest, value)
- gParamCheck.append(option)
- else:
- parser.error("Option %s only allows one instance in command line!" % option)
+
def LogBuildTime(Time):
if Time:
TimeDurStr = ''
TimeDur = time.gmtime(Time)
@@ -2310,83 +2285,10 @@ def LogBuildTime(Time):
TimeDurStr = time.strftime("%H:%M:%S", TimeDur)
return TimeDurStr
else:
return None
-## Parse command line options
-#
-# Using standard Python module optparse to parse command line option of this tool.
-#
-# @retval Opt A optparse.Values object containing the parsed options
-# @retval Args Target of build command
-#
-def MyOptionParser():
- Parser = OptionParser(description=__copyright__, version=__version__, prog="build.exe", usage="%prog [options] [all|fds|genc|genmake|clean|cleanall|cleanlib|modules|libraries|run]")
- Parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32', 'X64', 'EBC', 'ARM', 'AARCH64'], dest="TargetArch",
- help="ARCHS is one of list: IA32, X64, ARM, AARCH64 or EBC, which overrides target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option.")
- Parser.add_option("-p", "--platform", action="callback", type="string", dest="PlatformFile", callback=SingleCheckCallback,
- help="Build the platform specified by the DSC file name argument, overriding target.txt's ACTIVE_PLATFORM definition.")
- Parser.add_option("-m", "--module", action="callback", type="string", dest="ModuleFile", callback=SingleCheckCallback,
- help="Build the module specified by the INF file name argument.")
- Parser.add_option("-b", "--buildtarget", type="string", dest="BuildTarget", help="Using the TARGET to build the platform, overriding target.txt's TARGET definition.",
- action="append")
- Parser.add_option("-t", "--tagname", action="append", type="string", dest="ToolChain",
- help="Using the Tool Chain Tagname to build the platform, overriding target.txt's TOOL_CHAIN_TAG definition.")
- Parser.add_option("-x", "--sku-id", action="callback", type="string", dest="SkuId", callback=SingleCheckCallback,
- help="Using this name of SKU ID to build the platform, overriding SKUID_IDENTIFIER in DSC file.")
-
- Parser.add_option("-n", action="callback", type="int", dest="ThreadNumber", callback=SingleCheckCallback,
- help="Build the platform using multi-threaded compiler. The value overrides target.txt's MAX_CONCURRENT_THREAD_NUMBER. When value is set to 0, tool automatically detect number of "\
- "processor threads, set value to 1 means disable multi-thread build, and set value to more than 1 means user specify the threads number to build.")
-
- Parser.add_option("-f", "--fdf", action="callback", type="string", dest="FdfFile", callback=SingleCheckCallback,
- help="The name of the FDF file to use, which overrides the setting in the DSC file.")
- Parser.add_option("-r", "--rom-image", action="append", type="string", dest="RomImage", default=[],
- help="The name of FD to be generated. The name must be from [FD] section in FDF file.")
- Parser.add_option("-i", "--fv-image", action="append", type="string", dest="FvImage", default=[],
- help="The name of FV to be generated. The name must be from [FV] section in FDF file.")
- Parser.add_option("-C", "--capsule-image", action="append", type="string", dest="CapName", default=[],
- help="The name of Capsule to be generated. The name must be from [Capsule] section in FDF file.")
- Parser.add_option("-u", "--skip-autogen", action="store_true", dest="SkipAutoGen", help="Skip AutoGen step.")
- Parser.add_option("-e", "--re-parse", action="store_true", dest="Reparse", help="Re-parse all meta-data files.")
-
- Parser.add_option("-c", "--case-insensitive", action="store_true", dest="CaseInsensitive", default=False, help="Don't check case of file name.")
-
- Parser.add_option("-w", "--warning-as-error", action="store_true", dest="WarningAsError", help="Treat warning in tools as error.")
- Parser.add_option("-j", "--log", action="store", dest="LogFile", help="Put log in specified file as well as on console.")
-
- Parser.add_option("-s", "--silent", action="store_true", type=None, dest="SilentMode",
- help="Make use of silent mode of (n)make.")
- Parser.add_option("-q", "--quiet", action="store_true", type=None, help="Disable all messages except FATAL ERRORS.")
- Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "\
- "including library instances selected, final dependency expression, "\
- "and warning messages, etc.")
- Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
- Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
-
- Parser.add_option("-y", "--report-file", action="store", dest="ReportFile", help="Create/overwrite the report to the specified filename.")
- Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED_ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest="ReportType", default=[],
- help="Flags that control the type of build report to generate. Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER]. "\
- "To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
- Parser.add_option("-F", "--flag", action="store", type="string", dest="Flag",
- help="Specify the specific option to parse EDK UNI file. Must be one of: [-c, -s]. -c is for EDK framework UNI file, and -s is for EDK UEFI UNI file. "\
- "This option can also be specified by setting *_*_*_BUILD_FLAGS in [BuildOptions] section of platform DSC. If they are both specified, this value "\
- "will override the setting in [BuildOptions] section of platform DSC.")
- Parser.add_option("-N", "--no-cache", action="store_true", dest="DisableCache", default=False, help="Disable build cache mechanism")
- Parser.add_option("--conf", action="store", type="string", dest="ConfDirectory", help="Specify the customized Conf directory.")
- Parser.add_option("--check-usage", action="store_true", dest="CheckUsage", default=False, help="Check usage content of entries listed in INF file.")
- Parser.add_option("--ignore-sources", action="store_true", dest="IgnoreSources", default=False, help="Focus to a binary build and ignore all source files")
- Parser.add_option("--pcd", action="append", dest="OptionPcd", help="Set PCD value by command line. Format: \"PcdName=Value\" ")
- Parser.add_option("-l", "--cmd-len", action="store", type="int", dest="CommandLength", help="Specify the maximum line length of build command. Default is 4096.")
- Parser.add_option("--hash", action="store_true", dest="UseHashCache", default=False, help="Enable hash-based caching during build process.")
- Parser.add_option("--binary-destination", action="store", type="string", dest="BinCacheDest", help="Generate a cache of binary files in the specified directory.")
- Parser.add_option("--binary-source", action="store", type="string", dest="BinCacheSource", help="Consume a cache of binary files from the specified directory.")
- Parser.add_option("--genfds-multi-thread", action="store_true", dest="GenfdsMultiThread", default=False, help="Enable GenFds multi thread to generate ffs file.")
- Parser.add_option("--disable-include-path-check", action="store_true", dest="DisableIncludePathCheck", default=False, help="Disable the include path check for outside of package.")
- (Opt, Args) = Parser.parse_args()
- return (Opt, Args)
-
## Tool entrance method
#
# This method mainly dispatch specific methods per the command line options.
# If no error found, return zero value so the caller of this tool can know
# if it's executed successfully or not.
@@ -2405,11 +2307,11 @@ def Main():
EdkLogger.LogClientInitialize(LogQ)
GlobalData.gCommand = sys.argv[1:]
#
# Parse the options and args
#
- (Option, Target) = MyOptionParser()
+ Option, Target = BuildOption, BuildTarget
GlobalData.gOptions = Option
GlobalData.gCaseInsensitive = Option.CaseInsensitive
# Set log level
LogLevel = EdkLogger.INFO
diff --git a/BaseTools/Source/Python/build/buildoptions.py b/BaseTools/Source/Python/build/buildoptions.py
new file mode 100644
index 000000000000..7161aa66f23e
--- /dev/null
+++ b/BaseTools/Source/Python/build/buildoptions.py
@@ -0,0 +1,92 @@
+## @file
+# build a platform or a module
+#
+# Copyright (c) 2014, Hewlett-Packard Development Company, L.P.<BR>
+# Copyright (c) 2007 - 2019, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2018, Hewlett Packard Enterprise Development, L.P.<BR>
+#
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+
+# Version and Copyright
+from Common.BuildVersion import gBUILD_VERSION
+from optparse import OptionParser
+VersionNumber = "0.60" + ' ' + gBUILD_VERSION
+__version__ = "%prog Version " + VersionNumber
+__copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation All rights reserved."
+
+gParamCheck = []
+def SingleCheckCallback(option, opt_str, value, parser):
+ if option not in gParamCheck:
+ setattr(parser.values, option.dest, value)
+ gParamCheck.append(option)
+ else:
+ parser.error("Option %s only allows one instance in command line!" % option)
+
+def MyOptionParser():
+ Parser = OptionParser(description=__copyright__, version=__version__, prog="build.exe", usage="%prog [options] [all|fds|genc|genmake|clean|cleanall|cleanlib|modules|libraries|run]")
+ Parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32', 'X64', 'EBC', 'ARM', 'AARCH64'], dest="TargetArch",
+ help="ARCHS is one of list: IA32, X64, ARM, AARCH64 or EBC, which overrides target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option.")
+ Parser.add_option("-p", "--platform", action="callback", type="string", dest="PlatformFile", callback=SingleCheckCallback,
+ help="Build the platform specified by the DSC file name argument, overriding target.txt's ACTIVE_PLATFORM definition.")
+ Parser.add_option("-m", "--module", action="callback", type="string", dest="ModuleFile", callback=SingleCheckCallback,
+ help="Build the module specified by the INF file name argument.")
+ Parser.add_option("-b", "--buildtarget", type="string", dest="BuildTarget", help="Using the TARGET to build the platform, overriding target.txt's TARGET definition.",
+ action="append")
+ Parser.add_option("-t", "--tagname", action="append", type="string", dest="ToolChain",
+ help="Using the Tool Chain Tagname to build the platform, overriding target.txt's TOOL_CHAIN_TAG definition.")
+ Parser.add_option("-x", "--sku-id", action="callback", type="string", dest="SkuId", callback=SingleCheckCallback,
+ help="Using this name of SKU ID to build the platform, overriding SKUID_IDENTIFIER in DSC file.")
+
+ Parser.add_option("-n", action="callback", type="int", dest="ThreadNumber", callback=SingleCheckCallback,
+ help="Build the platform using multi-threaded compiler. The value overrides target.txt's MAX_CONCURRENT_THREAD_NUMBER. When value is set to 0, tool automatically detect number of "\
+ "processor threads, set value to 1 means disable multi-thread build, and set value to more than 1 means user specify the threads number to build.")
+
+ Parser.add_option("-f", "--fdf", action="callback", type="string", dest="FdfFile", callback=SingleCheckCallback,
+ help="The name of the FDF file to use, which overrides the setting in the DSC file.")
+ Parser.add_option("-r", "--rom-image", action="append", type="string", dest="RomImage", default=[],
+ help="The name of FD to be generated. The name must be from [FD] section in FDF file.")
+ Parser.add_option("-i", "--fv-image", action="append", type="string", dest="FvImage", default=[],
+ help="The name of FV to be generated. The name must be from [FV] section in FDF file.")
+ Parser.add_option("-C", "--capsule-image", action="append", type="string", dest="CapName", default=[],
+ help="The name of Capsule to be generated. The name must be from [Capsule] section in FDF file.")
+ Parser.add_option("-u", "--skip-autogen", action="store_true", dest="SkipAutoGen", help="Skip AutoGen step.")
+ Parser.add_option("-e", "--re-parse", action="store_true", dest="Reparse", help="Re-parse all meta-data files.")
+
+ Parser.add_option("-c", "--case-insensitive", action="store_true", dest="CaseInsensitive", default=False, help="Don't check case of file name.")
+
+ Parser.add_option("-w", "--warning-as-error", action="store_true", dest="WarningAsError", help="Treat warning in tools as error.")
+ Parser.add_option("-j", "--log", action="store", dest="LogFile", help="Put log in specified file as well as on console.")
+
+ Parser.add_option("-s", "--silent", action="store_true", type=None, dest="SilentMode",
+ help="Make use of silent mode of (n)make.")
+ Parser.add_option("-q", "--quiet", action="store_true", type=None, help="Disable all messages except FATAL ERRORS.")
+ Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "\
+ "including library instances selected, final dependency expression, "\
+ "and warning messages, etc.")
+ Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
+ Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
+
+ Parser.add_option("-y", "--report-file", action="store", dest="ReportFile", help="Create/overwrite the report to the specified filename.")
+ Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED_ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest="ReportType", default=[],
+ help="Flags that control the type of build report to generate. Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER]. "\
+ "To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
+ Parser.add_option("-F", "--flag", action="store", type="string", dest="Flag",
+ help="Specify the specific option to parse EDK UNI file. Must be one of: [-c, -s]. -c is for EDK framework UNI file, and -s is for EDK UEFI UNI file. "\
+ "This option can also be specified by setting *_*_*_BUILD_FLAGS in [BuildOptions] section of platform DSC. If they are both specified, this value "\
+ "will override the setting in [BuildOptions] section of platform DSC.")
+ Parser.add_option("-N", "--no-cache", action="store_true", dest="DisableCache", default=False, help="Disable build cache mechanism")
+ Parser.add_option("--conf", action="store", type="string", dest="ConfDirectory", help="Specify the customized Conf directory.")
+ Parser.add_option("--check-usage", action="store_true", dest="CheckUsage", default=False, help="Check usage content of entries listed in INF file.")
+ Parser.add_option("--ignore-sources", action="store_true", dest="IgnoreSources", default=False, help="Focus to a binary build and ignore all source files")
+ Parser.add_option("--pcd", action="append", dest="OptionPcd", help="Set PCD value by command line. Format: \"PcdName=Value\" ")
+ Parser.add_option("-l", "--cmd-len", action="store", type="int", dest="CommandLength", help="Specify the maximum line length of build command. Default is 4096.")
+ Parser.add_option("--hash", action="store_true", dest="UseHashCache", default=False, help="Enable hash-based caching during build process.")
+ Parser.add_option("--binary-destination", action="store", type="string", dest="BinCacheDest", help="Generate a cache of binary files in the specified directory.")
+ Parser.add_option("--binary-source", action="store", type="string", dest="BinCacheSource", help="Consume a cache of binary files from the specified directory.")
+ Parser.add_option("--genfds-multi-thread", action="store_true", dest="GenfdsMultiThread", default=False, help="Enable GenFds multi thread to generate ffs file.")
+ Parser.add_option("--disable-include-path-check", action="store_true", dest="DisableIncludePathCheck", default=False, help="Disable the include path check for outside of package.")
+ (Opt, Args) = Parser.parse_args()
+ return (Opt, Args)
+
+BuildOption, BuildTarget = MyOptionParser()
--
2.20.1.windows.1
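The SingleCheckCallback that moved into buildoptions.py rejects a second occurrence of any single-instance option via an optparse callback. A self-contained demonstration of the mechanism (option names here are illustrative, not the full build.exe parser):

```python
from optparse import OptionParser

seen = []
def single_check(option, opt_str, value, parser):
    # Store the value the first time the option appears; error on any repeat.
    if option not in seen:
        setattr(parser.values, option.dest, value)
        seen.append(option)
    else:
        parser.error("Option %s only allows one instance in command line!" % option)

parser = OptionParser()
parser.add_option("-p", "--platform", action="callback", type="string",
                  dest="PlatformFile", callback=single_check)

opts, args = parser.parse_args(["-p", "Platform.dsc"])
```

A repeated `-p` makes `parser.error` print a usage message and exit with status 2, which is why the patch keeps this callback on options like `-p`, `-m`, and `-n` that must not be given twice.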
^ permalink raw reply related [flat|nested] 18+ messages in thread
* [Patch 09/11] BaseTools: Add the support for python 2
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
` (7 preceding siblings ...)
2019-07-29 8:44 ` [Patch 08/11] BaseTools: Move BuildOption parser out of build.py Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 8:44 ` [Patch 10/11] BaseTools: Enable block queue log agent Bob Feng
` (2 subsequent siblings)
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
Python 3 renamed the Queue module to queue and added a
new log handler, QueueHandler. This patch makes the
multiple-process AutoGen feature work with Python 2
as well.
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
.../Source/Python/AutoGen/AutoGenWorker.py | 5 +-
BaseTools/Source/Python/Common/EdkLogger.py | 92 ++++++++++++++++++-
2 files changed, 92 insertions(+), 5 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index f5bc705fbd04..d1c55cffa8d0 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -15,11 +15,14 @@ import Common.EdkLogger as EdkLogger
import os
from Common.MultipleWorkspace import MultipleWorkspace as mws
from AutoGen.AutoGen import AutoGen
from Workspace.WorkspaceDatabase import BuildDB
import time
-from queue import Empty
+try:
+ from queue import Empty
+except:
+ from Queue import Empty
import traceback
import sys
from AutoGen.DataPipe import MemoryDataPipe
import logging
diff --git a/BaseTools/Source/Python/Common/EdkLogger.py b/BaseTools/Source/Python/Common/EdkLogger.py
index f6a5e3b4daf9..15fd1458a95a 100644
--- a/BaseTools/Source/Python/Common/EdkLogger.py
+++ b/BaseTools/Source/Python/Common/EdkLogger.py
@@ -3,16 +3,100 @@
#
# Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
# SPDX-License-Identifier: BSD-2-Clause-Patent
#
+# Copyright 2001-2016 by Vinay Sajip. All Rights Reserved.
+#
+# Permission to use, copy, modify, and distribute this software and its
+# documentation for any purpose and without fee is hereby granted,
+# provided that the above copyright notice appear in all copies and that
+# both that copyright notice and this permission notice appear in
+# supporting documentation, and that the name of Vinay Sajip
+# not be used in advertising or publicity pertaining to distribution
+# of the software without specific, written prior permission.
+# VINAY SAJIP DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING
+# ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL
+# VINAY SAJIP BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR
+# ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER
+# IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
+# OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+# This copyright is for QueueHandler.
+
## Import modules
from __future__ import absolute_import
import Common.LongFilePathOs as os, sys, logging
import traceback
from .BuildToolError import *
-import logging.handlers
+try:
+ from logging.handlers import QueueHandler
+except:
+ class QueueHandler(logging.Handler):
+ """
+ This handler sends events to a queue. Typically, it would be used together
+ with a multiprocessing Queue to centralise logging to file in one process
+ (in a multi-process application), so as to avoid file write contention
+ between processes.
+
+ This code is new in Python 3.2, but this class can be copy pasted into
+ user code for use with earlier Python versions.
+ """
+
+ def __init__(self, queue):
+ """
+ Initialise an instance, using the passed queue.
+ """
+ logging.Handler.__init__(self)
+ self.queue = queue
+
+ def enqueue(self, record):
+ """
+ Enqueue a record.
+
+ The base implementation uses put_nowait. You may want to override
+ this method if you want to use blocking, timeouts or custom queue
+ implementations.
+ """
+ self.queue.put_nowait(record)
+
+ def prepare(self, record):
+ """
+ Prepares a record for queuing. The object returned by this method is
+ enqueued.
+
+ The base implementation formats the record to merge the message
+ and arguments, and removes unpickleable items from the record
+ in-place.
+
+ You might want to override this method if you want to convert
+ the record to a dict or JSON string, or send a modified copy
+ of the record while leaving the original intact.
+ """
+ # The format operation gets traceback text into record.exc_text
+ # (if there's exception data), and also returns the formatted
+ # message. We can then use this to replace the original
+ # msg + args, as these might be unpickleable. We also zap the
+ # exc_info and exc_text attributes, as they are no longer
+ # needed and, if not None, will typically not be pickleable.
+ msg = self.format(record)
+ record.message = msg
+ record.msg = msg
+ record.args = None
+ record.exc_info = None
+ record.exc_text = None
+ return record
+
+ def emit(self, record):
+ """
+ Emit a record.
+
+ Writes the LogRecord to the queue, preparing it for pickling first.
+ """
+ try:
+ self.enqueue(self.prepare(record))
+ except Exception:
+ self.handleError(record)
## Log level constants
DEBUG_0 = 1
DEBUG_1 = 2
DEBUG_2 = 3
@@ -206,23 +290,23 @@ def LogClientInitialize(log_q):
# Since we use different format to log different levels of message into different
# place (stdout or stderr), we have to use different "Logger" objects to do this.
#
# For DEBUG level (All DEBUG_0~9 are applicable)
_DebugLogger.setLevel(INFO)
- _DebugChannel = logging.handlers.QueueHandler(log_q)
+ _DebugChannel = QueueHandler(log_q)
_DebugChannel.setFormatter(_DebugFormatter)
_DebugLogger.addHandler(_DebugChannel)
# For VERBOSE, INFO, WARN level
_InfoLogger.setLevel(INFO)
- _InfoChannel = logging.handlers.QueueHandler(log_q)
+ _InfoChannel = QueueHandler(log_q)
_InfoChannel.setFormatter(_InfoFormatter)
_InfoLogger.addHandler(_InfoChannel)
# For ERROR level
_ErrorLogger.setLevel(INFO)
- _ErrorCh = logging.handlers.QueueHandler(log_q)
+ _ErrorCh = QueueHandler(log_q)
_ErrorCh.setFormatter(_ErrorFormatter)
_ErrorLogger.addHandler(_ErrorCh)
## Set log level
#
--
2.20.1.windows.1
* [Patch 10/11] BaseTools: Enable block queue log agent.
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
` (8 preceding siblings ...)
2019-07-29 8:44 ` [Patch 09/11] BaseTools: Add the support for python 2 Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 8:44 ` [Patch 11/11] BaseTools: Enhance Multiple-Process AutoGen Bob Feng
2019-07-29 10:10 ` [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen Laszlo Ersek
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
To support Ctrl+S and Ctrl+Q on the console, use a
blocking queue for the log.
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
BaseTools/Source/Python/Common/EdkLogger.py | 10 ++++++----
BaseTools/Source/Python/build/build.py | 8 +++++---
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/BaseTools/Source/Python/Common/EdkLogger.py b/BaseTools/Source/Python/Common/EdkLogger.py
index 15fd1458a95a..06da4a9d0a1d 100644
--- a/BaseTools/Source/Python/Common/EdkLogger.py
+++ b/BaseTools/Source/Python/Common/EdkLogger.py
@@ -93,11 +93,13 @@ except:
"""
try:
self.enqueue(self.prepare(record))
except Exception:
self.handleError(record)
-
+class BlockQueueHandler(QueueHandler):
+ def enqueue(self, record):
+ self.queue.put(record,True)
## Log level constants
DEBUG_0 = 1
DEBUG_1 = 2
DEBUG_2 = 3
DEBUG_3 = 4
@@ -290,23 +292,23 @@ def LogClientInitialize(log_q):
# Since we use different format to log different levels of message into different
# place (stdout or stderr), we have to use different "Logger" objects to do this.
#
# For DEBUG level (All DEBUG_0~9 are applicable)
_DebugLogger.setLevel(INFO)
- _DebugChannel = QueueHandler(log_q)
+ _DebugChannel = BlockQueueHandler(log_q)
_DebugChannel.setFormatter(_DebugFormatter)
_DebugLogger.addHandler(_DebugChannel)
# For VERBOSE, INFO, WARN level
_InfoLogger.setLevel(INFO)
- _InfoChannel = QueueHandler(log_q)
+ _InfoChannel = BlockQueueHandler(log_q)
_InfoChannel.setFormatter(_InfoFormatter)
_InfoLogger.addHandler(_InfoChannel)
# For ERROR level
_ErrorLogger.setLevel(INFO)
- _ErrorCh = QueueHandler(log_q)
+ _ErrorCh = BlockQueueHandler(log_q)
_ErrorCh.setFormatter(_ErrorFormatter)
_ErrorLogger.addHandler(_ErrorCh)
## Set log level
#
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 4125b2832946..603d3aa6dad4 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -2046,14 +2046,15 @@ class Build():
for m in Pa.GetAllModuleInfo:
mqueue.put(m)
data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
Pa.DataPipe.dump(data_pipe_file)
autogen_rt = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
- self.Progress.Stop("done!")
- self.AutoGenTime += int(round((time.time() - AutoGenStart)))
+
if not autogen_rt:
return
+ self.AutoGenTime += int(round((time.time() - AutoGenStart)))
+ self.Progress.Stop("done!")
for Arch in Wa.ArchList:
MakeStart = time.time()
for Ma in BuildModules:
# Generate build task for the module
if not Ma.IsBinaryModule:
@@ -2294,17 +2295,18 @@ def LogBuildTime(Time):
# if it's executed successfully or not.
#
# @retval 0 Tool was successful
# @retval 1 Tool failed
#
+LogQMaxSize = 60
def Main():
StartTime = time.time()
#
# Create a log Queue
#
- LogQ = mp.Queue()
+ LogQ = mp.Queue(LogQMaxSize)
# Initialize log system
EdkLogger.LogClientInitialize(LogQ)
GlobalData.gCommand = sys.argv[1:]
#
# Parse the options and args
--
2.20.1.windows.1
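The difference BlockQueueHandler makes is in `enqueue`: the base QueueHandler uses `put_nowait`, which raises `queue.Full` on a bounded queue, while the blocking `put(record, True)` makes producers stall until the consumer catches up (e.g. after the console is resumed with Ctrl+Q). A simplified sketch with a tiny bound to make the behavior visible (not the BaseTools classes themselves):

```python
import logging
import queue
from logging.handlers import QueueHandler

class BlockQueueHandler(QueueHandler):
    # Block until a slot is free instead of raising queue.Full.
    def enqueue(self, record):
        self.queue.put(record, True)

log_q = queue.Queue(maxsize=2)   # small bound for demonstration
handler = QueueHandler(log_q)
record = logging.LogRecord("demo", logging.INFO, __file__, 0, "msg", None, None)

log_q.put("a"); log_q.put("b")   # queue is now full
try:
    handler.enqueue(handler.prepare(record))
    dropped = False
except queue.Full:               # put_nowait fails immediately on a full queue
    dropped = True

log_q.get()                      # free one slot; a blocking put now succeeds
BlockQueueHandler(log_q).enqueue(record)
```

This is also why the companion build.py change bounds the queue (`mp.Queue(LogQMaxSize)`): an unbounded queue would simply grow while output is paused instead of applying backpressure.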
* [Patch 11/11] BaseTools: Enhance Multiple-Process AutoGen
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
` (9 preceding siblings ...)
2019-07-29 8:44 ` [Patch 10/11] BaseTools: Enable block queue log agent Bob Feng
@ 2019-07-29 8:44 ` Bob Feng
2019-07-29 10:10 ` [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen Laszlo Ersek
11 siblings, 0 replies; 18+ messages in thread
From: Bob Feng @ 2019-07-29 8:44 UTC (permalink / raw)
To: devel; +Cc: Liming Gao, Bob Feng
BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
1. Set the log queue maxsize to thread number * 10.
2. Enhance the ModuleUniqueBaseName function.
3. Fix bugs with build-option PCDs in the sub-process.
4. Enhance error handling.
5. Fix a bug in the duplicate-module handling function.
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Bob Feng <bob.c.feng@intel.com>
---
.../Source/Python/AutoGen/AutoGenWorker.py | 55 +++++++++--
BaseTools/Source/Python/AutoGen/DataPipe.py | 11 ++-
.../Python/AutoGen/ModuleAutoGenHelper.py | 9 +-
.../Source/Python/AutoGen/PlatformAutoGen.py | 49 +++++++---
.../Source/Python/AutoGen/WorkspaceAutoGen.py | 2 +
BaseTools/Source/Python/build/build.py | 95 ++++++++++---------
6 files changed, 148 insertions(+), 73 deletions(-)
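The AutoGenManager rework in this patch has each worker post a "Done" sentinel (or an error) onto a feedback queue, and the manager counts sentinels until all workers have reported, setting a shared error event on the first failure. A simplified single-machine sketch of that pattern (threads instead of processes, illustrative names):

```python
import threading
import queue

def worker(wid, feedback_q, error_event, fail=False):
    # Each worker reports exactly once: "Done" on success, an error tuple otherwise.
    if fail:
        feedback_q.put(("error", wid))
    else:
        feedback_q.put("Done")

def manager(feedback_q, n_workers, error_event):
    fin = 0
    status = True
    while fin < n_workers:
        msg = feedback_q.get()
        if msg == "Done":
            fin += 1
        else:
            status = False
            error_event.set()    # signal remaining workers to stop
            break
    return status

feedback_q = queue.Queue()
error_event = threading.Event()
threads = [threading.Thread(target=worker, args=(i, feedback_q, error_event))
           for i in range(3)]
for t in threads: t.start()
for t in threads: t.join()
ok = manager(feedback_q, 3, error_event)
```

Signaling via an event rather than calling `terminate()` on the workers (as the earlier version did) gives each subprocess a chance to exit cleanly, which matters for the KeyboardInterrupt handling this patch adds.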
diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index d1c55cffa8d0..de6a17396e12 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -14,20 +14,24 @@ import Common.GlobalData as GlobalData
import Common.EdkLogger as EdkLogger
import os
from Common.MultipleWorkspace import MultipleWorkspace as mws
from AutoGen.AutoGen import AutoGen
from Workspace.WorkspaceDatabase import BuildDB
-import time
+
try:
from queue import Empty
except:
from Queue import Empty
import traceback
import sys
from AutoGen.DataPipe import MemoryDataPipe
import logging
+def clearQ(q):
+ while not q.empty():
+ q.get_nowait()
+
class LogAgent(threading.Thread):
def __init__(self,log_q,log_level,log_file=None):
super(LogAgent,self).__init__()
self.log_q = log_q
self.log_level = log_level
@@ -88,45 +92,58 @@ class LogAgent(threading.Thread):
self._InfoLogger_agent.log(log_message.levelno,log_message.getMessage())
def kill(self):
self.log_q.put(None)
class AutoGenManager(threading.Thread):
- def __init__(self,autogen_workers, feedback_q):
+ def __init__(self,autogen_workers, feedback_q,error_event):
super(AutoGenManager,self).__init__()
self.autogen_workers = autogen_workers
self.feedback_q = feedback_q
self.Status = True
+ self.error_event = error_event
def run(self):
try:
+ fin_num = 0
while True:
badnews = self.feedback_q.get()
if badnews is None:
+ break
+ if badnews == "Done":
+ fin_num += 1
+ else:
self.Status = False
self.TerminateWorkers()
+ if fin_num == len(self.autogen_workers):
+ self.clearQueue()
+ for w in self.autogen_workers:
+ w.join()
break
except Exception:
return
- def kill(self):
- self.feedback_q.put(None)
-
+ def clearQueue(self):
+ taskq = self.autogen_workers[0].module_queue
+ logq = self.autogen_workers[0].log_q
+ clearQ(taskq)
+ clearQ(self.feedback_q)
+ clearQ(logq)
def TerminateWorkers(self):
- for w in self.autogen_workers:
- if w.is_alive():
- w.terminate()
-
+ self.error_event.set()
+ def kill(self):
+ self.feedback_q.put(None)
class AutoGenWorkerInProcess(mp.Process):
- def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock, share_data,log_q):
+ def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock, share_data,log_q,error_event):
mp.Process.__init__(self)
self.module_queue = module_queue
self.data_pipe_file_path =data_pipe_file_path
self.data_pipe = None
self.feedback_q = feedback_q
self.PlatformMetaFileSet = {}
self.file_lock = file_lock
self.share_data = share_data
self.log_q = log_q
+ self.error_event = error_event
def GetPlatformMetaFile(self,filepath,root):
try:
return self.PlatformMetaFileSet[(filepath,root)]
except:
self.PlatformMetaFileSet[(filepath,root)] = filepath
@@ -161,17 +178,28 @@ class AutoGenWorkerInProcess(mp.Process):
os.environ._data = self.data_pipe.Get("Env_Var")
GlobalData.gWorkspace = workspacedir
GlobalData.gDisableIncludePathCheck = False
GlobalData.gFdfParser = self.data_pipe.Get("FdfParser")
GlobalData.gDatabasePath = self.data_pipe.Get("DatabasePath")
+ pcd_from_build_option = []
+ for pcd_tuple in self.data_pipe.Get("BuildOptPcd"):
+ pcd_id = ".".join((pcd_tuple[0],pcd_tuple[1]))
+ if pcd_tuple[2].strip():
+ pcd_id = ".".join((pcd_id,pcd_tuple[2]))
+ pcd_from_build_option.append("=".join((pcd_id,pcd_tuple[3])))
+ GlobalData.BuildOptionPcd = pcd_from_build_option
module_count = 0
FfsCmd = self.data_pipe.Get("FfsCommand")
if FfsCmd is None:
FfsCmd = {}
PlatformMetaFile = self.GetPlatformMetaFile(self.data_pipe.Get("P_Info").get("ActivePlatform"),
self.data_pipe.Get("P_Info").get("WorkspaceDir"))
+ libConstPcd = self.data_pipe.Get("LibConstPcd")
+ Refes = self.data_pipe.Get("REFS")
while not self.module_queue.empty():
+ if self.error_event.is_set():
+ break
module_count += 1
module_file,module_root,module_path,module_basename,module_originalpath,module_arch,IsLib = self.module_queue.get()
modulefullpath = os.path.join(module_root,module_file)
taskname = " : ".join((modulefullpath,module_arch))
module_metafile = PathClass(module_file,module_root)
@@ -184,18 +212,25 @@ class AutoGenWorkerInProcess(mp.Process):
arch = module_arch
target = self.data_pipe.Get("P_Info").get("Target")
toolchain = self.data_pipe.Get("P_Info").get("ToolChain")
Ma = ModuleAutoGen(self.Wa,module_metafile,target,toolchain,arch,PlatformMetaFile,self.data_pipe)
Ma.IsLibrary = IsLib
+ if IsLib:
+ if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path) in libConstPcd:
+ Ma.ConstPcd = libConstPcd[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
+ if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path) in Refes:
+ Ma.ReferenceModules = Refes[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
Ma.CreateCodeFile()
Ma.CreateMakeFile(GenFfsList=FfsCmd.get((Ma.MetaFile.File, Ma.Arch),[]))
Ma.CreateAsBuiltInf()
except Empty:
pass
except:
traceback.print_exc(file=sys.stdout)
self.feedback_q.put(taskname)
+ finally:
+ self.feedback_q.put("Done")
def printStatus(self):
print("Processs ID: %d Run %d modules in AutoGen " % (os.getpid(),len(AutoGen.Cache())))
print("Processs ID: %d Run %d modules in AutoGenInfo " % (os.getpid(),len(AutoGenInfo.GetCache())))
groupobj = {}
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
index 33d2b14c9add..2052084bdb4b 100644
--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -72,11 +72,11 @@ class MemoryDataPipe(DataPipe):
#Platform Module Pcds
ModulePcds = {}
for m in PlatformInfo.Platform.Modules:
m_pcds = PlatformInfo.Platform.Modules[m].Pcds
if m_pcds:
- ModulePcds[(m.File,m.Root)] = [PCD_DATA(
+ ModulePcds[(m.File,m.Root,m.Arch)] = [PCD_DATA(
pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type,
pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue,
pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.validateranges,
pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd.TokenValue)
for pcd in PlatformInfo.Platform.Modules[m].Pcds.values()]
@@ -84,15 +84,22 @@ class MemoryDataPipe(DataPipe):
self.DataContainer = {"MOL_PCDS":ModulePcds}
#Module's Library Instance
ModuleLibs = {}
+ libModules = {}
for m in PlatformInfo.Platform.Modules:
module_obj = BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain]
Libs = GetModuleLibInstances(module_obj, PlatformInfo.Platform, BuildDB.BuildObject, PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain)
- ModuleLibs[(m.File,m.Root,module_obj.Arch)] = [(l.MetaFile.File,l.MetaFile.Root,l.Arch) for l in Libs]
+ for lib in Libs:
+ try:
+ libModules[(lib.MetaFile.File,lib.MetaFile.Root,lib.Arch,lib.MetaFile.Path)].append((m.File,m.Root,module_obj.Arch,m.Path))
+ except:
+ libModules[(lib.MetaFile.File,lib.MetaFile.Root,lib.Arch,lib.MetaFile.Path)] = [(m.File,m.Root,module_obj.Arch,m.Path)]
+ ModuleLibs[(m.File,m.Root,module_obj.Arch,m.Path)] = [(l.MetaFile.File,l.MetaFile.Root,l.Arch,l.MetaFile.Path) for l in Libs]
self.DataContainer = {"DEPS":ModuleLibs}
+ self.DataContainer = {"REFS":libModules}
#Platform BuildOptions
platform_build_opt = PlatformInfo.EdkIIBuildOption
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
index 5186ca1da3e3..c7591253debd 100644
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
@@ -595,14 +595,17 @@ class PlatformInfo(AutoGenInfo):
def ApplyLibraryInstance(self,module):
alldeps = self.DataPipe.Get("DEPS")
if alldeps is None:
alldeps = {}
- mod_libs = alldeps.get((module.MetaFile.File,module.MetaFile.Root,module.Arch),[])
+ mod_libs = alldeps.get((module.MetaFile.File,module.MetaFile.Root,module.Arch,module.MetaFile.Path),[])
retVal = []
- for (file_path,root,arch) in mod_libs:
- retVal.append(self.Wa.BuildDatabase[PathClass(file_path,root), arch, self.Target,self.ToolChain])
+ for (file_path,root,arch,abs_path) in mod_libs:
+ libMetaFile = PathClass(file_path,root)
+ libMetaFile.OriginalPath = PathClass(file_path,root)
+ libMetaFile.Path = abs_path
+ retVal.append(self.Wa.BuildDatabase[libMetaFile, arch, self.Target,self.ToolChain])
return retVal
## Parse build_rule.txt in Conf Directory.
#
# @retval BuildRule object
diff --git a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
index 9885c6a3a3bf..2a614e6a7134 100644
--- a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
@@ -131,10 +131,16 @@ class PlatformAutoGen(AutoGen):
self.DataPipe = MemoryDataPipe(self.BuildDir)
self.DataPipe.FillData(self)
return True
+ def FillData_LibConstPcd(self):
+ libConstPcd = {}
+ for LibAuto in self.LibraryAutoGenList:
+ if LibAuto.ConstPcd:
+ libConstPcd[(LibAuto.MetaFile.File,LibAuto.MetaFile.Root,LibAuto.Arch,LibAuto.MetaFile.Path)] = LibAuto.ConstPcd
+ self.DataPipe.DataContainer = {"LibConstPcd":libConstPcd}
## hash() operator of PlatformAutoGen
#
# The platform file path and arch string will be used to represent
# hash value of this object
#
@@ -1080,11 +1086,14 @@ class PlatformAutoGen(AutoGen):
@cached_property
def GetAllModuleInfo(self,WithoutPcd=True):
ModuleLibs = set()
for m in self.Platform.Modules:
module_obj = self.BuildDatabase[m,self.Arch,self.BuildTarget,self.ToolChain]
- Libs = GetModuleLibInstances(module_obj, self.Platform, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain)
+ if not bool(module_obj.LibraryClass):
+ Libs = GetModuleLibInstances(module_obj, self.Platform, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain)
+ else:
+ Libs = []
ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,l.MetaFile.Path,l.MetaFile.BaseName,l.MetaFile.OriginalPath,l.Arch,True) for l in Libs]))
if WithoutPcd and module_obj.PcdIsDriver:
continue
ModuleLibs.add((m.File,m.Root,m.Path,m.BaseName,m.OriginalPath,module_obj.Arch,bool(module_obj.LibraryClass)))
@@ -1335,29 +1344,39 @@ class PlatformAutoGen(AutoGen):
else:
PlatformModuleOptions = {}
return ModuleTypeOptions,PlatformModuleOptions
+ def ModuleGuid(self,Module):
+ if os.path.basename(Module.MetaFile.File) != os.path.basename(Module.MetaFile.Path):
+ #
+ # Length of GUID is 36
+ #
+ return os.path.basename(Module.MetaFile.Path)[:36]
+ return Module.Guid
@cached_property
def UniqueBaseName(self):
retVal ={}
- name_path_map = {}
+ ModuleNameDict = {}
+ UniqueName = {}
for Module in self._MbList:
- name_path_map[Module.BaseName] = set()
- for Module in self._MbList:
- name_path_map[Module.BaseName].add(Module.MetaFile)
- for name in name_path_map:
- if len(name_path_map[name]) > 1:
- guidset = set()
- for metafile in name_path_map[name]:
- m = self.BuildDatabase[metafile, self.Arch, self.BuildTarget, self.ToolChain]
- retVal[name] = '%s_%s' % (name, m.Guid)
- guidset.add(m.Guid)
- samemodules = list(name_path_map[name])
- if len(guidset) > 1:
- EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
+ unique_base_name = '%s_%s' % (Module.BaseName,self.ModuleGuid(Module))
+ if unique_base_name not in ModuleNameDict:
+ ModuleNameDict[unique_base_name] = []
+ ModuleNameDict[unique_base_name].append(Module.MetaFile)
+ if Module.BaseName not in UniqueName:
+ UniqueName[Module.BaseName] = set()
+ UniqueName[Module.BaseName].add((self.ModuleGuid(Module),Module.MetaFile))
+ for module_paths in ModuleNameDict.values():
+ if len(module_paths) > 1 and len(set(module_paths))>1:
+ samemodules = list(set(module_paths))
+ EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
' %s\n %s' % (samemodules[0], samemodules[1]))
+ for name in UniqueName:
+ Guid_Path = UniqueName[name]
+ if len(Guid_Path) > 1:
+ retVal[name] = '%s_%s' % (name,Guid_Path.pop()[0])
return retVal
## Expand * in build option key
#
# @param Options Options to be expanded
# @param ToolDef Use specified ToolDef instead of full version.
diff --git a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
index ab58b21772c3..4ad92653a238 100644
--- a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
@@ -111,10 +111,12 @@ class WorkspaceAutoGen(AutoGen):
self.ProcessModuleFromPdf()
self.ProcessPcdType()
self.ProcessMixedPcd()
self.VerifyPcdsFromFDF()
self.CollectAllPcds()
+ for Pa in self.AutoGenObjectList:
+ Pa.FillData_LibConstPcd()
self.GeneratePkgLevelHash()
#
# Check PCDs token value conflict in each DEC file.
#
self._CheckAllPcdsTokenValueConflict()
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 603d3aa6dad4..dc92495f3f08 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -707,11 +707,11 @@ class Build():
self.Fdf = BuildOptions.FdfFile
self.FdList = BuildOptions.RomImage
self.FvList = BuildOptions.FvImage
self.CapList = BuildOptions.CapName
self.SilentMode = BuildOptions.SilentMode
- self.ThreadNumber = BuildOptions.ThreadNumber
+ self.ThreadNumber = 1
self.SkipAutoGen = BuildOptions.SkipAutoGen
self.Reparse = BuildOptions.Reparse
self.SkuId = BuildOptions.SkuId
if self.SkuId:
GlobalData.gSKUID_CMD = self.SkuId
@@ -812,31 +812,32 @@ class Build():
EdkLogger.info("")
os.chdir(self.WorkspaceDir)
self.share_data = Manager().dict()
self.log_q = log_q
def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,share_data):
- if SkipAutoGen:
- return
- feedback_q = mp.Queue()
- file_lock = mp.Lock()
- auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data,self.log_q) for _ in range(self.ThreadNumber)]
- self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q)
- self.AutoGenMgr.start()
- for w in auto_workers:
- w.start()
- if PcdMaList is not None:
- for PcdMa in PcdMaList:
- PcdMa.CreateCodeFile(True)
- PcdMa.CreateMakeFile(GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.File, PcdMa.Arch),[]))
- PcdMa.CreateAsBuiltInf()
- for w in auto_workers:
- w.join()
- rt = self.AutoGenMgr.Status
- self.AutoGenMgr.kill()
- self.AutoGenMgr.join()
- self.AutoGenMgr = None
- return rt
+ try:
+ if SkipAutoGen:
+ return True,0
+ feedback_q = mp.Queue()
+ file_lock = mp.Lock()
+ error_event = mp.Event()
+ auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data,self.log_q,error_event) for _ in range(self.ThreadNumber)]
+ self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q,error_event)
+ self.AutoGenMgr.start()
+ for w in auto_workers:
+ w.start()
+ if PcdMaList is not None:
+ for PcdMa in PcdMaList:
+ PcdMa.CreateCodeFile(True)
+ PcdMa.CreateMakeFile(GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.File, PcdMa.Arch),[]))
+ PcdMa.CreateAsBuiltInf()
+
+ self.AutoGenMgr.join()
+ rt = self.AutoGenMgr.Status
+ return rt, 0
+ except Exception as e:
+ return False,e.errcode
## Load configuration
#
# This method will parse target.txt and get the build configurations.
#
@@ -880,23 +881,10 @@ class Build():
ToolChainFamily.append(TAB_COMPILER_MSFT)
else:
ToolChainFamily.append(ToolDefinition[TAB_TOD_DEFINES_FAMILY][Tool])
self.ToolChainFamily = ToolChainFamily
- if self.ThreadNumber is None:
- self.ThreadNumber = self.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
- if self.ThreadNumber == '':
- self.ThreadNumber = 0
- else:
- self.ThreadNumber = int(self.ThreadNumber, 0)
-
- if self.ThreadNumber == 0:
- try:
- self.ThreadNumber = multiprocessing.cpu_count()
- except (ImportError, NotImplementedError):
- self.ThreadNumber = 1
-
if not self.PlatformFile:
PlatformFile = self.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_ACTIVE_PLATFORM]
if not PlatformFile:
# Try to find one in current directory
WorkingDirectory = os.getcwd()
@@ -911,10 +899,11 @@ class Build():
EdkLogger.error("build", RESOURCE_NOT_AVAILABLE,
ExtraData="No active platform specified in target.txt or command line! Nothing can be built.\n")
self.PlatformFile = PathClass(NormFile(PlatformFile, self.WorkspaceDir), self.WorkspaceDir)
+ self.ThreadNumber = ThreadNum()
## Initialize build configuration
#
# This method will parse DSC file and merge the configurations from
# command line and target.txt, then get the final build configurations.
#
@@ -1213,12 +1202,16 @@ class Build():
AutoGenObject.DataPipe.DataContainer = {"FfsCommand":FfsCommand}
self.Progress.Start("Generating makefile and code")
data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
AutoGenObject.DataPipe.dump(data_pipe_file)
- autogen_rt = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
+ autogen_rt,errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
self.Progress.Stop("done!")
+ if not autogen_rt:
+ self.AutoGenMgr.TerminateWorkers()
+ self.AutoGenMgr.join(0.1)
+ raise FatalError(errorcode)
return autogen_rt
else:
# always recreate top/platform makefile when clean, just in case of inconsistency
AutoGenObject.CreateCodeFile(False)
AutoGenObject.CreateMakeFile(False)
@@ -1717,10 +1710,11 @@ class Build():
Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
if Ma is None:
continue
if Ma.PcdIsDriver:
Ma.PlatformInfo = Pa
+ Ma.Workspace = Wa
PcdMaList.append(Ma)
self.BuildModules.append(Ma)
self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict,PcdMaList=PcdMaList)
# Create MAP file when Load Fix Address is enabled.
@@ -2045,14 +2039,15 @@ class Build():
mqueue = mp.Queue()
for m in Pa.GetAllModuleInfo:
mqueue.put(m)
data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
Pa.DataPipe.dump(data_pipe_file)
- autogen_rt = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
-
+ autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
if not autogen_rt:
- return
+ self.AutoGenMgr.TerminateWorkers()
+ self.AutoGenMgr.join(0.1)
+ raise FatalError(errorcode)
self.AutoGenTime += int(round((time.time() - AutoGenStart)))
self.Progress.Stop("done!")
for Arch in Wa.ArchList:
MakeStart = time.time()
for Ma in BuildModules:
@@ -2286,20 +2281,35 @@ def LogBuildTime(Time):
TimeDurStr = time.strftime("%H:%M:%S", TimeDur)
return TimeDurStr
else:
return None
+def ThreadNum():
+ ThreadNumber = BuildOption.ThreadNumber
+ if ThreadNumber is None:
+ ThreadNumber = TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
+ if ThreadNumber == '':
+ ThreadNumber = 0
+ else:
+ ThreadNumber = int(ThreadNumber, 0)
+
+ if ThreadNumber == 0:
+ try:
+ ThreadNumber = multiprocessing.cpu_count()
+ except (ImportError, NotImplementedError):
+ ThreadNumber = 1
+ return ThreadNumber
## Tool entrance method
#
# This method mainly dispatch specific methods per the command line options.
# If no error found, return zero value so the caller of this tool can know
# if it's executed successfully or not.
#
# @retval 0 Tool was successful
# @retval 1 Tool failed
#
-LogQMaxSize = 60
+LogQMaxSize = ThreadNum() * 10
def Main():
StartTime = time.time()
#
# Create a log Queue
@@ -2432,13 +2442,11 @@ def Main():
else:
EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
ReturnCode = FORMAT_INVALID
except KeyboardInterrupt:
if MyBuild is not None:
- if MyBuild.AutoGenMgr:
- MyBuild.AutoGenMgr.TerminateWorkers()
- MyBuild.AutoGenMgr.kill()
+
# for multi-thread build exits safely
MyBuild.Relinquish()
ReturnCode = ABORT_ERROR
if Option is not None and Option.debug is not None:
EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
@@ -2495,10 +2503,11 @@ def Main():
Log_Agent.kill()
Log_Agent.join()
return ReturnCode
if __name__ == '__main__':
+ mp.set_start_method('spawn')
r = Main()
## 0-127 is a safe return range, and 1 is a standard default error
if r < 0 or r > 127: r = 1
sys.exit(r)
--
2.20.1.windows.1
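[Editorial note: the coordination this patch adds — each worker posting a "Done" token on a shared feedback queue in a finally clause, any other message counting as bad news, and an error event letting the manager stop the pool early — can be sketched with threads instead of processes. Names are hypothetical and the logic is simplified from AutoGenWorkerInProcess/AutoGenManager:

```python
import queue
import threading

def worker(task_q, feedback_q, error_event):
    # Drain tasks until the queue is empty or another worker signalled an error.
    try:
        while not error_event.is_set():
            try:
                task = task_q.get_nowait()
            except queue.Empty:
                break
            if task == "bad":
                raise ValueError(task)
    except Exception:
        feedback_q.put("task failed")   # any non-"Done" message is bad news
    finally:
        feedback_q.put("Done")          # mirrors the patch's finally clause

def run_pool(tasks, worker_count=2):
    task_q, feedback_q = queue.Queue(), queue.Queue()
    error_event = threading.Event()
    for t in tasks:
        task_q.put(t)
    workers = [threading.Thread(target=worker,
                                args=(task_q, feedback_q, error_event))
               for _ in range(worker_count)]
    for w in workers:
        w.start()
    # Manager loop: success only if every worker reports "Done" and no bad news.
    fin_num, status = 0, True
    while fin_num < worker_count:
        msg = feedback_q.get()
        if msg == "Done":
            fin_num += 1
        else:
            status = False
            error_event.set()           # ask the remaining workers to stop early
    for w in workers:
        w.join()
    return status
```

run_pool(["a", "b", "c"]) returns True; run_pool(["a", "bad"]) returns False and sets the error event so idle workers exit without draining further tasks.]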
* Re: [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
` (10 preceding siblings ...)
2019-07-29 8:44 ` [Patch 11/11] BaseTools: Enhance Multiple-Process AutoGen Bob Feng
@ 2019-07-29 10:10 ` Laszlo Ersek
2019-07-30 7:31 ` Bob Feng
11 siblings, 1 reply; 18+ messages in thread
From: Laszlo Ersek @ 2019-07-29 10:10 UTC (permalink / raw)
To: bob.c.feng; +Cc: devel
Hi Bob,
On 07/29/19 10:44, Bob Feng wrote:
> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>
> In order to improve the build performance, we implemented
> multiple-processes AutoGen. This change will reduce 20% time
> for AutoGen phase.
>
> The design document can be got from:
> https://edk2.groups.io/g/devel/files/Designs/2019/0627/Multiple-thread-AutoGen.pdf
>
> This patch serial pass the build of Ovmf, MinKabylake, MinPurley,
> packages under Edk2 repository and intel client and server platforms.
>
> V4:
> Add one more patch 11/11 to enhance this feature. 1-10 are the same as
> V3
> 1. Set Log queue maxsize as thread number * 10
> 2. enhance ModuleUniqueBaseName function
> 3. fix bugs of build option pcd in sub Process
> 4. enhance error handling. Handle the exception of
> KeyboardInterrup and exceptions happen in subprocess.
> 5. fix the issue of shared fixed pcd between module and lib.
> 6. fix bug in the function of duplicate modules handling.
Item #1 seems to be in response to my v3 comment at:
http://mid.mail-archive.com/e3f68d77-4837-0ef6-ab4f-95e50c4621ff@redhat.com
https://edk2.groups.io/g/devel/message/44241
Therefore, I understand why item#1 is in scope for the v4 update.
However, the other updates in v4 (items #2 through #6) do not seem to
address review feedback for v3. I'm saying that because I cannot see
*any* feedback under v3 other than mine.
At
http://mid.mail-archive.com/08650203BA1BD64D8AD9B6D5D74A85D160B468EB@SHSMSX105.ccr.corp.intel.com
https://edk2.groups.io/g/devel/message/44249
you wrote,
> I'd like to collect more comments on other parts and update all the
> comments in V4.
Question#1: where are all those comments that justify the v4 updates
#2-#6? Did you get them in private (off-list)? Or did you determine the
necessity of #2-#6 yourself, without review feedback?
--*--
The following v4 updates are certainly bugfixes, relative to v3: #3, #5,
#6.
The following v4 updates *may* be bugfixes rather than additional
features ("enhancements") -- I can't tell myself, because they are not
explained deeply enough: #2, #4.
The point is, bugs that are known to be introduced by patches 01 through
10 should not be fixed up separately, in an incremental patch. Instead,
you should split out *minimally* the #3, #5, and #6 bugfixes, and squash
them into the appropriate patches between 01 and 10 (boundaries included
of course), please.
For example, regarding item #1, the following change from patch 11:
> -LogQMaxSize = 60
> +LogQMaxSize = ThreadNum() * 10
is wrong. Instead, you should update patch 10, so that when the log
agent is introduced, it be introduced at once with "LogQMaxSize =
ThreadNum() * 10".
The same applies to (minimally) #3, #5, and #6. Known bugs should not be
introduced mid-series, even temporarily. The bugs should be fixed up
inside the specific patches that introduce them in v3, and not in an
incremental patch.
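[Editorial note: one common way to fold such fixes back into the patches that introduced them is git's fixup/autosquash workflow. The commands below are illustrative; <target-sha> and <base-sha> are placeholders for the buggy patch and the branch base:

```shell
# Commit the bugfix as a "fixup!" of the patch that introduced the bug.
git commit --fixup=<target-sha>

# Replay the series; --autosquash reorders each fixup commit next to its
# target and squashes it in, so no known-bad state survives mid-series.
git rebase -i --autosquash <base-sha>
```
]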
If items #2 and #4 are indeed enhancements and not bugfixes (that is,
the series works fine without #2 / #4, functionally speaking, but #2/#4
improve some aspects, such as performance, user experience, etc), then
keeping them in separate patches might, or might not, make sense. That's
up to you, but even if you decide to separate them out of patches 01 to
10, they should still be isolated from *each other*.
Request#2: please restructure the patch series as explained.
--*--
The v4 cover letter, and patch v4 11/11, refer to a function called
"ModuleUniqueBaseName". I can't find the identifier
"ModuleUniqueBaseName" in the series.
Request#3: please clean up the cover letter and the commit messages. In
addition, please explain the v(n) --> v(n+1) updates in a lot more
detail, in the series cover letter. For example, item #5 seems like a
pretty serious bugfix, but nothing is explained about the nature of the
issue.
Thanks
Laszlo
* Re: [edk2-devel] [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
2019-07-29 8:44 ` [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions Bob Feng
@ 2019-07-29 15:03 ` Philippe Mathieu-Daudé
2019-07-30 2:10 ` Bob Feng
0 siblings, 1 reply; 18+ messages in thread
From: Philippe Mathieu-Daudé @ 2019-07-29 15:03 UTC (permalink / raw)
To: devel, bob.c.feng; +Cc: Liming Gao
Hi Bob,
On 7/29/19 10:44 AM, Bob Feng wrote:
> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>
> The WorkspaceAutoGen.__InitWorker function is too long, it's hard
> to read and understand.
> This patch is to separate the __InitWorker into multiple small ones.
The patch looks good; however, it is not trivial to review: you are
refactoring 1 big function into 13 (or 14) ones. It would be much simpler
to review if you used 1 patch per extracted function. If you don't mind
and it is not too much effort, that would be appreciated. If you prefer
not to, I'll review your patch more carefully.
Thanks,
Phil.
>
> Cc: Liming Gao <liming.gao@intel.com>
> Signed-off-by: Bob Feng <bob.c.feng@intel.com>
> ---
> BaseTools/Source/Python/AutoGen/AutoGen.py | 247 +++++++++++++--------
> 1 file changed, 152 insertions(+), 95 deletions(-)
>
> diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
> index c5b3fbb0a87f..9e06bb942126 100644
> --- a/BaseTools/Source/Python/AutoGen/AutoGen.py
> +++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
> @@ -333,13 +333,58 @@ class WorkspaceAutoGen(AutoGen):
> self._GuidDict = {}
>
> # there's many relative directory operations, so ...
> os.chdir(self.WorkspaceDir)
>
> + self.MergeArch()
> + self.ValidateBuildTarget()
> +
> + EdkLogger.info("")
> + if self.ArchList:
> + EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
> + EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
> + EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
> +
> + EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
> + if BuildModule:
> + EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
> +
> + if self.FdfFile:
> + EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
> +
> + EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
> +
> + if Progress:
> + Progress.Start("\nProcessing meta-data")
> #
> - # Merge Arch
> + # Mark now build in AutoGen Phase
> #
> + GlobalData.gAutoGenPhase = True
> + self.ProcessModuleFromPdf()
> + self.ProcessPcdType()
> + self.ProcessMixedPcd()
> + self.GetPcdsFromFDF()
> + self.CollectAllPcds()
> + self.GeneratePkgLevelHash()
> + #
> + # Check PCDs token value conflict in each DEC file.
> + #
> + self._CheckAllPcdsTokenValueConflict()
> + #
> + # Check PCD type and definition between DSC and DEC
> + #
> + self._CheckPcdDefineAndType()
> +
> + self.CreateBuildOptionsFile()
> + self.CreatePcdTokenNumberFile()
> + self.CreateModuleHashInfo()
> + GlobalData.gAutoGenPhase = False
> +
> + #
> + # Merge Arch
> + #
> + def MergeArch(self):
> if not self.ArchList:
> ArchList = set(self.Platform.SupArchList)
> else:
> ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
> if not ArchList:
> @@ -349,57 +394,49 @@ class WorkspaceAutoGen(AutoGen):
> SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
> EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
> % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
> self.ArchList = tuple(ArchList)
>
> - # Validate build target
> + # Validate build target
> + def ValidateBuildTarget(self):
> if self.BuildTarget not in self.Platform.BuildTargets:
> EdkLogger.error("build", PARAMETER_INVALID,
> ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
> % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
> -
> -
> - # parse FDF file to get PCDs in it, if any
> + @cached_property
> + def FdfProfile(self):
> if not self.FdfFile:
> self.FdfFile = self.Platform.FlashDefinition
>
> - EdkLogger.info("")
> - if self.ArchList:
> - EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
> - EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
> - EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
> -
> - EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
> - if BuildModule:
> - EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
> -
> + FdfProfile = None
> if self.FdfFile:
> - EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
> -
> - EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
> -
> - if Progress:
> - Progress.Start("\nProcessing meta-data")
> -
> - if self.FdfFile:
> - #
> - # Mark now build in AutoGen Phase
> - #
> - GlobalData.gAutoGenPhase = True
> Fdf = FdfParser(self.FdfFile.Path)
> Fdf.ParseFile()
> GlobalData.gFdfParser = Fdf
> - GlobalData.gAutoGenPhase = False
> - PcdSet = Fdf.Profile.PcdDict
> if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
> FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
> for FdRegion in FdDict.RegionList:
> if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
> if int(FdRegion.Offset) % 8 != 0:
> EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
> - ModuleList = Fdf.Profile.InfList
> - self.FdfProfile = Fdf.Profile
> + FdfProfile = Fdf.Profile
> + else:
> + if self.FdTargetList:
> + EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
> + self.FdTargetList = []
> + if self.FvTargetList:
> + EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
> + self.FvTargetList = []
> + if self.CapTargetList:
> + EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
> + self.CapTargetList = []
> +
> + return FdfProfile
> +
> + def ProcessModuleFromPdf(self):
> +
> + if self.FdfProfile:
> for fvname in self.FvTargetList:
> if fvname.upper() not in self.FdfProfile.FvDict:
> EdkLogger.error("build", OPTION_VALUE_INVALID,
> "No such an FV in FDF file: %s" % fvname)
>
> @@ -407,64 +444,60 @@ class WorkspaceAutoGen(AutoGen):
> # but the path (self.MetaFile.Path) is the real path
> for key in self.FdfProfile.InfDict:
> if key == 'ArchTBD':
> MetaFile_cache = defaultdict(set)
> for Arch in self.ArchList:
> - Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
> + Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> for Pkey in Current_Platform_cache.Modules:
> MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
> for Inf in self.FdfProfile.InfDict[key]:
> ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> for Arch in self.ArchList:
> if ModuleFile in MetaFile_cache[Arch]:
> break
> else:
> - ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
> + ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> if not ModuleData.IsBinaryModule:
> EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
>
> else:
> for Arch in self.ArchList:
> if Arch == key:
> - Platform = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> MetaFileList = set()
> for Pkey in Platform.Modules:
> MetaFileList.add(Platform.Modules[Pkey].MetaFile)
> for Inf in self.FdfProfile.InfDict[key]:
> ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> if ModuleFile in MetaFileList:
> continue
> - ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
> + ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> if not ModuleData.IsBinaryModule:
> EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
>
> - else:
> - PcdSet = {}
> - ModuleList = []
> - self.FdfProfile = None
> - if self.FdTargetList:
> - EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
> - self.FdTargetList = []
> - if self.FvTargetList:
> - EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
> - self.FvTargetList = []
> - if self.CapTargetList:
> - EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
> - self.CapTargetList = []
> -
> - # apply SKU and inject PCDs from Flash Definition file
> +
> +
> + # parse FDF file to get PCDs in it, if any
> + def GetPcdsFromFDF(self):
> +
> + if self.FdfProfile:
> + PcdSet = self.FdfProfile.PcdDict
> + # handle the mixed pcd in FDF file
> + for key in PcdSet:
> + if key in GlobalData.MixedPcd:
> + Value = PcdSet[key]
> + del PcdSet[key]
> + for item in GlobalData.MixedPcd[key]:
> + PcdSet[item] = Value
> + self.VerifyPcdDeclearation(PcdSet)
> +
> + def ProcessPcdType(self):
> for Arch in self.ArchList:
> - Platform = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
> - PlatformPcds = Platform.Pcds
> - self._GuidDict = Platform._GuidDict
> - SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
> - BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
> - SourcePcdDict_Keys = SourcePcdDict.keys()
> - BinaryPcdDict_Keys = BinaryPcdDict.keys()
> -
> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> + Platform.Pcds
> # generate the SourcePcdDict and BinaryPcdDict
> - PGen = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
> + PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
> if BuildData.Arch != Arch:
> continue
> if BuildData.MetaFile.Ext == '.inf':
> for key in BuildData.Pcds:
> @@ -483,11 +516,11 @@ class WorkspaceAutoGen(AutoGen):
> BuildData.Pcds[key].Type = PcdInPlatform.Type
> BuildData.Pcds[key].Pending = False
> else:
> #Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
> if BuildData.Pcds[key].Pending:
> - MGen = ModuleAutoGen(self, BuildData.MetaFile, Target, Toolchain, Arch, self.MetaFile)
> + MGen = ModuleAutoGen(self, BuildData.MetaFile, self.BuildTarget, self.ToolChain, Arch, self.MetaFile)
> if MGen and MGen.IsLibrary:
> if MGen in PGen.LibraryAutoGenList:
> ReferenceModules = MGen.ReferenceModules
> for ReferenceModule in ReferenceModules:
> if ReferenceModule.MetaFile in Platform.Modules:
> @@ -497,10 +530,24 @@ class WorkspaceAutoGen(AutoGen):
> if PcdInReferenceModule.Type:
> BuildData.Pcds[key].Type = PcdInReferenceModule.Type
> BuildData.Pcds[key].Pending = False
> break
>
> + def ProcessMixedPcd(self):
> + for Arch in self.ArchList:
> + SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
> + BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
> + SourcePcdDict_Keys = SourcePcdDict.keys()
> + BinaryPcdDict_Keys = BinaryPcdDict.keys()
> +
> + # generate the SourcePcdDict and BinaryPcdDict
> + PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> + for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
> + if BuildData.Arch != Arch:
> + continue
> + if BuildData.MetaFile.Ext == '.inf':
> + for key in BuildData.Pcds:
> if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
> if BuildData.IsBinaryModule:
> BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> else:
> SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> @@ -514,12 +561,11 @@ class WorkspaceAutoGen(AutoGen):
>
> elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
> SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
> SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> - else:
> - pass
> +
> #
> # A PCD can only use one type for all source modules
> #
> for i in SourcePcdDict_Keys:
> for j in SourcePcdDict_Keys:
> @@ -588,27 +634,38 @@ class WorkspaceAutoGen(AutoGen):
> del BuildData.Pcds[key]
> BuildData.Pcds[newkey] = Value
> break
> break
>
> - # handle the mixed pcd in FDF file
> - for key in PcdSet:
> - if key in GlobalData.MixedPcd:
> - Value = PcdSet[key]
> - del PcdSet[key]
> - for item in GlobalData.MixedPcd[key]:
> - PcdSet[item] = Value
> + #Collect package set information from INF of FDF
> + @cached_property
> + def PkgSet(self):
> + if not self.FdfFile:
> + self.FdfFile = self.Platform.FlashDefinition
>
> - #Collect package set information from INF of FDF
> + if self.FdfFile:
> + ModuleList = self.FdfProfile.InfList
> + else:
> + ModuleList = []
> + Pkgs = {}
> + for Arch in self.ArchList:
> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> + PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> PkgSet = set()
> for Inf in ModuleList:
> ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> if ModuleFile in Platform.Modules:
> continue
> - ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
> + ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> PkgSet.update(ModuleData.Packages)
> - Pkgs = list(PkgSet) + list(PGen.PackageList)
> + Pkgs[Arch] = list(PkgSet) + list(PGen.PackageList)
> + return Pkgs
> +
> + def VerifyPcdDeclearation(self,PcdSet):
> + for Arch in self.ArchList:
> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> + Pkgs = self.PkgSet[Arch]
> DecPcds = set()
> DecPcdsKey = set()
> for Pkg in Pkgs:
> for Pcd in Pkg.Pcds:
> DecPcds.add((Pcd[0], Pcd[1]))
> @@ -636,37 +693,33 @@ class WorkspaceAutoGen(AutoGen):
> PARSER_ERROR,
> "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
> File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
> Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
> )
> + def CollectAllPcds(self):
>
> - Pa = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
> + for Arch in self.ArchList:
> + Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> #
> # Explicitly collect platform's dynamic PCDs
> #
> Pa.CollectPlatformDynamicPcds()
> Pa.CollectFixedAtBuildPcds()
> self.AutoGenObjectList.append(Pa)
>
> - #
> - # Generate Package level hash value
> - #
> + #
> + # Generate Package level hash value
> + #
> + def GeneratePkgLevelHash(self):
> + for Arch in self.ArchList:
> GlobalData.gPackageHash = {}
> if GlobalData.gUseHashCache:
> - for Pkg in Pkgs:
> + for Pkg in self.PkgSet[Arch]:
> self._GenPkgLevelHash(Pkg)
>
> - #
> - # Check PCDs token value conflict in each DEC file.
> - #
> - self._CheckAllPcdsTokenValueConflict()
> -
> - #
> - # Check PCD type and definition between DSC and DEC
> - #
> - self._CheckPcdDefineAndType()
>
> + def CreateBuildOptionsFile(self):
> #
> # Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
> #
> content = 'gCommandLineDefines: '
> content += str(GlobalData.gCommandLineDefines)
> @@ -681,27 +734,31 @@ class WorkspaceAutoGen(AutoGen):
> content += 'Flash Image Definition: '
> content += str(self.FdfFile)
> content += TAB_LINE_BREAK
> SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
>
> + def CreatePcdTokenNumberFile(self):
> #
> # Create PcdToken Number file for Dynamic/DynamicEx Pcd.
> #
> PcdTokenNumber = 'PcdTokenNumber: '
> - if Pa.PcdTokenNumber:
> - if Pa.DynamicPcdList:
> - for Pcd in Pa.DynamicPcdList:
> - PcdTokenNumber += TAB_LINE_BREAK
> - PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
> - PcdTokenNumber += ' : '
> - PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
> + for Arch in self.ArchList:
> + Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> + if Pa.PcdTokenNumber:
> + if Pa.DynamicPcdList:
> + for Pcd in Pa.DynamicPcdList:
> + PcdTokenNumber += TAB_LINE_BREAK
> + PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
> + PcdTokenNumber += ' : '
> + PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
> SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
>
> + def CreateModuleHashInfo(self):
> #
> # Get set of workspace metafiles
> #
> - AllWorkSpaceMetaFiles = self._GetMetaFiles(Target, Toolchain, Arch)
> + AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
>
> #
> # Retrieve latest modified time of all metafiles
> #
> SrcTimeStamp = 0
> @@ -759,11 +816,11 @@ class WorkspaceAutoGen(AutoGen):
> f.close()
> m.update(Content)
> SaveFileOnChange(HashFile, m.hexdigest(), False)
> GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
>
> - def _GetMetaFiles(self, Target, Toolchain, Arch):
> + def _GetMetaFiles(self, Target, Toolchain):
> AllWorkSpaceMetaFiles = set()
> #
> # add fdf
> #
> if self.FdfFile:
>
^ permalink raw reply [flat|nested] 18+ messages in thread
* Re: [edk2-devel] [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
2019-07-29 15:03 ` [edk2-devel] " Philippe Mathieu-Daudé
@ 2019-07-30 2:10 ` Bob Feng
2019-07-30 12:38 ` Philippe Mathieu-Daudé
0 siblings, 1 reply; 18+ messages in thread
From: Bob Feng @ 2019-07-30 2:10 UTC (permalink / raw)
To: Philippe Mathieu-Daudé, devel@edk2.groups.io; +Cc: Gao, Liming
Hi Phil,
Thanks for your comments. I agree it would be easier to review if this patch were split into multiple smaller ones.
Since the later patches in the multiple-process AutoGen series are based on this patch, recreating the whole series would take a bigger effort, so I would prefer not to change it.
I'm happy to answer questions about this patch.
Thanks,
Bob
-----Original Message-----
From: Philippe Mathieu-Daudé [mailto:philmd@redhat.com]
Sent: Monday, July 29, 2019 11:03 PM
To: devel@edk2.groups.io; Feng, Bob C <bob.c.feng@intel.com>
Cc: Gao, Liming <liming.gao@intel.com>
Subject: Re: [edk2-devel] [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
Hi Bob,
On 7/29/19 10:44 AM, Bob Feng wrote:
> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>
> The WorkspaceAutoGen.__InitWorker function is too long, it's hard to
> read and understand.
> This patch is to separate the __InitWorker into multiple small ones.
Patch looks good, but it is not trivial to review: you are refactoring 1 big function into 13 (or 14), so it would be much simpler to review with 1 patch per extracted function. If you don't mind and it is not too much effort, that would be appreciated. If you prefer not to, I'll review your patch more carefully.
Thanks,
Phil.
>
> Cc: Liming Gao <liming.gao@intel.com>
> Signed-off-by: Bob Feng <bob.c.feng@intel.com>
> ---
> BaseTools/Source/Python/AutoGen/AutoGen.py | 247
> +++++++++++++--------
> 1 file changed, 152 insertions(+), 95 deletions(-)
>
> diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py
> b/BaseTools/Source/Python/AutoGen/AutoGen.py
> index c5b3fbb0a87f..9e06bb942126 100644
> --- a/BaseTools/Source/Python/AutoGen/AutoGen.py
> +++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
> @@ -333,13 +333,58 @@ class WorkspaceAutoGen(AutoGen):
> self._GuidDict = {}
>
> # there's many relative directory operations, so ...
> os.chdir(self.WorkspaceDir)
>
> + self.MergeArch()
> + self.ValidateBuildTarget()
> +
> + EdkLogger.info("")
> + if self.ArchList:
> + EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
> + EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
> + EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
> +
> + EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
> + if BuildModule:
> + EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
> +
> + if self.FdfFile:
> + EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
> +
> + EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
> +
> + if Progress:
> + Progress.Start("\nProcessing meta-data")
> #
> - # Merge Arch
> + # Mark now build in AutoGen Phase
> #
> + GlobalData.gAutoGenPhase = True
> + self.ProcessModuleFromPdf()
> + self.ProcessPcdType()
> + self.ProcessMixedPcd()
> + self.GetPcdsFromFDF()
> + self.CollectAllPcds()
> + self.GeneratePkgLevelHash()
> + #
> + # Check PCDs token value conflict in each DEC file.
> + #
> + self._CheckAllPcdsTokenValueConflict()
> + #
> + # Check PCD type and definition between DSC and DEC
> + #
> + self._CheckPcdDefineAndType()
> +
> + self.CreateBuildOptionsFile()
> + self.CreatePcdTokenNumberFile()
> + self.CreateModuleHashInfo()
> + GlobalData.gAutoGenPhase = False
> +
> + #
> + # Merge Arch
> + #
> + def MergeArch(self):
> if not self.ArchList:
> ArchList = set(self.Platform.SupArchList)
> else:
> ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
> if not ArchList:
> @@ -349,57 +394,49 @@ class WorkspaceAutoGen(AutoGen):
> SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
> EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
> % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
> self.ArchList = tuple(ArchList)
>
> - # Validate build target
> + # Validate build target
> + def ValidateBuildTarget(self):
> if self.BuildTarget not in self.Platform.BuildTargets:
> EdkLogger.error("build", PARAMETER_INVALID,
> ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
> % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
> -
> -
> - # parse FDF file to get PCDs in it, if any
> + @cached_property
> + def FdfProfile(self):
> if not self.FdfFile:
> self.FdfFile = self.Platform.FlashDefinition
>
> - EdkLogger.info("")
> - if self.ArchList:
> - EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
> - EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
> - EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
> -
> - EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
> - if BuildModule:
> - EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
> -
> + FdfProfile = None
> if self.FdfFile:
> - EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
> -
> - EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
> -
> - if Progress:
> - Progress.Start("\nProcessing meta-data")
> -
> - if self.FdfFile:
> - #
> - # Mark now build in AutoGen Phase
> - #
> - GlobalData.gAutoGenPhase = True
> Fdf = FdfParser(self.FdfFile.Path)
> Fdf.ParseFile()
> GlobalData.gFdfParser = Fdf
> - GlobalData.gAutoGenPhase = False
> - PcdSet = Fdf.Profile.PcdDict
> if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
> FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
> for FdRegion in FdDict.RegionList:
> if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
> if int(FdRegion.Offset) % 8 != 0:
> EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
> - ModuleList = Fdf.Profile.InfList
> - self.FdfProfile = Fdf.Profile
> + FdfProfile = Fdf.Profile
> + else:
> + if self.FdTargetList:
> + EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
> + self.FdTargetList = []
> + if self.FvTargetList:
> + EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
> + self.FvTargetList = []
> + if self.CapTargetList:
> + EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
> + self.CapTargetList = []
> +
> + return FdfProfile
> +
> + def ProcessModuleFromPdf(self):
> +
> + if self.FdfProfile:
> for fvname in self.FvTargetList:
> if fvname.upper() not in self.FdfProfile.FvDict:
> EdkLogger.error("build", OPTION_VALUE_INVALID,
> "No such an FV in FDF file: %s" % fvname)
>
> [...]
* Re: [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen
2019-07-29 10:10 ` [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen Laszlo Ersek
@ 2019-07-30 7:31 ` Bob Feng
2019-07-30 14:02 ` Laszlo Ersek
0 siblings, 1 reply; 18+ messages in thread
From: Bob Feng @ 2019-07-30 7:31 UTC (permalink / raw)
To: devel@edk2.groups.io, lersek@redhat.com
Hi Laszlo,
1. Question#1
I did not receive any comments except yours about the LogQ MaxSize. 2#-6# were introduced based on my own testing results; they can be seen as bugfixes for regression issues.
2. Request#2
Yes, I can restructure the patch series and send out a V5. I created a separate 11/11 patch because I thought it would make it easier to tell what changed between V3 and V4.
3. Request#3
Yes, I'll give a clearer description in the V5 cover letter.
My testing compares the auto-generated files created by BaseTools with and without the patches.
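For reference, that kind of with/without comparison can be sketched roughly as follows (a hypothetical helper, not the actual validation scripts used for this series): hash every file under two output trees and report additions, removals, and changes.

```python
# Sketch: compare two AutoGen output trees byte-for-byte.
# Paths and function names here are illustrative, not BaseTools APIs.
import hashlib
import os


def tree_digests(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    digests = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                digests[rel] = hashlib.sha256(f.read()).hexdigest()
    return digests


def compare_trees(before, after):
    """Return (only_in_before, only_in_after, changed) relative paths."""
    a, b = tree_digests(before), tree_digests(after)
    only_a = sorted(set(a) - set(b))
    only_b = sorted(set(b) - set(a))
    changed = sorted(k for k in set(a) & set(b) if a[k] != b[k])
    return only_a, only_b, changed
```

Three empty lists from `compare_trees()` would mean the generated output is byte-for-byte identical with and without the patches.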
2#-6# resolve regression issues in the following usage scenarios:
1. One module is built multiple times in one build. (2# and 6#)
2. Exception handling: 1) code running in a sub-process raises an exception; 2) Ctrl+C. (4#)
3. --pcd is used on the build command line. (3#)
4. A FixedAtBuild PCD shared between a module and its libraries; this affects the content of AutoGen.h and AutoGen.c. (5#)
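The error-handling pattern in scenario 2 can be sketched roughly like this (a hypothetical illustration, not the actual BaseTools worker code): each sub-process reports failures back to the parent instead of dying silently, and the parent turns Ctrl+C into an orderly shutdown of its children.

```python
# Sketch of parent/worker exception handling for a multi-process phase.
# Names and the 100 // item "work" are stand-ins for the real AutoGen work.
import multiprocessing
import traceback


def _worker(task_q, result_q):
    """Consume tasks until the None sentinel; report failures, never die silently."""
    while True:
        item = task_q.get()
        if item is None:
            break
        try:
            result_q.put(("ok", item, 100 // item))  # work that can fail
        except KeyboardInterrupt:
            break  # parent pressed Ctrl+C: exit quietly
        except Exception:
            # exception objects may not pickle cleanly, so ship the text
            result_q.put(("error", item, traceback.format_exc()))


def run_parallel(items, nproc=2):
    # "fork" keeps this sketch self-contained; "spawn" would need an
    # importable module and an __main__ guard
    ctx = multiprocessing.get_context("fork")
    task_q, result_q = ctx.Queue(), ctx.Queue()
    procs = [ctx.Process(target=_worker, args=(task_q, result_q))
             for _ in range(nproc)]
    for p in procs:
        p.start()
    try:
        for item in items:
            task_q.put(item)
        for _ in procs:
            task_q.put(None)                        # one sentinel per worker
        results = [result_q.get() for _ in items]   # one result per task
        for p in procs:
            p.join()
    except KeyboardInterrupt:
        for p in procs:
            p.terminate()                           # stop children, then re-raise
        raise
    return results
```

With `run_parallel([1, 2, 0, 5])`, the `0` task yields an `("error", 0, ...)` tuple carrying the formatted traceback while the other tasks succeed, so the parent can report the failure instead of hanging or losing it.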
Thanks,
Bob
-----Original Message-----
From: devel@edk2.groups.io [mailto:devel@edk2.groups.io] On Behalf Of Laszlo Ersek
Sent: Monday, July 29, 2019 6:10 PM
To: Feng, Bob C <bob.c.feng@intel.com>
Cc: devel@edk2.groups.io
Subject: Re: [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen
Hi Bob,
On 07/29/19 10:44, Bob Feng wrote:
> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>
> In order to improve the build performance, we implemented
> multiple-processes AutoGen. This change will reduce 20% time for
> AutoGen phase.
>
> The design document can be got from:
> https://edk2.groups.io/g/devel/files/Designs/2019/0627/Multiple-thread
> -AutoGen.pdf
>
> This patch serial pass the build of Ovmf, MinKabylake, MinPurley,
> packages under Edk2 repository and intel client and server platforms.
>
> V4:
> Add one more patch 11/11 to enhance this feature. 1-10 are the same as
> V3
> 1. Set Log queue maxsize as thread number * 10 2. enhance
> ModuleUniqueBaseName function 3. fix bugs of build option pcd in sub
> Process 4. enhance error handling. Handle the exception of
> KeyboardInterrupt and exceptions happen in subprocess.
> 5. fix the issue of shared fixed pcd between module and lib.
> 6. fix bug in the function of duplicate modules handling.
Item #1 seems to be in response to my v3 comment at:
http://mid.mail-archive.com/e3f68d77-4837-0ef6-ab4f-95e50c4621ff@redhat.com
https://edk2.groups.io/g/devel/message/44241
Therefore, I understand why item#1 is in scope for the v4 update.
However, the other updates in v4 (items #2 through #6) do not seem to address review feedback for v3. I'm saying that because I cannot see
*any* feedback under v3 other than mine.
At
http://mid.mail-archive.com/08650203BA1BD64D8AD9B6D5D74A85D160B468EB@SHSMSX105.ccr.corp.intel.com
https://edk2.groups.io/g/devel/message/44249
you wrote,
> I'd like to collect more comments on other parts and update all the
> comments in V4.
Question#1: where are all those comments that justify the v4 updates #2-#6? Did you get them in private (off-list)? Or did you determine the necessity of #2-#6 yourself, without review feedback?
--*--
The following v4 updates are certainly bugfixes, relative to v3: #3, #5, #6.
The following v4 updates *may* be bugfixes rather than additional features ("enhancements") -- I can't tell myself, because they are not explained deeply enough: #2, #4.
The point is, bugs that are known to be introduced by patches 01 through
10 should not be fixed up separately, in an incremental patch. Instead, you should split out *minimally* the #3, #5, and #6 bugfixes, and squash them into the appropriate patches between 01 and 10 (boundaries included of course), please.
For example, regarding item #1, the following change from patch 11:
> -LogQMaxSize = 60
> +LogQMaxSize = ThreadNum() * 10
is wrong. Instead, you should update patch 10, so that when the log agent is introduced, it be introduced at once with "LogQMaxSize = ThreadNum() * 10".
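[For context, the intent of that one-liner can be sketched as follows. This is only an illustration: `ThreadNum` here is a stand-in for the BaseTools helper and is assumed to return the `-n` worker count (the CPU count in this sketch). A bounded queue sized proportionally to the worker count applies back-pressure: producers block on `put()` once the queue fills up, instead of the backlog growing without limit.]

```python
import multiprocessing

def ThreadNum():
    # stand-in for the BaseTools helper; assume the -n worker count,
    # defaulting to the CPU count for this sketch
    return multiprocessing.cpu_count()

# scale the log queue with the number of AutoGen workers instead of
# hard-coding 60, so more workers get proportionally more buffering
LogQMaxSize = ThreadNum() * 10
log_q = multiprocessing.Queue(maxsize=LogQMaxSize)
```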
The same applies to (minimally) #3, #5, and #6. Known bugs should not be introduced mid-series, even temporarily. The bugs should be fixed up inside the specific patches that introduce them in v3, and not in an incremental patch.
If items #2 and #4 are indeed enhancements and not bugfixes (that is, the series works fine without #2 / #4, functionally speaking, but #2/#4 improve some aspects, such as performance, user experience, etc), then keeping them in separate patches might, or might not, make sense. That's up to you, but even if you decide to separate them out of patches 01 to 10, they should still be isolated from *each other*.
Request#2: please restructure the patch series as explained.
--*--
The v4 cover letter, and patch v4 11/11, refer to a function called "ModuleUniqueBaseName". I can't find the identifier "ModuleUniqueBaseName" in the series.
Request#3: please clean up the cover letter and the commit messages. In addition, please explain the v(n) --> v(n+1) updates in a lot more detail, in the series cover letter. For example, item #5 seems like a pretty serious bugfix, but nothing is explained about the nature of the issue.
Thanks
Laszlo
^ permalink raw reply [flat|nested] 18+ messages in thread
* Re: [edk2-devel] [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
2019-07-30 2:10 ` Bob Feng
@ 2019-07-30 12:38 ` Philippe Mathieu-Daudé
0 siblings, 0 replies; 18+ messages in thread
From: Philippe Mathieu-Daudé @ 2019-07-30 12:38 UTC (permalink / raw)
To: Feng, Bob C, devel@edk2.groups.io; +Cc: Gao, Liming
On 7/30/19 4:10 AM, Feng, Bob C wrote:
> Hi Phil,
>
> Thanks for your comments. I agree it will be easier to review if this patch is split into multiple smaller ones.
> Since the later patches in the multiple-process AutoGen series are based on this patch, it would be a bigger effort to recreate the whole patch series, so I'd prefer not to change it...
The tradeoff is who has to make the bigger effort, and how many
times: once for the committer, or N times for the N reviewers...
> I'm willing to answer the questions for this patch.
>
> Thanks,
> Bob
>
> -----Original Message-----
> From: Philippe Mathieu-Daudé [mailto:philmd@redhat.com]
> Sent: Monday, July 29, 2019 11:03 PM
> To: devel@edk2.groups.io; Feng, Bob C <bob.c.feng@intel.com>
> Cc: Gao, Liming <liming.gao@intel.com>
> Subject: Re: [edk2-devel] [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions
>
> Hi Bob,
>
> On 7/29/19 10:44 AM, Bob Feng wrote:
>> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>>
>> The WorkspaceAutoGen.__InitWorker function is too long, it's hard to
>> read and understand.
>> This patch is to separate the __InitWorker into multiple small ones.
>
> The patch looks good, however it is not trivial to review: you are refactoring 1 big function into 13 (or 14) ones. It would be much simpler to review if you used 1 patch per extracted function. If you don't mind and it is not too much effort, that would be appreciated. If you prefer not to, I'll review your patch more carefully.
>
> Thanks,
>
> Phil.
>
>>
>> Cc: Liming Gao <liming.gao@intel.com>
>> Signed-off-by: Bob Feng <bob.c.feng@intel.com>
>> ---
>> BaseTools/Source/Python/AutoGen/AutoGen.py | 247 +++++++++++++--------
>> 1 file changed, 152 insertions(+), 95 deletions(-)
>>
>> diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
>> index c5b3fbb0a87f..9e06bb942126 100644
>> --- a/BaseTools/Source/Python/AutoGen/AutoGen.py
>> +++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
>> @@ -333,13 +333,58 @@ class WorkspaceAutoGen(AutoGen):
>> self._GuidDict = {}
>>
>> # there's many relative directory operations, so ...
>> os.chdir(self.WorkspaceDir)
>>
>> + self.MergeArch()
>> + self.ValidateBuildTarget()
>> +
>> + EdkLogger.info("")
>> + if self.ArchList:
>> + EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
>> + EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
>> + EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
>> +
>> + EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
>> + if BuildModule:
>> + EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
>> +
>> + if self.FdfFile:
>> + EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
>> +
>> + EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
>> +
>> + if Progress:
>> + Progress.Start("\nProcessing meta-data")
>> #
>> - # Merge Arch
>> + # Mark now build in AutoGen Phase
>> #
>> + GlobalData.gAutoGenPhase = True
>> + self.ProcessModuleFromPdf()
>> + self.ProcessPcdType()
>> + self.ProcessMixedPcd()
>> + self.GetPcdsFromFDF()
>> + self.CollectAllPcds()
>> + self.GeneratePkgLevelHash()
>> + #
>> + # Check PCDs token value conflict in each DEC file.
>> + #
>> + self._CheckAllPcdsTokenValueConflict()
>> + #
>> + # Check PCD type and definition between DSC and DEC
>> + #
>> + self._CheckPcdDefineAndType()
>> +
>> + self.CreateBuildOptionsFile()
>> + self.CreatePcdTokenNumberFile()
>> + self.CreateModuleHashInfo()
>> + GlobalData.gAutoGenPhase = False
>> +
>> + #
>> + # Merge Arch
>> + #
>> + def MergeArch(self):
>> if not self.ArchList:
>> ArchList = set(self.Platform.SupArchList)
>> else:
>> ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
>> if not ArchList:
>> @@ -349,57 +394,49 @@ class WorkspaceAutoGen(AutoGen):
>> SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
>> EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
>> % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
>> self.ArchList = tuple(ArchList)
>>
>> - # Validate build target
>> + # Validate build target
>> + def ValidateBuildTarget(self):
>> if self.BuildTarget not in self.Platform.BuildTargets:
>> EdkLogger.error("build", PARAMETER_INVALID,
>> ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
>> % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
>> -
>> -
>> - # parse FDF file to get PCDs in it, if any
>> + @cached_property
>> + def FdfProfile(self):
>> if not self.FdfFile:
>> self.FdfFile = self.Platform.FlashDefinition
>>
>> - EdkLogger.info("")
>> - if self.ArchList:
>> - EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
>> - EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
>> - EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
>> -
>> - EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
>> - if BuildModule:
>> - EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
>> -
>> + FdfProfile = None
>> if self.FdfFile:
>> - EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
>> -
>> - EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
>> -
>> - if Progress:
>> - Progress.Start("\nProcessing meta-data")
>> -
>> - if self.FdfFile:
>> - #
>> - # Mark now build in AutoGen Phase
>> - #
>> - GlobalData.gAutoGenPhase = True
>> Fdf = FdfParser(self.FdfFile.Path)
>> Fdf.ParseFile()
>> GlobalData.gFdfParser = Fdf
>> - GlobalData.gAutoGenPhase = False
>> - PcdSet = Fdf.Profile.PcdDict
>> if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
>> FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
>> for FdRegion in FdDict.RegionList:
>> if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
>> if int(FdRegion.Offset) % 8 != 0:
>> EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
>> - ModuleList = Fdf.Profile.InfList
>> - self.FdfProfile = Fdf.Profile
>> + FdfProfile = Fdf.Profile
>> + else:
>> + if self.FdTargetList:
>> + EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
>> + self.FdTargetList = []
>> + if self.FvTargetList:
>> + EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
>> + self.FvTargetList = []
>> + if self.CapTargetList:
>> + EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
>> + self.CapTargetList = []
>> +
>> + return FdfProfile
>> +
>> + def ProcessModuleFromPdf(self):
>> +
>> + if self.FdfProfile:
>> for fvname in self.FvTargetList:
>> if fvname.upper() not in self.FdfProfile.FvDict:
>> EdkLogger.error("build", OPTION_VALUE_INVALID, "No such an FV in FDF file: %s" % fvname)
>>
>> @@ -407,64 +444,60 @@ class WorkspaceAutoGen(AutoGen):
>> # but the path (self.MetaFile.Path) is the real path
>> for key in self.FdfProfile.InfDict:
>> if key == 'ArchTBD':
>> MetaFile_cache = defaultdict(set)
>> for Arch in self.ArchList:
>> - Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
>> + Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
>> for Pkey in Current_Platform_cache.Modules:
>> MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
>> for Inf in self.FdfProfile.InfDict[key]:
>> ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
>> for Arch in self.ArchList:
>> if ModuleFile in MetaFile_cache[Arch]:
>> break
>> else:
>> - ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
>> + ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
>> if not ModuleData.IsBinaryModule:
>> EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
>>
>> else:
>> for Arch in self.ArchList:
>> if Arch == key:
>> - Platform = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
>> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
>> MetaFileList = set()
>> for Pkey in Platform.Modules:
>> MetaFileList.add(Platform.Modules[Pkey].MetaFile)
>> for Inf in self.FdfProfile.InfDict[key]:
>> ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
>> if ModuleFile in MetaFileList:
>> continue
>> - ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
>> + ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
>> if not ModuleData.IsBinaryModule:
>> EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
>>
>> - else:
>> - PcdSet = {}
>> - ModuleList = []
>> - self.FdfProfile = None
>> - if self.FdTargetList:
>> - EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
>> - self.FdTargetList = []
>> - if self.FvTargetList:
>> - EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
>> - self.FvTargetList = []
>> - if self.CapTargetList:
>> - EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
>> - self.CapTargetList = []
>> -
>> - # apply SKU and inject PCDs from Flash Definition file
>> +
>> +
>> + # parse FDF file to get PCDs in it, if any
>> + def GetPcdsFromFDF(self):
>> +
>> + if self.FdfProfile:
>> + PcdSet = self.FdfProfile.PcdDict
>> + # handle the mixed pcd in FDF file
>> + for key in PcdSet:
>> + if key in GlobalData.MixedPcd:
>> + Value = PcdSet[key]
>> + del PcdSet[key]
>> + for item in GlobalData.MixedPcd[key]:
>> + PcdSet[item] = Value
>> + self.VerifyPcdDeclearation(PcdSet)
>> +
>> + def ProcessPcdType(self):
>> for Arch in self.ArchList:
>> - Platform = self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]
>> - PlatformPcds = Platform.Pcds
>> - self._GuidDict = Platform._GuidDict
>> - SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
>> - BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
>> - SourcePcdDict_Keys = SourcePcdDict.keys()
>> - BinaryPcdDict_Keys = BinaryPcdDict.keys()
>> -
>> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
>> + Platform.Pcds
This line is odd; it looks like an incorrect copy/paste.
Did you mean:
PlatformPcds = Platform.Pcds
Instead?
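[For what it's worth, a bare attribute access like that does still evaluate a cached property for its side effect, which may even be the intent here; whether it was intentional only the author can say. A minimal sketch of the semantics, using `functools.cached_property` as a stand-in for BaseTools' own caching decorator and a hypothetical `Platform` class:]

```python
from functools import cached_property  # stand-in for BaseTools' caching decorator

class Platform:
    def __init__(self):
        self.compute_count = 0

    @cached_property
    def Pcds(self):
        # expensive collection: computed on first access, then cached
        # on the instance, so later accesses are free
        self.compute_count += 1
        return {"PcdFoo": 1}

p = Platform()
p.Pcds  # a bare access like in the patch still triggers (and caches) the computation
```

[Either way, assigning the result, as in `PlatformPcds = Platform.Pcds`, would make the intent explicit to the reader.]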
>> # generate the SourcePcdDict and BinaryPcdDict
>> - PGen = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
>> + PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
>> for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
>> if BuildData.Arch != Arch:
>> continue
>> if BuildData.MetaFile.Ext == '.inf':
>> for key in BuildData.Pcds:
>> @@ -483,11 +516,11 @@ class WorkspaceAutoGen(AutoGen):
>> BuildData.Pcds[key].Type = PcdInPlatform.Type
>> BuildData.Pcds[key].Pending = False
>> else:
>> #Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
>> if BuildData.Pcds[key].Pending:
>> - MGen = ModuleAutoGen(self, BuildData.MetaFile, Target, Toolchain, Arch, self.MetaFile)
>> + MGen = ModuleAutoGen(self, BuildData.MetaFile, self.BuildTarget, self.ToolChain, Arch, self.MetaFile)
>> if MGen and MGen.IsLibrary:
>> if MGen in PGen.LibraryAutoGenList:
>> ReferenceModules = MGen.ReferenceModules
>> for ReferenceModule in ReferenceModules:
>> if ReferenceModule.MetaFile in Platform.Modules:
>> @@ -497,10 +530,24 @@ class WorkspaceAutoGen(AutoGen):
>> if PcdInReferenceModule.Type:
>> BuildData.Pcds[key].Type = PcdInReferenceModule.Type
>> BuildData.Pcds[key].Pending = False
>> break
>>
>> + def ProcessMixedPcd(self):
>> + for Arch in self.ArchList:
>> + SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
>> + BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
>> + SourcePcdDict_Keys = SourcePcdDict.keys()
>> + BinaryPcdDict_Keys = BinaryPcdDict.keys()
>> +
>> + # generate the SourcePcdDict and BinaryPcdDict
>> + PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
>> + for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
>> + if BuildData.Arch != Arch:
>> + continue
>> + if BuildData.MetaFile.Ext == '.inf':
>> + for key in BuildData.Pcds:
>> if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
>> if BuildData.IsBinaryModule:
>> BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
>> else:
>>
>> SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
>> @@ -514,12 +561,11 @@ class WorkspaceAutoGen(AutoGen):
>>
>> elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
>> SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
>> elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
>> SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
>> - else:
>> - pass
>> +
>> #
>> # A PCD can only use one type for all source modules
>> #
>> for i in SourcePcdDict_Keys:
>> for j in SourcePcdDict_Keys:
>> @@ -588,27 +634,38 @@ class WorkspaceAutoGen(AutoGen):
>> del BuildData.Pcds[key]
>> BuildData.Pcds[newkey] = Value
>> break
>> break
>>
>> - # handle the mixed pcd in FDF file
>> - for key in PcdSet:
>> - if key in GlobalData.MixedPcd:
>> - Value = PcdSet[key]
>> - del PcdSet[key]
>> - for item in GlobalData.MixedPcd[key]:
>> - PcdSet[item] = Value
>> + #Collect package set information from INF of FDF
>> + @cached_property
>> + def PkgSet(self):
>> + if not self.FdfFile:
>> + self.FdfFile = self.Platform.FlashDefinition
>>
>> - #Collect package set information from INF of FDF
>> + if self.FdfFile:
>> + ModuleList = self.FdfProfile.InfList
>> + else:
>> + ModuleList = []
>> + Pkgs = {}
>> + for Arch in self.ArchList:
>> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
>> + PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
>> PkgSet = set()
>> for Inf in ModuleList:
>> ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
>> if ModuleFile in Platform.Modules:
>> continue
>> - ModuleData = self.BuildDatabase[ModuleFile, Arch, Target, Toolchain]
>> + ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
>> PkgSet.update(ModuleData.Packages)
>> - Pkgs = list(PkgSet) + list(PGen.PackageList)
>> + Pkgs[Arch] = list(PkgSet) + list(PGen.PackageList)
>> + return Pkgs
>> +
>> + def VerifyPcdDeclearation(self,PcdSet):
>> + for Arch in self.ArchList:
>> + Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
>> + Pkgs = self.PkgSet[Arch]
>> DecPcds = set()
>> DecPcdsKey = set()
>> for Pkg in Pkgs:
>> for Pcd in Pkg.Pcds:
>> DecPcds.add((Pcd[0], Pcd[1]))
>> @@ -636,37 +693,33 @@ class WorkspaceAutoGen(AutoGen):
>> PARSER_ERROR,
>> "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
>> File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
>> Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
>> )
>> + def CollectAllPcds(self):
>>
>> - Pa = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
>> + for Arch in self.ArchList:
>> + Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
>> #
>> # Explicitly collect platform's dynamic PCDs
>> #
>> Pa.CollectPlatformDynamicPcds()
>> Pa.CollectFixedAtBuildPcds()
>> self.AutoGenObjectList.append(Pa)
>>
>> - #
>> - # Generate Package level hash value
>> - #
>> + #
>> + # Generate Package level hash value
>> + #
>> + def GeneratePkgLevelHash(self):
>> + for Arch in self.ArchList:
>> GlobalData.gPackageHash = {}
>> if GlobalData.gUseHashCache:
>> - for Pkg in Pkgs:
>> + for Pkg in self.PkgSet[Arch]:
>> self._GenPkgLevelHash(Pkg)
>>
>> - #
>> - # Check PCDs token value conflict in each DEC file.
>> - #
>> - self._CheckAllPcdsTokenValueConflict()
>> -
>> - #
>> - # Check PCD type and definition between DSC and DEC
>> - #
>> - self._CheckPcdDefineAndType()
>>
>> + def CreateBuildOptionsFile(self):
>> #
>> # Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
>> #
>> content = 'gCommandLineDefines: '
>> content += str(GlobalData.gCommandLineDefines)
>> @@ -681,27 +734,31 @@ class WorkspaceAutoGen(AutoGen):
>> content += 'Flash Image Definition: '
>> content += str(self.FdfFile)
>> content += TAB_LINE_BREAK
>> SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
>>
>> + def CreatePcdTokenNumberFile(self):
>> #
>> # Create PcdToken Number file for Dynamic/DynamicEx Pcd.
>> #
>> PcdTokenNumber = 'PcdTokenNumber: '
>> - if Pa.PcdTokenNumber:
>> - if Pa.DynamicPcdList:
>> - for Pcd in Pa.DynamicPcdList:
>> - PcdTokenNumber += TAB_LINE_BREAK
>> - PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
>> - PcdTokenNumber += ' : '
>> - PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
>> + for Arch in self.ArchList:
>> + Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
>> + if Pa.PcdTokenNumber:
>> + if Pa.DynamicPcdList:
>> + for Pcd in Pa.DynamicPcdList:
>> + PcdTokenNumber += TAB_LINE_BREAK
>> + PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
>> + PcdTokenNumber += ' : '
>> + PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
>> SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
>>
>> + def CreateModuleHashInfo(self):
>> #
>> # Get set of workspace metafiles
>> #
>> - AllWorkSpaceMetaFiles = self._GetMetaFiles(Target, Toolchain, Arch)
>> + AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
>>
>> #
>> # Retrieve latest modified time of all metafiles
>> #
>> SrcTimeStamp = 0
>> @@ -759,11 +816,11 @@ class WorkspaceAutoGen(AutoGen):
>> f.close()
>> m.update(Content)
>> SaveFileOnChange(HashFile, m.hexdigest(), False)
>> GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
>>
>> - def _GetMetaFiles(self, Target, Toolchain, Arch):
>> + def _GetMetaFiles(self, Target, Toolchain):
>> AllWorkSpaceMetaFiles = set()
>> #
>> # add fdf
>> #
>> if self.FdfFile:
>>
* Re: [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen
2019-07-30 7:31 ` Bob Feng
@ 2019-07-30 14:02 ` Laszlo Ersek
0 siblings, 0 replies; 18+ messages in thread
From: Laszlo Ersek @ 2019-07-30 14:02 UTC (permalink / raw)
To: Feng, Bob C, devel@edk2.groups.io
On 07/30/19 09:31, Feng, Bob C wrote:
> Hi Laszlo,
>
> 1. Question#1
> I did not receive any comments except yours about the LogQ MaxSize. 2#-6# are introduced based on my testing results. They can be seen as bugfixes for regression issues.
OK, thanks. This information is very helpful. Please consider including
it in the v(n+1) cover letter, whenever it applies.
> 2. Request#2
> Yes. I can restructure the patch series and send out V5. I created a separate 11/11 patch because I thought it would be easier to tell what my changes are between V3 and V4.
That's very thoughtful of you, thank you.
However, the goal of a patch series is to create good patches for the
git history (i.e. to make permanent stages of the project history as
self contained and as "perfect" as possible).
While reviewer burden *does* count, it is still secondary to the
self-containment of patches.
The right answer to the problem that you implicitly raise -- namely that
"incremental review of patch series is difficult" -- is to use proper
*tooling*. WebUI's are generally poor at this. However, the
"git-range-diff" command is quite good at it (that is, at displaying
"interdiff"s).
So please just update each patch properly, in order to remove the
regressions, and please leave it to reviewers to generate the resultant
interdiffs for review, with whatever tools they prefer.
> 3. Request#3
> Yes. I'll provide a clearer description in the V5 cover letter.
>
> My testing is to compare auto-gened files created by basetool with patches and without patches.
Nice!
> 2#-6# resolved the regression issue for the following usage scenarios:
> 1. One module is built multiple times in one build. 2# and 6#
Can you tell us more about this use case?
- Does it refer to the same library instance that is built into multiple
different drivers/applications, with module-scope PCD / LibClasses
overrides? (Such that PCDs / sub-libraries change how the library
instance behaves.)
- Or does it refer to multiple builds of the same driver/application,
with FILE_GUID overrides?
> 2. Exception handling. 1) The code running in a sub-process raises an exception. 2) Ctrl + C. 4#
> 3. --pcd is used in build command line. 3#
> 4. Shared fixedatbuild Pcd between module and its libraries. It affects the content of AutoGen.h and AutoGen.c. 5#
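[Scenario 2 above (exception handling in sub-processes, fix #4) can be sketched generically. The worker/queue names below are illustrative, not the actual BaseTools AutoGenWorker API; the point is that a worker must not die silently: KeyboardInterrupt is swallowed so the parent process controls the shutdown, and any other exception is serialized back to the parent through an error queue.]

```python
import traceback

def autogen_worker(task_q, error_q):
    # illustrative worker loop, not the actual BaseTools AutoGenWorker API
    try:
        while True:
            task = task_q.get()
            if task is None:      # sentinel: no more work
                break
            task()                # run one AutoGen step
    except KeyboardInterrupt:
        pass                      # Ctrl+C: let the parent process drive the shutdown
    except Exception:
        # serialize the failure back to the parent instead of dying silently
        error_q.put(traceback.format_exc())
```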
Thanks, these help me a bit to understand. In the v5 cover letter, can
you please repeat the explanations?
Thanks!
Laszlo
> Thanks,
> Bob
>
> -----Original Message-----
> From: devel@edk2.groups.io [mailto:devel@edk2.groups.io] On Behalf Of Laszlo Ersek
> Sent: Monday, July 29, 2019 6:10 PM
> To: Feng, Bob C <bob.c.feng@intel.com>
> Cc: devel@edk2.groups.io
> Subject: Re: [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen
>
> Hi Bob,
>
> On 07/29/19 10:44, Bob Feng wrote:
>> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>>
>> In order to improve the build performance, we implemented
>> multiple-processes AutoGen. This change will reduce 20% time for
>> AutoGen phase.
>>
>> The design document can be got from:
>> https://edk2.groups.io/g/devel/files/Designs/2019/0627/Multiple-thread-AutoGen.pdf
>>
>> This patch serial pass the build of Ovmf, MinKabylake, MinPurley,
>> packages under Edk2 repository and intel client and server platforms.
>>
>> V4:
>> Add one more patch 11/11 to enhance this feature. Patches 1-10 are the same as V3.
>> 1. Set Log queue maxsize as thread number * 10
>> 2. enhance ModuleUniqueBaseName function
>> 3. fix bugs of build option pcd in sub Process
>> 4. enhance error handling. Handle the exception of KeyboardInterrupt and exceptions that happen in subprocess.
>> 5. fix the issue of shared fixed pcd between module and lib.
>> 6. fix bug in the function of duplicate modules handling.
>
> Item #1 seems to be in response to my v3 comment at:
>
> http://mid.mail-archive.com/e3f68d77-4837-0ef6-ab4f-95e50c4621ff@redhat.com
> https://edk2.groups.io/g/devel/message/44241
>
> Therefore, I understand why item#1 is in scope for the v4 update.
>
> However, the other updates in v4 (items #2 through #6) do not seem to address review feedback for v3. I'm saying that because I cannot see
> *any* feedback under v3 other than mine.
>
> At
>
> http://mid.mail-archive.com/08650203BA1BD64D8AD9B6D5D74A85D160B468EB@SHSMSX105.ccr.corp.intel.com
> https://edk2.groups.io/g/devel/message/44249
>
> you wrote,
>
>> I'd like to collect more comments on other parts and update all the
>> comments in V4.
>
> Question#1: where are all those comments that justify the v4 updates #2-#6? Did you get them in private (off-list)? Or did you determine the necessity of #2-#6 yourself, without review feedback?
>
> --*--
>
> The following v4 updates are certainly bugfixes, relative to v3: #3, #5, #6.
>
> The following v4 updates *may* be bugfixes rather than additional features ("enhancements") -- I can't tell myself, because they are not explained deeply enough: #2, #4.
>
> The point is, bugs that are known to be introduced by patches 01 through
> 10 should not be fixed up separately, in an incremental patch. Instead, you should split out *minimally* the #3, #5, and #6 bugfixes, and squash them into the appropriate patches between 01 and 10 (boundaries included of course), please.
>
> For example, regarding item #1, the following change from patch 11:
>
>> -LogQMaxSize = 60
>> +LogQMaxSize = ThreadNum() * 10
>
> is wrong. Instead, you should update patch 10, so that when the log agent is introduced, it be introduced at once with "LogQMaxSize = ThreadNum() * 10".
>
> The same applies to (minimally) #3, #5, and #6. Known bugs should not be introduced mid-series, even temporarily. The bugs should be fixed up inside the specific patches that introduce them in v3, and not in an incremental patch.
>
> If items #2 and #4 are indeed enhancements and not bugfixes (that is, the series works fine without #2 / #4, functionally speaking, but #2/#4 improve some aspects, such as performance, user experience, etc), then keeping them in separate patches might, or might not, make sense. That's up to you, but even if you decide to separate them out of patches 01 to 10, they should still be isolated from *each other*.
>
> Request#2: please restructure the patch series as explained.
>
> --*--
>
> The v4 cover letter, and patch v4 11/11, refer to a function called "ModuleUniqueBaseName". I can't find the identifier "ModuleUniqueBaseName" in the series.
>
> Request#3: please clean up the cover letter and the commit messages. In addition, please explain the v(n) --> v(n+1) updates in a lot more detail, in the series cover letter. For example, item #5 seems like a pretty serious bugfix, but nothing is explained about the nature of the issue.
>
> Thanks
> Laszlo
>
>
>
Thread overview: 18+ messages
2019-07-29 8:44 [Patch 00/11 V4] Enable multiple process AutoGen Bob Feng
2019-07-29 8:44 ` [Patch 01/11] BaseTools: Singleton the object to handle build conf file Bob Feng
2019-07-29 8:44 ` [Patch 02/11] BaseTools: Split WorkspaceAutoGen._InitWorker into multiple functions Bob Feng
2019-07-29 15:03 ` [edk2-devel] " Philippe Mathieu-Daudé
2019-07-30 2:10 ` Bob Feng
2019-07-30 12:38 ` Philippe Mathieu-Daudé
2019-07-29 8:44 ` [Patch 03/11] BaseTools: Add functions to get platform scope build options Bob Feng
2019-07-29 8:44 ` [Patch 04/11] BaseTools: Decouple AutoGen Objects Bob Feng
2019-07-29 8:44 ` [Patch 05/11] BaseTools: Enable Multiple Process AutoGen Bob Feng
2019-07-29 8:44 ` [Patch 06/11] BaseTools: Add shared data for processes Bob Feng
2019-07-29 8:44 ` [Patch 07/11] BaseTools: Add LogAgent to support multiple process Autogen Bob Feng
2019-07-29 8:44 ` [Patch 08/11] BaseTools: Move BuildOption parser out of build.py Bob Feng
2019-07-29 8:44 ` [Patch 09/11] BaseTools: Add the support for python 2 Bob Feng
2019-07-29 8:44 ` [Patch 10/11] BaseTools: Enable block queue log agent Bob Feng
2019-07-29 8:44 ` [Patch 11/11] BaseTools: Enhance Multiple-Process AutoGen Bob Feng
2019-07-29 10:10 ` [edk2-devel] [Patch 00/11 V4] Enable multiple process AutoGen Laszlo Ersek
2019-07-30 7:31 ` Bob Feng
2019-07-30 14:02 ` Laszlo Ersek