From mboxrd@z Thu Jan 1 00:00:00 1970
From: "Bob Feng" <bob.c.feng@intel.com>
To: "devel@edk2.groups.io", "glin@suse.com"
CC: "Gao, Liming", "Shi, Steven"
Subject: Re: [edk2-devel] [Patch 04/10 V8] BaseTools: Decouple AutoGen Objects
Date: Thu, 22 Aug 2019 08:15:03 +0000
Message-ID: <08650203BA1BD64D8AD9B6D5D74A85D161527696@SHSMSX104.ccr.corp.intel.com>
References: <20190807042537.11928-1-bob.c.feng@intel.com>
 <20190807042537.11928-5-bob.c.feng@intel.com>
 <20190822080437.GF2052@GaryWorkstation>
In-Reply-To: <20190822080437.GF2052@GaryWorkstation>
Hi Gary,

https://edk2.groups.io/g/devel/message/46196

This patch is under review and it can fix this regression issue.

Thanks,
Bob

-----Original Message-----
From: devel@edk2.groups.io [mailto:devel@edk2.groups.io] On Behalf Of Gary Lin
Sent: Thursday, August 22, 2019 4:05 PM
To: devel@edk2.groups.io; Feng, Bob C
Cc: Gao, Liming; Shi, Steven
Subject: Re: [edk2-devel] [Patch 04/10 V8] BaseTools: Decouple AutoGen Objects

On Wed, Aug 07, 2019 at 12:25:31PM +0800, Bob Feng wrote:
> BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
>
> 1. Separate the AutoGen.py into 3 small py files.
>    One is for the AutoGen base class, one is for the WorkspaceAutoGen class
>    and PlatformAutoGen class, and one is for the ModuleAutoGen class.
> 2. Create a new class DataPipe to store the Platform scope settings.
>    Create a new class PlatformInfo to provide the same interface
>    as PlatformAutoGen. PlatformInfo class is initialized by a
>    DataPipe instance.
>    Create a new class WorkspaceInfo to provide the same interface
>    as WorkspaceAutoGen. WorkspaceInfo class is initialized by a
>    DataPipe instance.
> 3. Change ModuleAutoGen to depend on DataPipe, PlatformInfo and
>    WorkspaceInfo. Remove the dependency of ModuleAutoGen on PlatformAutoGen.
>
I found a regression in the dependency check. When adding a driver, e.g.
OpalPasswordPei, to OVMF without setting the library dependencies
correctly, before this patch 'build' would show the missing library and
stop immediately. Now 'build' doesn't complain at all and just starts
compiling the code. It ends up in some strange errors due to the missing
libraries.
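The decoupling described in points 2 and 3 of the commit message can be sketched as follows. This is only an illustrative sketch: the class and attribute names (`DataPipe`, `PlatformInfo`, `ModuleAutoGen`, `tool_chain`, the `settings` dict) are simplified stand-ins for the actual BaseTools classes, not their real API.

```python
class DataPipe:
    """Carries platform-scope settings as plain, serializable data."""
    def __init__(self, settings):
        self._settings = dict(settings)

    def get(self, key):
        return self._settings.get(key)


class PlatformInfo:
    """Exposes a PlatformAutoGen-like interface, backed only by a DataPipe."""
    def __init__(self, data_pipe):
        self._pipe = data_pipe

    @property
    def tool_chain(self):
        return self._pipe.get("ToolChain")


class ModuleAutoGen:
    """Depends on DataPipe/PlatformInfo instead of PlatformAutoGen, so it
    can be reconstructed from data alone (e.g. in a worker process)."""
    def __init__(self, data_pipe):
        self.platform = PlatformInfo(data_pipe)


pipe = DataPipe({"ToolChain": "GCC5"})
module = ModuleAutoGen(pipe)
print(module.platform.tool_chain)  # GCC5
```

The point of the indirection is that a `DataPipe` holds only plain data, so module-level autogen no longer needs a live reference to the platform autogen object.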
It's easy to reproduce the bug with the following patch:

diff --git a/OvmfPkg/OvmfPkgX64.dsc b/OvmfPkg/OvmfPkgX64.dsc
index 68073ef55b4d..7d67706612d1 100644
--- a/OvmfPkg/OvmfPkgX64.dsc
+++ b/OvmfPkg/OvmfPkgX64.dsc
@@ -636,6 +636,7 @@ [Components]
       NULL|SecurityPkg/Library/HashInstanceLibSha384/HashInstanceLibSha384.inf
       NULL|SecurityPkg/Library/HashInstanceLibSha512/HashInstanceLibSha512.inf
   }
+  SecurityPkg/Tcg/Opal/OpalPassword/OpalPasswordPei.inf
 !if $(TPM2_CONFIG_ENABLE) == TRUE
   SecurityPkg/Tcg/Tcg2Config/Tcg2ConfigDxe.inf
 !endif

Gary Lin

> Cc: Liming Gao
> Cc: Steven Shi
> Signed-off-by: Bob Feng
> ---
>  BaseTools/Source/Python/AutoGen/AutoGen.py    | 4264 +----------------
>  BaseTools/Source/Python/AutoGen/DataPipe.py   |  147 +
>  BaseTools/Source/Python/AutoGen/GenC.py       |    2 +-
>  .../Source/Python/AutoGen/ModuleAutoGen.py    | 1908 ++++++++
>  .../Python/AutoGen/ModuleAutoGenHelper.py     |  619 +++
>  .../Source/Python/AutoGen/PlatformAutoGen.py  | 1505 ++++++
>  .../Source/Python/AutoGen/WorkspaceAutoGen.py |  904 ++++
>  BaseTools/Source/Python/Common/Misc.py        |    1 -
>  .../Python/PatchPcdValue/PatchPcdValue.py     |    1 -
>  .../Source/Python/Workspace/DscBuildData.py   |   10 +-
>  .../Source/Python/Workspace/InfBuildData.py   |   29 +
>  .../Python/Workspace/WorkspaceCommon.py       |    4 +
>  .../Python/Workspace/WorkspaceDatabase.py     |    3 +
>  BaseTools/Source/Python/build/BuildReport.py  |    4 +-
>  BaseTools/Source/Python/build/build.py        |   51 +-
>  15 files changed, 5204 insertions(+), 4248 deletions(-)
>  create mode 100644 BaseTools/Source/Python/AutoGen/DataPipe.py
>  create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>  create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
>  create mode 100644 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
>  create mode 100644 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
>
> diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
> index bb0da46d74a9..d9ee699d8f30
100644
> --- a/BaseTools/Source/Python/AutoGen/AutoGen.py
> +++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
> @@ -10,226 +10,11 @@
>
>  ## Import Modules
>  #
>  from __future__ import print_function
>  from __future__ import absolute_import
> -import Common.LongFilePathOs as os
> -import re
> -import os.path as path
> -import copy
> -import uuid
> -
> -from . import GenC
> -from . import GenMake
> -from . import GenDepex
> -from io import BytesIO
> -
> -from .StrGather import *
> -from .BuildEngine import BuildRuleObj as BuildRule
> -from .BuildEngine import gDefaultBuildRuleFile,AutoGenReqBuildRuleVerNum
> -import shutil
> -from Common.LongFilePathSupport import CopyLongFilePath
> -from Common.BuildToolError import *
> -from Common.DataType import *
> -from Common.Misc import *
> -from Common.StringUtils import *
> -import Common.GlobalData as GlobalData
> -from GenFds.FdfParser import *
> -from CommonDataClass.CommonClass import SkuInfoClass
> -from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile
> -import Common.VpdInfoFile as VpdInfoFile
> -from .GenPcdDb import CreatePcdDatabaseCode
> -from Workspace.MetaFileCommentParser import UsageList
> -from Workspace.WorkspaceCommon import GetModuleLibInstances
> -from Common.MultipleWorkspace import MultipleWorkspace as mws
> -from .
import InfSectionParser
> -import datetime
> -import hashlib
> -from .GenVar import VariableMgr, var_info
> -from collections import OrderedDict
> -from collections import defaultdict
> -from Workspace.WorkspaceCommon import OrderedListDict
> -from Common.ToolDefClassObject import gDefaultToolsDefFile
> -
> -from Common.caching import cached_property, cached_class_function
> -
> -## Regular expression for splitting Dependency Expression string into tokens
> -gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
> -
> -## Regular expression for match: PCD(xxxx.yyy)
> -gPCDAsGuidPattern = re.compile(r"^PCD\(.+\..+\)$")
> -
> -#
> -# Regular expression for finding Include Directories, the difference between MSFT and INTEL/GCC/RVCT
> -# is the former use /I , the Latter used -I to specify include directories
> -#
> -gBuildOptIncludePatternMsft = re.compile(r"(?:.*?)/I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
> -gBuildOptIncludePatternOther = re.compile(r"(?:.*?)-I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
> -
> -#
> -# Match name = variable
> -#
> -gEfiVarStoreNamePattern = re.compile("\s*name\s*=\s*(\w+)")
> -#
> -# The format of guid in efivarstore statement likes following and must be correct:
> -# guid = {0xA04A27f4, 0xDF00, 0x4D42, {0xB5, 0x52, 0x39, 0x51, 0x13, 0x02, 0x11, 0x3D}}
> -#
> -gEfiVarStoreGuidPattern = re.compile("\s*guid\s*=\s*({.*?{.*?}\s*})")
> -
> -## Mapping Makefile type
> -gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
> -
> -
> -## default file name for AutoGen
> -gAutoGenCodeFileName = "AutoGen.c"
> -gAutoGenHeaderFileName = "AutoGen.h"
> -gAutoGenStringFileName = "%(module_name)sStrDefs.h"
> -gAutoGenStringFormFileName = "%(module_name)sStrDefs.hpk"
> -gAutoGenDepexFileName = "%(module_name)s.depex"
> -gAutoGenImageDefFileName = "%(module_name)sImgDefs.h"
> -gAutoGenIdfFileName = "%(module_name)sIdf.hpk"
> -gInfSpecVersion = "0x00010017"
> -
> -#
> -# Template string to
generic AsBuilt INF
> -#
> -gAsBuiltInfHeaderString = TemplateString("""${header_comments}
> -
> -# DO NOT EDIT
> -# FILE auto-generated
> -
> -[Defines]
> -  INF_VERSION                = ${module_inf_version}
> -  BASE_NAME                  = ${module_name}
> -  FILE_GUID                  = ${module_guid}
> -  MODULE_TYPE                = ${module_module_type}${BEGIN}
> -  VERSION_STRING             = ${module_version_string}${END}${BEGIN}
> -  PCD_IS_DRIVER              = ${pcd_is_driver_string}${END}${BEGIN}
> -  UEFI_SPECIFICATION_VERSION = ${module_uefi_specification_version}${END}${BEGIN}
> -  PI_SPECIFICATION_VERSION   = ${module_pi_specification_version}${END}${BEGIN}
> -  ENTRY_POINT                = ${module_entry_point}${END}${BEGIN}
> -  UNLOAD_IMAGE               = ${module_unload_image}${END}${BEGIN}
> -  CONSTRUCTOR                = ${module_constructor}${END}${BEGIN}
> -  DESTRUCTOR                 = ${module_destructor}${END}${BEGIN}
> -  SHADOW                     = ${module_shadow}${END}${BEGIN}
> -  PCI_VENDOR_ID              = ${module_pci_vendor_id}${END}${BEGIN}
> -  PCI_DEVICE_ID              = ${module_pci_device_id}${END}${BEGIN}
> -  PCI_CLASS_CODE             = ${module_pci_class_code}${END}${BEGIN}
> -  PCI_REVISION               = ${module_pci_revision}${END}${BEGIN}
> -  BUILD_NUMBER               = ${module_build_number}${END}${BEGIN}
> -  SPEC                       = ${module_spec}${END}${BEGIN}
> -  UEFI_HII_RESOURCE_SECTION  = ${module_uefi_hii_resource_section}${END}${BEGIN}
> -  MODULE_UNI_FILE            = ${module_uni_file}${END}
> -
> -[Packages.${module_arch}]${BEGIN}
> -  ${package_item}${END}
> -
> -[Binaries.${module_arch}]${BEGIN}
> -  ${binary_item}${END}
> -
> -[PatchPcd.${module_arch}]${BEGIN}
> -  ${patchablepcd_item}
> -${END}
> -
> -[Protocols.${module_arch}]${BEGIN}
> -  ${protocol_item}
> -${END}
> -
> -[Ppis.${module_arch}]${BEGIN}
> -  ${ppi_item}
> -${END}
> -
> -[Guids.${module_arch}]${BEGIN}
> -  ${guid_item}
> -${END}
> -
> -[PcdEx.${module_arch}]${BEGIN}
> -  ${pcd_item}
> -${END}
> -
> -[LibraryClasses.${module_arch}]
> -## @LIB_INSTANCES${BEGIN}
> -#  ${libraryclasses_item}${END}
> -
> -${depexsection_item}
> -
> -${userextension_tianocore_item}
> -
> -${tail_comments}
> -
> -[BuildOptions.${module_arch}]
> -## @AsBuilt${BEGIN}
> -##   ${flags_item}${END}
> -""")
> -## Split command line option string to list
> -#
> -# subprocess.Popen needs the args to be a sequence. Otherwise there's problem
> -# in non-windows platform to launch command
> -#
> -def _SplitOption(OptionString):
> -    OptionList = []
> -    LastChar = " "
> -    OptionStart = 0
> -    QuotationMark = ""
> -    for Index in range(0, len(OptionString)):
> -        CurrentChar = OptionString[Index]
> -        if CurrentChar in ['"', "'"]:
> -            if QuotationMark == CurrentChar:
> -                QuotationMark = ""
> -            elif QuotationMark == "":
> -                QuotationMark = CurrentChar
> -            continue
> -        elif QuotationMark:
> -            continue
> -
> -        if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r", "\n"]:
> -            if Index > OptionStart:
> -                OptionList.append(OptionString[OptionStart:Index - 1])
> -            OptionStart = Index
> -        LastChar = CurrentChar
> -    OptionList.append(OptionString[OptionStart:])
> -    return OptionList
> -
> -#
> -# Convert string to C format array
> -#
> -def _ConvertStringToByteArray(Value):
> -    Value = Value.strip()
> -    if not Value:
> -        return None
> -    if Value[0] == '{':
> -        if not Value.endswith('}'):
> -            return None
> -        Value = Value.replace(' ', '').replace('{', '').replace('}', '')
> -        ValFields = Value.split(',')
> -        try:
> -            for Index in range(len(ValFields)):
> -                ValFields[Index] = str(int(ValFields[Index], 0))
> -        except ValueError:
> -            return None
> -        Value = '{' + ','.join(ValFields) + '}'
> -        return Value
> -
> -    Unicode = False
> -    if Value.startswith('L"'):
> -        if not Value.endswith('"'):
> -            return None
> -        Value = Value[1:]
> -        Unicode = True
> -    elif not Value.startswith('"') or not Value.endswith('"'):
> -        return None
> -
> -    Value = eval(Value)         # translate escape character
> -    NewValue = '{'
> -    for Index in range(0, len(Value)):
> -        if Unicode:
> -            NewValue = NewValue + str(ord(Value[Index]) % 0x10000) +
','
> -        else:
> -            NewValue = NewValue + str(ord(Value[Index]) % 0x100) + ','
> -    Value = NewValue + '0}'
> -    return Value
> -
> +from Common.DataType import TAB_STAR
>  ## Base class for AutoGen
>  #
>  #   This class just implements the cache mechanism of AutoGen objects.
>  #
>  class AutoGen(object):
> @@ -246,10 +31,11 @@ class AutoGen(object):
>      #   @param  Toolchain           Tool chain name
>      #   @param  Arch                Target arch
>      #   @param  *args               The specific class related parameters
>      #   @param  **kwargs            The specific class related dict parameters
>      #
> +
>      def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
>          # check if the object has been created
>          Key = (Target, Toolchain, Arch, MetaFile)
>          if Key in cls.__ObjectCache:
>              # if it exists, just return it directly
> @@ -279,4007 +65,49 @@ class AutoGen(object):
>
>      ## "==" operator
>      def __eq__(self, Other):
>          return Other and self.MetaFile == Other
>
> -## Workspace AutoGen class
> -#
> -#  This class is used mainly to control the whole platform build for different
> -# architecture. This class will generate top level makefile.
> -#
> -class WorkspaceAutoGen(AutoGen):
> -    # call super().__init__ then call the worker function with different parameter count
> -    def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> -        if not hasattr(self, "_Init"):
> -            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
> -            self._Init = True
> -
> -    ## Initialize WorkspaceAutoGen
> -    #
> -    #   @param  WorkspaceDir            Root directory of workspace
> -    #   @param  ActivePlatform          Meta-file of active platform
> -    #   @param  Target                  Build target
> -    #   @param  Toolchain               Tool chain name
> -    #   @param  ArchList                List of architecture of current build
> -    #   @param  MetaFileDb              Database containing meta-files
> -    #   @param  BuildConfig             Configuration of build
> -    #   @param  ToolDefinition          Tool chain definitions
> -    #   @param  FlashDefinitionFile     File of flash definition
> -    #   @param  Fds                     FD list to be generated
> -    #   @param  Fvs                     FV list to be generated
> -    #   @param  Caps                    Capsule list to be generated
> -    #   @param  SkuId                   SKU id from command line
> -    #
> -    def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolchain, ArchList, MetaFileDb,
> -              BuildConfig, ToolDefinition, FlashDefinitionFile='', Fds=None, Fvs=None, Caps=None, SkuId='', UniFlag=None,
> -              Progress=None, BuildModule=None):
> -        self.BuildDatabase  = MetaFileDb
> -        self.MetaFile       = ActivePlatform
> -        self.WorkspaceDir   = WorkspaceDir
> -        self.Platform       = self.BuildDatabase[self.MetaFile, TAB_ARCH_COMMON, Target, Toolchain]
> -        GlobalData.gActivePlatform = self.Platform
> -        self.BuildTarget    = Target
> -        self.ToolChain      = Toolchain
> -        self.ArchList       = ArchList
> -        self.SkuId          = SkuId
> -        self.UniFlag        = UniFlag
> -
> -        self.TargetTxt      = BuildConfig
> -        self.ToolDef        = ToolDefinition
> -        self.FdfFile        = FlashDefinitionFile
> -        self.FdTargetList   = Fds if Fds else []
> -        self.FvTargetList   = Fvs if Fvs else []
> -        self.CapTargetList  = Caps if Caps else []
> -
self.AutoGenObjectList = []
> -        self._GuidDict = {}
> -
> -        # there's many relative directory operations, so ...
> -        os.chdir(self.WorkspaceDir)
> -
> -        self.MergeArch()
> -        self.ValidateBuildTarget()
> -
> -        EdkLogger.info("")
> -        if self.ArchList:
> -            EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
> -        EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
> -        EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
> -
> -        EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
> -        if BuildModule:
> -            EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
> -
> -        if self.FdfFile:
> -            EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
> -
> -        EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
> -
> -        if Progress:
> -            Progress.Start("\nProcessing meta-data")
> -        #
> -        # Mark now build in AutoGen Phase
> -        #
> -        GlobalData.gAutoGenPhase = True
> -        self.ProcessModuleFromPdf()
> -        self.ProcessPcdType()
> -        self.ProcessMixedPcd()
> -        self.GetPcdsFromFDF()
> -        self.CollectAllPcds()
> -        self.GeneratePkgLevelHash()
> -        #
> -        # Check PCDs token value conflict in each DEC file.
> -        #
> -        self._CheckAllPcdsTokenValueConflict()
> -        #
> -        # Check PCD type and definition between DSC and DEC
> -        #
> -        self._CheckPcdDefineAndType()
> -
> -        self.CreateBuildOptionsFile()
> -        self.CreatePcdTokenNumberFile()
> -        self.CreateModuleHashInfo()
> -        GlobalData.gAutoGenPhase = False
> -
> -    #
> -    # Merge Arch
> -    #
> -    def MergeArch(self):
> -        if not self.ArchList:
> -            ArchList = set(self.Platform.SupArchList)
> -        else:
> -            ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
> -        if not ArchList:
> -            EdkLogger.error("build", PARAMETER_INVALID,
> -                            ExtraData = "Invalid ARCH specified.
[Valid ARCH: %s]" % (" ".join(self.Platform.SupArchList)))
> -        elif self.ArchList and len(ArchList) != len(self.ArchList):
> -            SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
> -            EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
> -                              % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
> -        self.ArchList = tuple(ArchList)
> -
> -    # Validate build target
> -    def ValidateBuildTarget(self):
> -        if self.BuildTarget not in self.Platform.BuildTargets:
> -            EdkLogger.error("build", PARAMETER_INVALID,
> -                            ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
> -                                      % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
> -    @cached_property
> -    def FdfProfile(self):
> -        if not self.FdfFile:
> -            self.FdfFile = self.Platform.FlashDefinition
> -
> -        FdfProfile = None
> -        if self.FdfFile:
> -            Fdf = FdfParser(self.FdfFile.Path)
> -            Fdf.ParseFile()
> -            GlobalData.gFdfParser = Fdf
> -            if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
> -                FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
> -                for FdRegion in FdDict.RegionList:
> -                    if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
> -                        if int(FdRegion.Offset) % 8 != 0:
> -                            EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
> -                FdfProfile = Fdf.Profile
> -        else:
> -            if self.FdTargetList:
> -                EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
> -                self.FdTargetList = []
> -            if self.FvTargetList:
> -                EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
> -                self.FvTargetList = []
> -            if self.CapTargetList:
> -                EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored."
% " ".join(self.CapTargetList))
> -                self.CapTargetList = []
> -
> -        return FdfProfile
> -
> -    def ProcessModuleFromPdf(self):
> -
> -        if self.FdfProfile:
> -            for fvname in self.FvTargetList:
> -                if fvname.upper() not in self.FdfProfile.FvDict:
> -                    EdkLogger.error("build", OPTION_VALUE_INVALID,
> -                                    "No such an FV in FDF file: %s" % fvname)
> -
> -            # In DSC file may use FILE_GUID to override the module, then in the Platform.Modules use FILE_GUIDmodule.inf as key,
> -            # but the path (self.MetaFile.Path) is the real path
> -            for key in self.FdfProfile.InfDict:
> -                if key == 'ArchTBD':
> -                    MetaFile_cache = defaultdict(set)
> -                    for Arch in self.ArchList:
> -                        Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> -                        for Pkey in Current_Platform_cache.Modules:
> -                            MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
> -                    for Inf in self.FdfProfile.InfDict[key]:
> -                        ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> -                        for Arch in self.ArchList:
> -                            if ModuleFile in MetaFile_cache[Arch]:
> -                                break
> -                        else:
> -                            ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> -                            if not ModuleData.IsBinaryModule:
> -                                EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?"
% ModuleFile)
> -
> -                else:
> -                    for Arch in self.ArchList:
> -                        if Arch == key:
> -                            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> -                            MetaFileList = set()
> -                            for Pkey in Platform.Modules:
> -                                MetaFileList.add(Platform.Modules[Pkey].MetaFile)
> -                            for Inf in self.FdfProfile.InfDict[key]:
> -                                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> -                                if ModuleFile in MetaFileList:
> -                                    continue
> -                                ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> -                                if not ModuleData.IsBinaryModule:
> -                                    EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
> -
> -
> -
> -    # parse FDF file to get PCDs in it, if any
> -    def GetPcdsFromFDF(self):
> -
> -        if self.FdfProfile:
> -            PcdSet = self.FdfProfile.PcdDict
> -            # handle the mixed pcd in FDF file
> -            for key in PcdSet:
> -                if key in GlobalData.MixedPcd:
> -                    Value = PcdSet[key]
> -                    del PcdSet[key]
> -                    for item in GlobalData.MixedPcd[key]:
> -                        PcdSet[item] = Value
> -            self.VerifyPcdDeclearation(PcdSet)
> -
> -    def ProcessPcdType(self):
> -        for Arch in self.ArchList:
> -            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> -            Platform.Pcds
> -            # generate the SourcePcdDict and BinaryPcdDict
> -            PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> -            for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
> -                if BuildData.Arch != Arch:
> -                    continue
> -                if BuildData.MetaFile.Ext == '.inf':
> -                    for key in BuildData.Pcds:
> -                        if BuildData.Pcds[key].Pending:
> -                            if key in Platform.Pcds:
> -                                PcdInPlatform = Platform.Pcds[key]
> -                                if PcdInPlatform.Type:
> -                                    BuildData.Pcds[key].Type = PcdInPlatform.Type
> -                                    BuildData.Pcds[key].Pending = False
> -
> -                            if BuildData.MetaFile in Platform.Modules:
> -                                PlatformModule = Platform.Modules[str(BuildData.MetaFile)]
> -                                if key in
PlatformModule.Pcds:
> -                                    PcdInPlatform = PlatformModule.Pcds[key]
> -                                    if PcdInPlatform.Type:
> -                                        BuildData.Pcds[key].Type = PcdInPlatform.Type
> -                                        BuildData.Pcds[key].Pending = False
> -                        else:
> -                            #Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
> -                            if BuildData.Pcds[key].Pending:
> -                                MGen = ModuleAutoGen(self, BuildData.MetaFile, self.BuildTarget, self.ToolChain, Arch, self.MetaFile)
> -                                if MGen and MGen.IsLibrary:
> -                                    if MGen in PGen.LibraryAutoGenList:
> -                                        ReferenceModules = MGen.ReferenceModules
> -                                        for ReferenceModule in ReferenceModules:
> -                                            if ReferenceModule.MetaFile in Platform.Modules:
> -                                                RefPlatformModule = Platform.Modules[str(ReferenceModule.MetaFile)]
> -                                                if key in RefPlatformModule.Pcds:
> -                                                    PcdInReferenceModule = RefPlatformModule.Pcds[key]
> -                                                    if PcdInReferenceModule.Type:
> -                                                        BuildData.Pcds[key].Type = PcdInReferenceModule.Type
> -                                                        BuildData.Pcds[key].Pending = False
> -                                                        break
> -
> -    def ProcessMixedPcd(self):
> -        for Arch in self.ArchList:
> -            SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
> -            BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
> -            SourcePcdDict_Keys = SourcePcdDict.keys()
> -            BinaryPcdDict_Keys = BinaryPcdDict.keys()
> -
> -            # generate the SourcePcdDict and BinaryPcdDict
> -            PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> -            for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
> -                if BuildData.Arch != Arch:
> -                    continue
> -                if BuildData.MetaFile.Ext == '.inf':
> -                    for key in BuildData.Pcds:
> -                        if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
> -                            if BuildData.IsBinaryModule:
> -                                BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> -                            else:
> -                                SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add(
(BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> -
> -                        elif TAB_PCDS_PATCHABLE_IN_MODULE in BuildData.Pcds[key].Type:
> -                            if BuildData.MetaFile.Ext == '.inf':
> -                                if BuildData.IsBinaryModule:
> -                                    BinaryPcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> -                                else:
> -                                    SourcePcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> -
> -                        elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
> -                            SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> -                        elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
> -                            SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> -
> -            #
> -            # A PCD can only use one type for all source modules
> -            #
> -            for i in SourcePcdDict_Keys:
> -                for j in SourcePcdDict_Keys:
> -                    if i != j:
> -                        Intersections = SourcePcdDict[i].intersection(SourcePcdDict[j])
> -                        if len(Intersections) > 0:
> -                            EdkLogger.error(
> -                            'build',
> -                            FORMAT_INVALID,
> -                            "Building modules from source INFs, following PCD use %s and %s access method. It must be corrected to use only one access method."
% (i, j),
> -                            ExtraData='\n\t'.join(str(P[1]+'.'+P[0]) for P in Intersections)
> -                            )
> -
> -            #
> -            # intersection the BinaryPCD for Mixed PCD
> -            #
> -            for i in BinaryPcdDict_Keys:
> -                for j in BinaryPcdDict_Keys:
> -                    if i != j:
> -                        Intersections = BinaryPcdDict[i].intersection(BinaryPcdDict[j])
> -                        for item in Intersections:
> -                            NewPcd1 = (item[0] + '_' + i, item[1])
> -                            NewPcd2 = (item[0] + '_' + j, item[1])
> -                            if item not in GlobalData.MixedPcd:
> -                                GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
> -                            else:
> -                                if NewPcd1 not in GlobalData.MixedPcd[item]:
> -                                    GlobalData.MixedPcd[item].append(NewPcd1)
> -                                if NewPcd2 not in GlobalData.MixedPcd[item]:
> -                                    GlobalData.MixedPcd[item].append(NewPcd2)
> -
> -            #
> -            # intersection the SourcePCD and BinaryPCD for Mixed PCD
> -            #
> -            for i in SourcePcdDict_Keys:
> -                for j in BinaryPcdDict_Keys:
> -                    if i != j:
> -                        Intersections = SourcePcdDict[i].intersection(BinaryPcdDict[j])
> -                        for item in Intersections:
> -                            NewPcd1 = (item[0] + '_' + i, item[1])
> -                            NewPcd2 = (item[0] + '_' + j, item[1])
> -                            if item not in GlobalData.MixedPcd:
> -                                GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
> -                            else:
> -                                if NewPcd1 not in GlobalData.MixedPcd[item]:
> -                                    GlobalData.MixedPcd[item].append(NewPcd1)
> -                                if NewPcd2 not in GlobalData.MixedPcd[item]:
> -                                    GlobalData.MixedPcd[item].append(NewPcd2)
> -
> -            for BuildData in list(PGen.BuildDatabase._CACHE_.values()):
> -                if BuildData.Arch != Arch:
> -                    continue
> -                for key in BuildData.Pcds:
> -                    for SinglePcd in GlobalData.MixedPcd:
> -                        if (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
> -                            for item in GlobalData.MixedPcd[SinglePcd]:
> -                                Pcd_Type = item[0].split('_')[-1]
> -                                if (Pcd_Type == BuildData.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
> -                                    (Pcd_Type == TAB_PCDS_DYNAMIC and BuildData.Pcds[key].Type in
PCD_DYNAMIC_TYPE_SET):
> -                                    Value = BuildData.Pcds[key]
> -                                    Value.TokenCName = BuildData.Pcds[key].TokenCName + '_' + Pcd_Type
> -                                    if len(key) == 2:
> -                                        newkey = (Value.TokenCName, key[1])
> -                                    elif len(key) == 3:
> -                                        newkey = (Value.TokenCName, key[1], key[2])
> -                                    del BuildData.Pcds[key]
> -                                    BuildData.Pcds[newkey] = Value
> -                                    break
> -                            break
> -
> -    #Collect package set information from INF of FDF
> -    @cached_property
> -    def PkgSet(self):
> -        if not self.FdfFile:
> -            self.FdfFile = self.Platform.FlashDefinition
> -
> -        if self.FdfFile:
> -            ModuleList = self.FdfProfile.InfList
> -        else:
> -            ModuleList = []
> -        Pkgs = {}
> -        for Arch in self.ArchList:
> -            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> -            PGen = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> -            PkgSet = set()
> -            for Inf in ModuleList:
> -                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> -                if ModuleFile in Platform.Modules:
> -                    continue
> -                ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> -                PkgSet.update(ModuleData.Packages)
> -            Pkgs[Arch] = list(PkgSet) + list(PGen.PackageList)
> -        return Pkgs
> -
> -    def VerifyPcdDeclearation(self,PcdSet):
> -        for Arch in self.ArchList:
> -            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> -            Pkgs = self.PkgSet[Arch]
> -            DecPcds = set()
> -            DecPcdsKey = set()
> -            for Pkg in Pkgs:
> -                for Pcd in Pkg.Pcds:
> -                    DecPcds.add((Pcd[0], Pcd[1]))
> -                    DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2]))
> -
> -            Platform.SkuName = self.SkuId
> -            for Name, Guid,Fileds in PcdSet:
> -                if (Name, Guid) not in DecPcds:
> -                    EdkLogger.error(
> -                        'build',
> -                        PARSER_ERROR,
> -                        "PCD (%s.%s) used in FDF is not declared in DEC files."
% (Guid, Name),
> -                        File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
> -                        Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
> -                    )
> -                else:
> -                    # Check whether Dynamic or DynamicEx PCD used in FDF file. If used, build break and give a error message.
> -                    if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPcdsKey \
> -                        or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE) in DecPcdsKey \
> -                        or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in DecPcdsKey:
> -                        continue
> -                    elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey:
> -                        EdkLogger.error(
> -                                'build',
> -                                PARSER_ERROR,
> -                                "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
> -                                File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
> -                                Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
> -                        )
> -    def CollectAllPcds(self):
> -
> -        for Arch in self.ArchList:
> -            Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> -            #
> -            # Explicitly collect platform's dynamic PCDs
> -            #
> -            Pa.CollectPlatformDynamicPcds()
> -            Pa.CollectFixedAtBuildPcds()
> -            self.AutoGenObjectList.append(Pa)
> -
> -    #
> -    # Generate Package level hash value
> -    #
> -    def GeneratePkgLevelHash(self):
> -        for Arch in self.ArchList:
> -            GlobalData.gPackageHash = {}
> -            if GlobalData.gUseHashCache:
> -                for Pkg in self.PkgSet[Arch]:
> -                    self._GenPkgLevelHash(Pkg)
> -
> -
> -    def CreateBuildOptionsFile(self):
> -        #
> -        # Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
> -        #
> -        content = 'gCommandLineDefines: '
> -        content += str(GlobalData.gCommandLineDefines)
> -        content += TAB_LINE_BREAK
> -        content += 'BuildOptionPcd: '
> -        content += str(GlobalData.BuildOptionPcd)
> -        content += TAB_LINE_BREAK
> -        content += 'Active Platform: '
> -        content += str(self.Platform)
> -        content += TAB_LINE_BREAK
> -        if self.FdfFile:
> -            content += 'Flash Image Definition: '
> -            content += str(self.FdfFile)
> -            content += TAB_LINE_BREAK
> -        SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
> -
> -    def CreatePcdTokenNumberFile(self):
> -        #
> -        # Create PcdToken Number file for Dynamic/DynamicEx Pcd.
> -        #
> -        PcdTokenNumber = 'PcdTokenNumber: '
> -        Pa = self.AutoGenObjectList[0]
> -        if Pa.PcdTokenNumber:
> -            if Pa.DynamicPcdList:
> -                for Pcd in Pa.DynamicPcdList:
> -                    PcdTokenNumber += TAB_LINE_BREAK
> -                    PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
> -                    PcdTokenNumber += ' : '
> -                    PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
> -        SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
> -
> -    def CreateModuleHashInfo(self):
> -        #
> -        # Get set of workspace metafiles
> -        #
> -        AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
> -
> -        #
> -        # Retrieve latest modified time of all metafiles
> -        #
> -        SrcTimeStamp = 0
> -        for f in AllWorkSpaceMetaFiles:
> -            if os.stat(f)[8] > SrcTimeStamp:
> -                SrcTimeStamp = os.stat(f)[8]
> -        self._SrcTimeStamp = SrcTimeStamp
> -
> -        if GlobalData.gUseHashCache:
> -            m = hashlib.md5()
> -            for files in AllWorkSpaceMetaFiles:
> -                if files.endswith('.dec'):
> -                    continue
> -                f = open(files, 'rb')
> -                Content = f.read()
> -                f.close()
> -                m.update(Content)
> -            SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), False)
> -            GlobalData.gPlatformHash = m.hexdigest()
> -
> -        #
> -        # Write metafile list to build directory
> -        #
> -        AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
> -        if os.path.exists (AutoGenFilePath):
> -            os.remove(AutoGenFilePath)
> -        if not os.path.exists(self.BuildDir):
> -            os.makedirs(self.BuildDir)
> -        with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
> -            for f in AllWorkSpaceMetaFiles:
> -                print(f, file=file)
> -        return True
> -
> -    def _GenPkgLevelHash(self, Pkg):
> -        if Pkg.PackageName in GlobalData.gPackageHash:
> -            return
> -
> -        PkgDir = os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageName)
> -        CreateDirectory(PkgDir)
> -        HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
> -        m = hashlib.md5()
> -        # Get .dec file's hash value
> -        f = open(Pkg.MetaFile.Path, 'rb')
> -        Content = f.read()
> -        f.close()
> -        m.update(Content)
> -        # Get include files hash value
> -        if Pkg.Includes:
> -            for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
> -                for Root, Dirs, Files in os.walk(str(inc)):
> -                    for File in sorted(Files):
> -                        File_Path = os.path.join(Root, File)
> -                        f = open(File_Path, 'rb')
> -                        Content = f.read()
> -                        f.close()
> -                        m.update(Content)
> -        SaveFileOnChange(HashFile, m.hexdigest(), False)
> -        GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
> -
> -    def _GetMetaFiles(self, Target, Toolchain):
> -        AllWorkSpaceMetaFiles = set()
> -        #
> -        # add fdf
> -        #
> -        if self.FdfFile:
> -            AllWorkSpaceMetaFiles.add (self.FdfFile.Path)
> -            for f in GlobalData.gFdfParser.GetAllIncludedFile():
> -                AllWorkSpaceMetaFiles.add (f.FileName)
> -        #
> -        # add dsc
> -        #
> -        AllWorkSpaceMetaFiles.add(self.MetaFile.Path)
> -
> -        #
> -        # add build_rule.txt & tools_def.txt
> -        #
> -        AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultBuildRuleFile))
> -        AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultToolsDefFile))
> -
> -        # add BuildOption metafile
> -        #
> -        AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'BuildOptions'))
> -
> -        # add PcdToken Number file for Dynamic/DynamicEx Pcd
> -        #
> -        AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTokenNumber'))
> -
> -        for Pa in self.AutoGenObjectList:
> -            AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile)
> -
> -        for Arch in self.ArchList:
> -            #
> -            # add dec
> -            #
> -            for Package in PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch).PackageList:
> -                AllWorkSpaceMetaFiles.add(Package.MetaFile.Path)
> -
> -            #
> -            # add included dsc
> -            #
> -            for filePath in self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]._RawData.IncludedFiles:
> -                AllWorkSpaceMetaFiles.add(filePath.Path)
> -
> -        return AllWorkSpaceMetaFiles
> -
> -    def _CheckPcdDefineAndType(self):
> -        PcdTypeSet = {TAB_PCDS_FIXED_AT_BUILD,
> -                      TAB_PCDS_PATCHABLE_IN_MODULE,
> -                      TAB_PCDS_FEATURE_FLAG,
> -                      TAB_PCDS_DYNAMIC,
> -                      TAB_PCDS_DYNAMIC_EX}
> -
> -        # This dict store PCDs which are not used by any modules with specified arches
> -        UnusedPcd = OrderedDict()
> -        for Pa in self.AutoGenObjectList:
> -            # Key of DSC's Pcds dictionary is PcdCName, TokenSpaceGuid
> -            for Pcd in Pa.Platform.Pcds:
> -                PcdType = Pa.Platform.Pcds[Pcd].Type
> -
> -                # If no PCD type, this PCD comes from FDF
> -                if not PcdType:
> -                    continue
> -
> -                # Try to remove Hii and Vpd suffix
> -                if PcdType.startswith(TAB_PCDS_DYNAMIC_EX):
> -                    PcdType = TAB_PCDS_DYNAMIC_EX
> -                elif PcdType.startswith(TAB_PCDS_DYNAMIC):
> -                    PcdType = TAB_PCDS_DYNAMIC
> -
> -                for Package in Pa.PackageList:
> -                    # Key of DEC's Pcds dictionary is PcdCName, TokenSpaceGuid, PcdType
> -                    if (Pcd[0], Pcd[1], PcdType) in Package.Pcds:
> -                        break
> -                    for Type in PcdTypeSet:
> -                        if (Pcd[0], Pcd[1], Type) in Package.Pcds:
> -                            EdkLogger.error(
> -                                'build',
> -                                FORMAT_INVALID,
> -                                "Type [%s] of PCD [%s.%s] in DSC file doesn't match the type [%s] defined in DEC file." \
> -                                % (Pa.Platform.Pcds[Pcd].Type, Pcd[1], Pcd[0], Type),
> -                                ExtraData=None
> -                            )
> -                            return
> -                else:
> -                    UnusedPcd.setdefault(Pcd, []).append(Pa.Arch)
> -
> -        for Pcd in UnusedPcd:
> -            EdkLogger.warn(
> -                'build',
> -                "The PCD was not specified by any INF module in the platform for the given architecture.\n"
> -                "\tPCD: [%s.%s]\n\tPlatform: [%s]\n\tArch: %s"
> -                % (Pcd[1], Pcd[0], os.path.basename(str(self.MetaFile)), str(UnusedPcd[Pcd])),
> -                ExtraData=None
> -            )
> -
> -    def __repr__(self):
> -        return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList))
> -
> -    ## Return the directory to store FV files
> -    @cached_property
> -    def FvDir(self):
> -        return path.join(self.BuildDir, TAB_FV_DIRECTORY)
> -
> -    ## Return the directory to store all intermediate and final files built
> -    @cached_property
> -    def BuildDir(self):
> -        return self.AutoGenObjectList[0].BuildDir
> -
> -    ## Return the build output directory platform specifies
> -    @cached_property
> -    def OutputDir(self):
> -        return self.Platform.OutputDirectory
> -
> -    ## Return platform name
> -    @cached_property
> -    def Name(self):
> -        return self.Platform.PlatformName
> -
> -    ## Return meta-file GUID
> -    @cached_property
> -    def Guid(self):
> -        return self.Platform.Guid
> -
> -    ## Return platform version
> -    @cached_property
> -    def Version(self):
> -        return self.Platform.Version
> -
> -    ## Return paths of tools
> -    @cached_property
> -    def ToolDefinition(self):
> -        return self.AutoGenObjectList[0].ToolDefinition
> -
> -    ## Return directory of platform makefile
> -    #
> -    #   @retval     string  Makefile directory
> -    #
> -    @cached_property
> -    def MakeFileDir(self):
> -        return self.BuildDir
> -
> -    ## Return build command string
> -    #
> -    #   @retval     string  Build command string
> -    #
> -    @cached_property
> -    def BuildCommand(self):
> -        # BuildCommand should be all the same. So just get one from platform AutoGen
> -        return self.AutoGenObjectList[0].BuildCommand
> -
> -    ## Check the PCDs token value conflict in each DEC file.
> -    #
> -    # Will cause build break and raise error message while two PCDs conflict.
> -    #
> -    # @return  None
> -    #
> -    def _CheckAllPcdsTokenValueConflict(self):
> -        for Pa in self.AutoGenObjectList:
> -            for Package in Pa.PackageList:
> -                PcdList = list(Package.Pcds.values())
> -                PcdList.sort(key=lambda x: int(x.TokenValue, 0))
> -                Count = 0
> -                while (Count < len(PcdList) - 1) :
> -                    Item = PcdList[Count]
> -                    ItemNext = PcdList[Count + 1]
> -                    #
> -                    # Make sure in the same token space the TokenValue should be unique
> -                    #
> -                    if (int(Item.TokenValue, 0) == int(ItemNext.TokenValue, 0)):
> -                        SameTokenValuePcdList = []
> -                        SameTokenValuePcdList.append(Item)
> -                        SameTokenValuePcdList.append(ItemNext)
> -                        RemainPcdListLength = len(PcdList) - Count - 2
> -                        for ValueSameCount in range(RemainPcdListLength):
> -                            if int(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount].TokenValue, 0) == int(Item.TokenValue, 0):
> -                                SameTokenValuePcdList.append(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount])
> -                            else:
> -                                break;
> -                        #
> -                        # Sort same token value PCD list with TokenGuid and TokenCName
> -                        #
> -                        SameTokenValuePcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
> -                        SameTokenValuePcdListCount = 0
> -                        while (SameTokenValuePcdListCount < len(SameTokenValuePcdList) - 1):
> -                            Flag = False
> -                            TemListItem = SameTokenValuePcdList[SameTokenValuePcdListCount]
> -                            TemListItemNext = SameTokenValuePcdList[SameTokenValuePcdListCount + 1]
> -
> -                            if (TemListItem.TokenSpaceGuidCName == TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName != TemListItemNext.TokenCName):
> -                                for PcdItem in GlobalData.MixedPcd:
> -                                    if (TemListItem.TokenCName, TemListItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \
> -                                    (TemListItemNext.TokenCName, TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
> -                                        Flag = True
> -                                if not Flag:
> -                                    EdkLogger.error(
> -                                            'build',
> -                                            FORMAT_INVALID,
> -                                            "The TokenValue [%s] of PCD [%s.%s] is conflict with: [%s.%s] in %s"\
> -                                            % (TemListItem.TokenValue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemNext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package),
> -                                            ExtraData=None
> -                                            )
> -                            SameTokenValuePcdListCount += 1
> -                        Count += SameTokenValuePcdListCount
> -                    Count += 1
> -
> -                PcdList = list(Package.Pcds.values())
> -                PcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
> -                Count = 0
> -                while (Count < len(PcdList) - 1) :
> -                    Item = PcdList[Count]
> -                    ItemNext = PcdList[Count + 1]
> -                    #
> -                    # Check PCDs with same TokenSpaceGuidCName.TokenCName have same token value as well.
> -                    #
> -                    if (Item.TokenSpaceGuidCName == ItemNext.TokenSpaceGuidCName) and (Item.TokenCName == ItemNext.TokenCName) and (int(Item.TokenValue, 0) != int(ItemNext.TokenValue, 0)):
> -                        EdkLogger.error(
> -                                    'build',
> -                                    FORMAT_INVALID,
> -                                    "The TokenValue [%s] of PCD [%s.%s] in %s defined in two places should be same as well."\
> -                                    % (Item.TokenValue, Item.TokenSpaceGuidCName, Item.TokenCName, Package),
> -                                    ExtraData=None
> -                                    )
> -                    Count += 1
> -    ## Generate fds command
> -    @property
> -    def GenFdsCommand(self):
> -        return (GenMake.TopLevelMakefile(self)._TEMPLATE_.Replace(GenMake.TopLevelMakefile(self)._TemplateDict)).strip()
> -
> -    @property
> -    def GenFdsCommandDict(self):
> -        FdsCommandDict = {}
> -        LogLevel = EdkLogger.GetLevel()
> -        if LogLevel == EdkLogger.VERBOSE:
> -            FdsCommandDict["verbose"] = True
> -        elif LogLevel <= EdkLogger.DEBUG_9:
> -            FdsCommandDict["debug"] = LogLevel - 1
> -        elif LogLevel == EdkLogger.QUIET:
> -            FdsCommandDict["quiet"] = True
> -
> -        if GlobalData.gEnableGenfdsMultiThread:
> -            FdsCommandDict["GenfdsMultiThread"] = True
> -        if GlobalData.gIgnoreSource:
> -            FdsCommandDict["IgnoreSources"] = True
> -
> -        FdsCommandDict["OptionPcd"] = []
> -        for pcd in GlobalData.BuildOptionPcd:
> -            if pcd[2]:
> -                pcdname = '.'.join(pcd[0:3])
> -            else:
> -                pcdname = '.'.join(pcd[0:2])
> -            if pcd[3].startswith('{'):
> -                FdsCommandDict["OptionPcd"].append(pcdname + '=' + 'H' + '"' + pcd[3] + '"')
> -            else:
> -                FdsCommandDict["OptionPcd"].append(pcdname + '=' + pcd[3])
> -
> -        MacroList = []
> -        # macros passed to GenFds
> -        MacroDict = {}
> -        MacroDict.update(GlobalData.gGlobalDefines)
> -        MacroDict.update(GlobalData.gCommandLineDefines)
> -        for MacroName in MacroDict:
> -            if MacroDict[MacroName] != "":
> -                MacroList.append('"%s=%s"' % (MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
> -            else:
> -                MacroList.append('"%s"' % MacroName)
> -        FdsCommandDict["macro"] = MacroList
> -
> -        FdsCommandDict["fdf_file"] = [self.FdfFile]
> -        FdsCommandDict["build_target"] = self.BuildTarget
> -        FdsCommandDict["toolchain_tag"] = self.ToolChain
> -        FdsCommandDict["active_platform"] = str(self)
> -
> -        FdsCommandDict["conf_directory"] = GlobalData.gConfDirectory
> -        FdsCommandDict["build_architecture_list"] = ','.join(self.ArchList)
> -        FdsCommandDict["platform_build_directory"] = self.BuildDir
> -
> -        FdsCommandDict["fd"] = self.FdTargetList
> -        FdsCommandDict["fv"] = self.FvTargetList
> -        FdsCommandDict["cap"] = self.CapTargetList
> -        return FdsCommandDict
> -
> -    ## Create makefile for the platform and modules in it
> -    #
> -    #   @param      CreateDepsMakeFile      Flag indicating if the makefile for
> -    #                                       modules will be created as well
> -    #
> -    def CreateMakeFile(self, CreateDepsMakeFile=False):
> -        if not CreateDepsMakeFile:
> -            return
> -        for Pa in self.AutoGenObjectList:
> -            Pa.CreateMakeFile(True)
> -
> -    ## Create autogen code for platform and modules
> -    #
> -    #  Since there's no autogen code for platform, this method will do nothing
> -    #  if CreateModuleCodeFile is set to False.
> -    #
> -    #   @param      CreateDepsCodeFile      Flag indicating if creating module's
> -    #                                       autogen code file or not
> -    #
> -    def CreateCodeFile(self, CreateDepsCodeFile=False):
> -        if not CreateDepsCodeFile:
> -            return
> -        for Pa in self.AutoGenObjectList:
> -            Pa.CreateCodeFile(True)
> -
> -    ## Create AsBuilt INF file the platform
> -    #
> -    def CreateAsBuiltInf(self):
> -        return
> -
> -
> -## AutoGen class for platform
> -#
> -#  PlatformAutoGen class will process the original information in platform
> -#  file in order to generate makefile for platform.
> -#
> -class PlatformAutoGen(AutoGen):
> -    # call super().__init__ then call the worker function with different parameter count
> -    def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> -        if not hasattr(self, "_Init"):
> -            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch)
> -            self._Init = True
> -    #
> -    # Used to store all PCDs for both PEI and DXE phase, in order to generate
> -    # correct PCD database
> -    #
> -    _DynaPcdList_ = []
> -    _NonDynaPcdList_ = []
> -    _PlatformPcds = {}
> -
> -    #
> -    # The priority list while override build option
> -    #
> -    PrioList = {"0x11111" : 16,     #  TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
> -                "0x01111" : 15,     #  ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
> -                "0x10111" : 14,     #  TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
> -                "0x00111" : 13,     #  ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
> -                "0x11011" : 12,     #  TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
> -                "0x01011" : 11,     #  ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
> -                "0x10011" : 10,     #  TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
> -                "0x00011" : 9,      #  ******_*********_****_COMMANDTYPE_ATTRIBUTE
> -                "0x11101" : 8,      #  TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
> -                "0x01101" : 7,      #  ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
> -                "0x10101" : 6,      #  TARGET_*********_ARCH_***********_ATTRIBUTE
> -                "0x00101" : 5,      #  ******_*********_ARCH_***********_ATTRIBUTE
> -                "0x11001" : 4,      #  TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
> -                "0x01001" : 3,      #  ******_TOOLCHAIN_****_***********_ATTRIBUTE
> -                "0x10001" : 2,      #  TARGET_*********_****_***********_ATTRIBUTE
> -                "0x00001" : 1}      #  ******_*********_****_***********_ATTRIBUTE (Lowest)
> -
> -    ## Initialize PlatformAutoGen
> -    #
> -    #
> -    #   @param      Workspace       WorkspaceAutoGen object
> -    #   @param      PlatformFile    Platform file (DSC file)
> -    #   @param      Target          Build target (DEBUG, RELEASE)
> -    #   @param      Toolchain       Name of tool chain
> -    #   @param      Arch            arch of the platform supports
> -    #
> -    def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain, Arch):
> -        EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s]" % (PlatformFile, Arch))
> -        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (PlatformFile, Arch, Toolchain, Target)
> -
> -        self.MetaFile = PlatformFile
> -        self.Workspace = Workspace
> -        self.WorkspaceDir = Workspace.WorkspaceDir
> -        self.ToolChain = Toolchain
> -        self.BuildTarget = Target
> -        self.Arch = Arch
> -        self.SourceDir = PlatformFile.SubDir
> -        self.FdTargetList = self.Workspace.FdTargetList
> -        self.FvTargetList = self.Workspace.FvTargetList
> -        # get the original module/package/platform objects
> -        self.BuildDatabase = Workspace.BuildDatabase
> -        self.DscBuildDataObj = Workspace.Platform
> -
> -        # flag indicating if the makefile/C-code file has been created or not
> -        self.IsMakeFileCreated = False
> -
> -        self._DynamicPcdList = None    # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
> -        self._NonDynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
> -
> -        self._AsBuildInfList = []
> -        self._AsBuildModuleList = []
> -
> -        self.VariableInfo = None
> -
> -        if GlobalData.gFdfParser is not None:
> -            self._AsBuildInfList = GlobalData.gFdfParser.Profile.InfList
> -            for Inf in self._AsBuildInfList:
> -                InfClass = PathClass(NormPath(Inf), GlobalData.gWorkspace, self.Arch)
> -                M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
> -                if not M.IsBinaryModule:
> -                    continue
> -                self._AsBuildModuleList.append(InfClass)
> -        # get library/modules for build
> -        self.LibraryBuildDirectoryList = []
> -        self.ModuleBuildDirectoryList = []
> -
> -        return True
> -
> -    ## hash() operator of PlatformAutoGen
> -    #
> -    #  The platform file path and arch string will be used to represent
> -    #  hash value of this object
> -    #
> -    #   @retval   int Hash value of the platform file path and arch
> -    #
> -    @cached_class_function
> -    def __hash__(self):
> -        return hash((self.MetaFile, self.Arch))
> -
> -    @cached_class_function
> -    def __repr__(self):
> -        return "%s [%s]" % (self.MetaFile, self.Arch)
> -
> -    ## Create autogen code for platform and modules
> -    #
> -    #  Since there's no autogen code for platform, this method will do nothing
> -    #  if CreateModuleCodeFile is set to False.
> -    #
> -    #   @param      CreateModuleCodeFile    Flag indicating if creating module's
> -    #                                       autogen code file or not
> -    #
> -    @cached_class_function
> -    def CreateCodeFile(self, CreateModuleCodeFile=False):
> -        # only module has code to be created, so do nothing if CreateModuleCodeFile is False
> -        if not CreateModuleCodeFile:
> -            return
> -
> -        for Ma in self.ModuleAutoGenList:
> -            Ma.CreateCodeFile(True)
> -
> -    ## Generate Fds Command
> -    @cached_property
> -    def GenFdsCommand(self):
> -        return self.Workspace.GenFdsCommand
> -
> -    ## Create makefile for the platform and modules in it
> -    #
> -    #   @param      CreateModuleMakeFile    Flag indicating if the makefile for
> -    #                                       modules will be created as well
> -    #
> -    def CreateMakeFile(self, CreateModuleMakeFile=False, FfsCommand = {}):
> -        if CreateModuleMakeFile:
> -            for Ma in self._MaList:
> -                key = (Ma.MetaFile.File, self.Arch)
> -                if key in FfsCommand:
> -                    Ma.CreateMakeFile(True, FfsCommand[key])
> -                else:
> -                    Ma.CreateMakeFile(True)
> -
> -        # no need to create makefile for the platform more than once
> -        if self.IsMakeFileCreated:
> -            return
> -
> -        # create library/module build dirs for platform
> -        Makefile = GenMake.PlatformMakefile(self)
> -        self.LibraryBuildDirectoryList = Makefile.GetLibraryBuildDirectoryList()
> -        self.ModuleBuildDirectoryList = Makefile.GetModuleBuildDirectoryList()
> -
> -        self.IsMakeFileCreated = True
> -
> -    @property
> -    def AllPcdList(self):
> -        return self.DynamicPcdList + self.NonDynamicPcdList
> -    ## Deal with Shared FixedAtBuild Pcds
> -    #
> -    def CollectFixedAtBuildPcds(self):
> -        for LibAuto in self.LibraryAutoGenList:
> -            FixedAtBuildPcds = {}
> -            ShareFixedAtBuildPcdsSameValue = {}
> -            for Module in LibAuto.ReferenceModules:
> -                for Pcd in set(Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds):
> -                    DefaultValue = Pcd.DefaultValue
> -                    # Cover the case: DSC component override the Pcd value and the Pcd only used in one Lib
> -                    if Pcd in Module.LibraryPcdList:
> -                        Index = Module.LibraryPcdList.index(Pcd)
> -                        DefaultValue = Module.LibraryPcdList[Index].DefaultValue
> -                    key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> -                    if key not in FixedAtBuildPcds:
> -                        ShareFixedAtBuildPcdsSameValue[key] = True
> -                        FixedAtBuildPcds[key] = DefaultValue
> -                    else:
> -                        if FixedAtBuildPcds[key] != DefaultValue:
> -                            ShareFixedAtBuildPcdsSameValue[key] = False
> -            for Pcd in LibAuto.FixedAtBuildPcds:
> -                key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> -                if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
> -                    continue
> -                else:
> -                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
> -                    if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
> -                        continue
> -                if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
> -                    LibAuto.ConstPcd[key] = FixedAtBuildPcds[key]
> -
> -    def CollectVariables(self, DynamicPcdSet):
> -        VpdRegionSize = 0
> -        VpdRegionBase = 0
> -        if self.Workspace.FdfFile:
> -            FdDict = self.Workspace.FdfProfile.FdDict[GlobalData.gFdfParser.CurrentFdName]
> -            for FdRegion in FdDict.RegionList:
> -                for item in FdRegion.RegionDataList:
> -                    if self.Platform.VpdToolGuid.strip() and self.Platform.VpdToolGuid in item:
> -                        VpdRegionSize = FdRegion.Size
> -                        VpdRegionBase = FdRegion.Offset
> -                        break
> -
> -        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj.SkuIds)
> -        VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
> -        VariableInfo.SetVpdRegionOffset(VpdRegionBase)
> -        Index = 0
> -        for Pcd in DynamicPcdSet:
> -            pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> -            for SkuName in Pcd.SkuInfoList:
> -                Sku = Pcd.SkuInfoList[SkuName]
> -                SkuId = Sku.SkuId
> -                if SkuId is None or SkuId == '':
> -                    continue
> -                if len(Sku.VariableName) > 0:
> -                    if Sku.VariableAttribute and 'NV' not in Sku.VariableAttribute:
> -                        continue
> -                    VariableGuidStructure = Sku.VariableGuidValue
> -                    VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
> -                    for StorageName in Sku.DefaultStoreDict:
> -                        VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else StringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAttribute['DscPosition'], Pcd.CustomAttribute.get('IsStru',False)))
> -                Index += 1
> -        return VariableInfo
> -
> -    def UpdateNVStoreMaxSize(self, OrgVpdFile):
> -        if self.VariableInfo:
> -            VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
> -            PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
> -
> -            if PcdNvStoreDfBuffer:
> -                if os.path.exists(VpdMapFilePath):
> -                    OrgVpdFile.Read(VpdMapFilePath)
> -                    PcdItems = OrgVpdFile.GetOffset(PcdNvStoreDfBuffer[0])
> -                    NvStoreOffset = list(PcdItems.values())[0].strip() if PcdItems else '0'
> -                else:
> -                    EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
> -
> -                NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
> -                default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
> -                maxsize = self.VariableInfo.VpdRegionSize - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
> -                var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
> -
> -                if var_data and default_skuobj:
> -                    default_skuobj.DefaultValue = var_data
> -                    PcdNvStoreDfBuffer[0].DefaultValue = var_data
> -                    PcdNvStoreDfBuffer[0].SkuInfoList.clear()
> -                    PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = default_skuobj
> -                    PcdNvStoreDfBuffer[0].MaxDatumSize = str(len(default_skuobj.DefaultValue.split(",")))
> -
> -        return OrgVpdFile
> -
> -    ## Collect dynamic PCDs
> -    #
> -    #  Gather dynamic PCDs list from each module and their settings from platform
> -    #  This interface should be invoked explicitly when platform action is created.
> -    #
> -    def CollectPlatformDynamicPcds(self):
> -        for key in self.Platform.Pcds:
> -            for SinglePcd in GlobalData.MixedPcd:
> -                if (self.Platform.Pcds[key].TokenCName, self.Platform.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
> -                    for item in GlobalData.MixedPcd[SinglePcd]:
> -                        Pcd_Type = item[0].split('_')[-1]
> -                        if (Pcd_Type == self.Platform.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and self.Platform.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
> -                           (Pcd_Type == TAB_PCDS_DYNAMIC and self.Platform.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET):
> -                            Value = self.Platform.Pcds[key]
> -                            Value.TokenCName = self.Platform.Pcds[key].TokenCName + '_' + Pcd_Type
> -                            if len(key) == 2:
> -                                newkey = (Value.TokenCName, key[1])
> -                            elif len(key) == 3:
> -                                newkey = (Value.TokenCName, key[1], key[2])
> -                            del self.Platform.Pcds[key]
> -                            self.Platform.Pcds[newkey] = Value
> -                            break
> -                    break
> -
> -        # for gathering error information
> -        NoDatumTypePcdList = set()
> -        FdfModuleList = []
> -        for InfName in self._AsBuildInfList:
> -            InfName = mws.join(self.WorkspaceDir, InfName)
> -            FdfModuleList.append(os.path.normpath(InfName))
> -        for M in self._MaList:
> -# F is the Module for which M is the module autogen
> -            for PcdFromModule in M.ModulePcdList + M.LibraryPcdList:
> -                # make sure that the "VOID*" kind of datum has MaxDatumSize set
> -                if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
> -                    NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile))
> -
> -                # Check the PCD from Binary INF or Source INF
> -                if M.IsBinaryModule == True:
> -                    PcdFromModule.IsFromBinaryInf = True
> -
> -                # Check the PCD from DSC or not
> -                PcdFromModule.IsFromDsc = (PcdFromModule.TokenCName, PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds
> -
> -                if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                    if M.MetaFile.Path not in FdfModuleList:
> -                        # If one of the Source built modules listed in the DSC is not listed
> -                        # in FDF modules, and the INF lists a PCD can only use the PcdsDynamic
> -                        # access method (it is only listed in the DEC file that declares the
> -                        # PCD as PcdsDynamic), then build tool will report warning message
> -                        # notify the PI that they are attempting to build a module that must
> -                        # be included in a flash image in order to be functional. These Dynamic
> -                        # PCD will not be added into the Database unless it is used by other
> -                        # modules that are included in the FDF file.
> -                        if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET and \
> -                            PcdFromModule.IsFromBinaryInf == False:
> -                            # Print warning message to let the developer make a determine.
> -                            continue
> -                        # If one of the Source built modules listed in the DSC is not listed in
> -                        # FDF modules, and the INF lists a PCD can only use the PcdsDynamicEx
> -                        # access method (it is only listed in the DEC file that declares the
> -                        # PCD as PcdsDynamicEx), then DO NOT break the build; DO NOT add the
> -                        # PCD to the Platform's PCD Database.
> -                        if PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                            continue
> -                    #
> -                    # If a dynamic PCD used by a PEM module/PEI module & DXE module,
> -                    # it should be stored in Pcd PEI database, If a dynamic only
> -                    # used by DXE module, it should be stored in DXE PCD database.
> -                    # The default Phase is DXE
> -                    #
> -                    if M.ModuleType in SUP_MODULE_SET_PEI:
> -                        PcdFromModule.Phase = "PEI"
> -                    if PcdFromModule not in self._DynaPcdList_:
> -                        self._DynaPcdList_.append(PcdFromModule)
> -                    elif PcdFromModule.Phase == 'PEI':
> -                        # overwrite any the same PCD existing, if Phase is PEI
> -                        Index = self._DynaPcdList_.index(PcdFromModule)
> -                        self._DynaPcdList_[Index] = PcdFromModule
> -                elif PcdFromModule not in self._NonDynaPcdList_:
> -                    self._NonDynaPcdList_.append(PcdFromModule)
> -                elif PcdFromModule in self._NonDynaPcdList_ and PcdFromModule.IsFromBinaryInf == True:
> -                    Index = self._NonDynaPcdList_.index(PcdFromModule)
> -                    if self._NonDynaPcdList_[Index].IsFromBinaryInf == False:
> -                        #The PCD from Binary INF will override the same one from source INF
> -                        self._NonDynaPcdList_.remove (self._NonDynaPcdList_[Index])
> -                        PcdFromModule.Pending = False
> -                        self._NonDynaPcdList_.append (PcdFromModule)
> -        DscModuleSet = {os.path.normpath(ModuleInf.Path) for ModuleInf in self.Platform.Modules}
> -        # add the PCD from modules that listed in FDF but not in DSC to Database
> -        for InfName in FdfModuleList:
> -            if InfName not in DscModuleSet:
> -                InfClass = PathClass(InfName)
> -                M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
> -                # If a module INF in FDF but not in current arch's DSC module list, it must be module (either binary or source)
> -                # for different Arch. PCDs in source module for different Arch is already added before, so skip the source module here.
> -                # For binary module, if in current arch, we need to list the PCDs into database.
> -                if not M.IsBinaryModule:
> -                    continue
> -                # Override the module PCD setting by platform setting
> -                ModulePcdList = self.ApplyPcdSetting(M, M.Pcds)
> -                for PcdFromModule in ModulePcdList:
> -                    PcdFromModule.IsFromBinaryInf = True
> -                    PcdFromModule.IsFromDsc = False
> -                    # Only allow the DynamicEx and Patchable PCD in AsBuild INF
> -                    if PcdFromModule.Type not in PCD_DYNAMIC_EX_TYPE_SET and PcdFromModule.Type not in TAB_PCDS_PATCHABLE_IN_MODULE:
> -                        EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
> -                                        File=self.MetaFile,
> -                                        ExtraData="\n\tExisted %s PCD %s in:\n\t\t%s\n"
> -                                        % (PcdFromModule.Type, PcdFromModule.TokenCName, InfName))
> -                    # make sure that the "VOID*" kind of datum has MaxDatumSize set
> -                    if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
> -                        NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName))
> -                    if M.ModuleType in SUP_MODULE_SET_PEI:
> -                        PcdFromModule.Phase = "PEI"
> -                    if PcdFromModule not in self._DynaPcdList_ and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                        self._DynaPcdList_.append(PcdFromModule)
> -                    elif PcdFromModule not in self._NonDynaPcdList_ and PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE:
> -                        self._NonDynaPcdList_.append(PcdFromModule)
> -                    if PcdFromModule in self._DynaPcdList_ and PcdFromModule.Phase == 'PEI' and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                        # Overwrite the phase of any the same PCD existing, if Phase is PEI.
> -                        # It is to solve the case that a dynamic PCD used by a PEM module/PEI
> -                        # module & DXE module at a same time.
> -                        # Overwrite the type of the PCDs in source INF by the type of AsBuild
> -                        # INF file as DynamicEx.
> -                        Index = self._DynaPcdList_.index(PcdFromModule)
> -                        self._DynaPcdList_[Index].Phase = PcdFromModule.Phase
> -                        self._DynaPcdList_[Index].Type = PcdFromModule.Type
> -        for PcdFromModule in self._NonDynaPcdList_:
> -            # If a PCD is not listed in the DSC file, but binary INF files used by
> -            # this platform all (that use this PCD) list the PCD in a [PatchPcds]
> -            # section, AND all source INF files used by this platform the build
> -            # that use the PCD list the PCD in either a [Pcds] or [PatchPcds]
> -            # section, then the tools must NOT add the PCD to the Platform's PCD
> -            # Database; the build must assign the access method for this PCD as
> -            # PcdsPatchableInModule.
> -            if PcdFromModule not in self._DynaPcdList_:
> -                continue
> -            Index = self._DynaPcdList_.index(PcdFromModule)
> -            if PcdFromModule.IsFromDsc == False and \
> -                PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE and \
> -                PcdFromModule.IsFromBinaryInf == True and \
> -                self._DynaPcdList_[Index].IsFromBinaryInf == False:
> -                Index = self._DynaPcdList_.index(PcdFromModule)
> -                self._DynaPcdList_.remove (self._DynaPcdList_[Index])
> -
> -        # print out error information and break the build, if error found
> -        if len(NoDatumTypePcdList) > 0:
> -            NoDatumTypePcdListString = "\n\t\t".join(NoDatumTypePcdList)
> -            EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
> -                            File=self.MetaFile,
> -                            ExtraData="\n\tPCD(s) without MaxDatumSize:\n\t\t%s\n"
> -                            % NoDatumTypePcdListString)
> -        self._NonDynamicPcdList = self._NonDynaPcdList_
> -        self._DynamicPcdList = self._DynaPcdList_
> -        #
> -        # Sort dynamic PCD list to:
> -        # 1) If PCD's datum type is VOID* and value is unicode string which starts with L, the PCD item should
> -        #    try to be put header of dynamicd List
> -        # 2) If PCD is HII type, the PCD item should be put after unicode type PCD
> -        #
> -        # The reason of sorting is make sure the unicode string is in double-byte alignment in string table.
> -        #
> -        UnicodePcdArray = set()
> -        HiiPcdArray = set()
> -        OtherPcdArray = set()
> -        VpdPcdDict = {}
> -        VpdFile = VpdInfoFile.VpdInfoFile()
> -        NeedProcessVpdMapFile = False
> -
> -        for pcd in self.Platform.Pcds:
> -            if pcd not in self._PlatformPcds:
> -                self._PlatformPcds[pcd] = self.Platform.Pcds[pcd]
> -
> -        for item in self._PlatformPcds:
> -            if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
> -                self._PlatformPcds[item].DatumType = TAB_VOID
> -
> -        if (self.Workspace.ArchList[-1] == self.Arch):
> -            for Pcd in self._DynamicPcdList:
> -                # just pick the a value to determine whether is unicode string type
> -                Sku = Pcd.SkuInfoList.get(TAB_DEFAULT)
> -                Sku.VpdOffset = Sku.VpdOffset.strip()
> -
> -                if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
> -                    Pcd.DatumType = TAB_VOID
> -
> -                # if found PCD which datum value is unicode string the insert to left size of UnicodeIndex
> -                # if found HII type PCD then insert to right of UnicodeIndex
> -                if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
> -                    VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
> -
> -            #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
> -            PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
> -            if PcdNvStoreDfBuffer:
> -                self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
> -                vardump = self.VariableInfo.dump()
> -                if vardump:
> -                    #
> -                    #According to PCD_DATABASE_INIT in edk2\MdeModulePkg\Include\Guid\PcdDataBaseSignatureGuid.h,
> -                    #the max size for string PCD should not exceed USHRT_MAX 65535(0xffff).
> -                    #typedef UINT16 SIZE_INFO;
> -                    #//SIZE_INFO  SizeTable[];
> -                    if len(vardump.split(",")) > 0xffff:
> -                        EdkLogger.error("build", RESOURCE_OVERFLOW, 'The current length of PCD %s value is %d, it exceeds to the max size of String PCD.' %(".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName,PcdNvStoreDfBuffer.TokenCName]) ,len(vardump.split(","))))
> -                    PcdNvStoreDfBuffer.DefaultValue = vardump
> -                    for skuname in PcdNvStoreDfBuffer.SkuInfoList:
> -                        PcdNvStoreDfBuffer.SkuInfoList[skuname].DefaultValue = vardump
> -                        PcdNvStoreDfBuffer.MaxDatumSize = str(len(vardump.split(",")))
> -            else:
> -                #If the end user define [DefaultStores] and [XXX.Menufacturing] in DSC, but forget to configure PcdNvStoreDefaultValueBuffer to PcdsDynamicVpd
> -                if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDefinedDefaultStoresFlag]:
> -                    EdkLogger.warn("build", "PcdNvStoreDefaultValueBuffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultStores is enabled for this platform.\n%s" %self.Platform.MetaFile.Path)
> -            PlatformPcds = sorted(self._PlatformPcds.keys())
> -            #
> -            # Add VPD type PCD into VpdFile and determine whether the VPD PCD need to be fixed up.
> -            #
> -            VpdSkuMap = {}
> -            for PcdKey in PlatformPcds:
> -                Pcd = self._PlatformPcds[PcdKey]
> -                if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD] and \
> -                   PcdKey in VpdPcdDict:
> -                    Pcd = VpdPcdDict[PcdKey]
> -                    SkuValueMap = {}
> -                    DefaultSku = Pcd.SkuInfoList.get(TAB_DEFAULT)
> -                    if DefaultSku:
> -                        PcdValue = DefaultSku.DefaultValue
> -                        if PcdValue not in SkuValueMap:
> -                            SkuValueMap[PcdValue] = []
> -                            VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.VpdOffset)
> -                        SkuValueMap[PcdValue].append(DefaultSku)
> -
> -                    for (SkuName, Sku) in Pcd.SkuInfoList.items():
> -                        Sku.VpdOffset = Sku.VpdOffset.strip()
> -                        PcdValue = Sku.DefaultValue
> -                        if PcdValue == "":
> -                            PcdValue = Pcd.DefaultValue
> -                        if Sku.VpdOffset != TAB_STAR:
> -                            if PcdValue.startswith("{"):
> -                                Alignment = 8
> -                            elif PcdValue.startswith("L"):
> -                                Alignment = 2
> -                            else:
> -                                Alignment = 1
> -                            try:
> -                                VpdOffset = int(Sku.VpdOffset)
> -                            except:
> -                                try:
> -                                    VpdOffset = int(Sku.VpdOffset, 16)
> -                                except:
> -                                    EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> -                            if VpdOffset % Alignment != 0:
> -                                if PcdValue.startswith("{"):
> -                                    EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName), File=self.MetaFile)
> -                                else:
> -                                    EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
> -                        if PcdValue not in SkuValueMap:
> -                            SkuValueMap[PcdValue] = []
> -                            VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
> -                        SkuValueMap[PcdValue].append(Sku)
> -                        # if the offset of a VPD is *, then it need to be fixed up by third party tool.
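[Editorial aside, not part of the quoted patch: the alignment rule in the hunk above — 8-byte alignment for `{...}` byte-array values, 2-byte for `L"..."` unicode strings, 1-byte otherwise, with decimal-then-hex offset parsing — can be sketched standalone. The helper names below are mine, not BaseTools API.]

```python
def vpd_alignment(pcd_value: str) -> int:
    # Mirrors the quoted rule: byte arrays ("{...}") need 8-byte alignment,
    # unicode strings ('L"..."') 2-byte, everything else 1-byte.
    if pcd_value.startswith("{"):
        return 8
    if pcd_value.startswith("L"):
        return 2
    return 1

def parse_vpd_offset(offset: str) -> int:
    # The quoted code tries int() first, then int(..., 16), so both
    # decimal and hexadecimal offsets are accepted.
    try:
        return int(offset)
    except ValueError:
        return int(offset, 16)  # re-raises ValueError if still invalid

def check_vpd_offset(pcd_value: str, offset: str) -> bool:
    # True when the offset satisfies the value's alignment requirement.
    return parse_vpd_offset(offset) % vpd_alignment(pcd_value) == 0
```

For example, `check_vpd_offset("{0x01, 0x02}", "0x10")` holds (16 is 8-byte aligned), while `check_vpd_offset('L"Setup"', "0x3")` does not (3 is not 2-byte aligned).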
> -                        if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
> -                            NeedProcessVpdMapFile = True
> -                            if self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == '':
> -                                EdkLogger.error("Build", FILE_NOT_FOUND, \
> -                                                "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
> -
> -                    VpdSkuMap[PcdKey] = SkuValueMap
> -            #
> -            # Fix the PCDs define in VPD PCD section that never referenced by module.
> -            # An example is PCD for signature usage.
> -            #
> -            for DscPcd in PlatformPcds:
> -                DscPcdEntry = self._PlatformPcds[DscPcd]
> -                if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
> -                    if not (self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == ''):
> -                        FoundFlag = False
> -                        for VpdPcd in VpdFile._VpdArray:
> -                            # This PCD has been referenced by module
> -                            if (VpdPcd.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
> -                               (VpdPcd.TokenCName == DscPcdEntry.TokenCName):
> -                                FoundFlag = True
> -
> -                        # Not found, it should be signature
> -                        if not FoundFlag :
> -                            # just pick the a value to determine whether is unicode string type
> -                            SkuValueMap = {}
> -                            SkuObjList = list(DscPcdEntry.SkuInfoList.items())
> -                            DefaultSku = DscPcdEntry.SkuInfoList.get(TAB_DEFAULT)
> -                            if DefaultSku:
> -                                defaultindex = SkuObjList.index((TAB_DEFAULT, DefaultSku))
> -                                SkuObjList[0], SkuObjList[defaultindex] = SkuObjList[defaultindex], SkuObjList[0]
> -                            for (SkuName, Sku) in SkuObjList:
> -                                Sku.VpdOffset = Sku.VpdOffset.strip()
> -
> -                                # Need to iterate DEC pcd information to get the value & datumtype
> -                                for eachDec in self.PackageList:
> -                                    for DecPcd in eachDec.Pcds:
> -                                        DecPcdEntry = eachDec.Pcds[DecPcd]
> -                                        if (DecPcdEntry.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
> -                                           (DecPcdEntry.TokenCName == DscPcdEntry.TokenCName):
> -                                            # Print warning message to let the developer make a determine.
> -                                            EdkLogger.warn("build", "Unreferenced vpd pcd used!",
> -                                                           File=self.MetaFile, \
> -                                                           ExtraData = "PCD: %s.%s used in the DSC file %s is unreferenced." \
> -                                                           %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
> -
> -                                            DscPcdEntry.DatumType = DecPcdEntry.DatumType
> -                                            DscPcdEntry.DefaultValue = DecPcdEntry.DefaultValue
> -                                            DscPcdEntry.TokenValue = DecPcdEntry.TokenValue
> -                                            DscPcdEntry.TokenSpaceGuidValue = eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName]
> -                                            # Only fix the value while no value provided in DSC file.
> -                                            if not Sku.DefaultValue:
> -                                                DscPcdEntry.SkuInfoList[list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue = DecPcdEntry.DefaultValue
> -
> -                                if DscPcdEntry not in self._DynamicPcdList:
> -                                    self._DynamicPcdList.append(DscPcdEntry)
> -                                Sku.VpdOffset = Sku.VpdOffset.strip()
> -                                PcdValue = Sku.DefaultValue
> -                                if PcdValue == "":
> -                                    PcdValue = DscPcdEntry.DefaultValue
> -                                if Sku.VpdOffset != TAB_STAR:
> -                                    if PcdValue.startswith("{"):
> -                                        Alignment = 8
> -                                    elif PcdValue.startswith("L"):
> -                                        Alignment = 2
> -                                    else:
> -                                        Alignment = 1
> -                                    try:
> -                                        VpdOffset = int(Sku.VpdOffset)
> -                                    except:
> -                                        try:
> -                                            VpdOffset = int(Sku.VpdOffset, 16)
> -                                        except:
> -                                            EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName))
> -                                    if VpdOffset % Alignment != 0:
> -                                        if PcdValue.startswith("{"):
> -                                            EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName), File=self.MetaFile)
> -                                        else:
> -                                            EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
> -                                    if PcdValue not in SkuValueMap:
> -                                        SkuValueMap[PcdValue] = []
> -                                        VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
> -                                    SkuValueMap[PcdValue].append(Sku)
> -                                    if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
> -                                        NeedProcessVpdMapFile = True
> -                                if DscPcdEntry.DatumType == TAB_VOID and PcdValue.startswith("L"):
> -                                    UnicodePcdArray.add(DscPcdEntry)
> -                                elif len(Sku.VariableName) > 0:
> -                                    HiiPcdArray.add(DscPcdEntry)
> -                                else:
> -                                    OtherPcdArray.add(DscPcdEntry)
> -
> -                                # if the offset of a VPD is *, then it need to be fixed up by third party tool.
> -                            VpdSkuMap[DscPcd] = SkuValueMap
> -            if (self.Platform.FlashDefinition is None or self.Platform.FlashDefinition == '') and \
> -               VpdFile.GetCount() != 0:
> -                EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
> -                                "Fail to get FLASH_DEFINITION definition in DSC file %s which is required when DSC contains VPD PCD." % str(self.Platform.MetaFile))
> -
> -            if VpdFile.GetCount() != 0:
> -
> -                self.FixVpdOffset(VpdFile)
> -
> -                self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile))
> -                PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
> -                if PcdNvStoreDfBuffer:
> -                    PcdName,PcdGuid = PcdNvStoreDfBuffer[0].TokenCName, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName
> -                    if (PcdName,PcdGuid) in VpdSkuMap:
> -                        DefaultSku = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
> -                        VpdSkuMap[(PcdName,PcdGuid)] = {DefaultSku.DefaultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values()]}
> -
> -                # Process VPD map file generated by third party BPDG tool
> -                if NeedProcessVpdMapFile:
> -                    VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
> -                    if os.path.exists(VpdMapFilePath):
> -                        VpdFile.Read(VpdMapFilePath)
> -
> -                        # Fixup TAB_STAR offset
> -                        for pcd in VpdSkuMap:
> -                            vpdinfo = VpdFile.GetVpdInfo(pcd)
> -                            if vpdinfo is None:
> -                                # just pick the a value to determine whether is unicode string type
> -                                continue
> -                            for pcdvalue in VpdSkuMap[pcd]:
> -                                for sku in VpdSkuMap[pcd][pcdvalue]:
> -                                    for item in vpdinfo:
> -                                        if item[2] == pcdvalue:
> -                                            sku.VpdOffset = item[1]
> -                    else:
> -                        EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
> -
> -            # Delete the DynamicPcdList At the last time enter into this function
> -            for Pcd in self._DynamicPcdList:
> -                # just pick the a value to determine whether is unicode string type
> -                Sku = Pcd.SkuInfoList.get(TAB_DEFAULT)
> -                Sku.VpdOffset = Sku.VpdOffset.strip()
> -
> -                if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
> -                    Pcd.DatumType = TAB_VOID
> -
> -                PcdValue = Sku.DefaultValue
> -                if Pcd.DatumType == TAB_VOID and PcdValue.startswith("L"):
> -                    # if found PCD which datum value is unicode string the insert to left size of UnicodeIndex
> -                    UnicodePcdArray.add(Pcd)
> -                elif len(Sku.VariableName) > 0:
> -                    # if found HII type PCD then insert to right of UnicodeIndex
> -                    HiiPcdArray.add(Pcd)
> -                else:
> -                    OtherPcdArray.add(Pcd)
> -            del self._DynamicPcdList[:]
> -        self._DynamicPcdList.extend(list(UnicodePcdArray))
> -        self._DynamicPcdList.extend(list(HiiPcdArray))
> -        self._DynamicPcdList.extend(list(OtherPcdArray))
> -        allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
> -        for pcd in self._DynamicPcdList:
> -            if len(pcd.SkuInfoList) == 1:
> -                for (SkuName, SkuId) in allskuset:
> -                    if isinstance(SkuId, str) and eval(SkuId) == 0 or SkuId == 0:
> -                        continue
> -                    pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
> -                    pcd.SkuInfoList[SkuName].SkuId = SkuId
> -                    pcd.SkuInfoList[SkuName].SkuIdName = SkuName
> -
> -    def FixVpdOffset(self, VpdFile ):
> -        FvPath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
> -        if not os.path.exists(FvPath):
> -            try:
> -                os.makedirs(FvPath)
> -            except:
> -                EdkLogger.error("build", FILE_WRITE_FAILURE, "Fail to create FV folder under %s" % self.BuildDir)
> -
> -        VpdFilePath = os.path.join(FvPath, "%s.txt" % self.Platform.VpdToolGuid)
> -
> -        if VpdFile.Write(VpdFilePath):
> -            # retrieve BPDG tool's path from tool_def.txt according to VPD_TOOL_GUID defined in DSC file.
> -            BPDGToolName = None
> -            for ToolDef in self.ToolDefinition.values():
> -                if TAB_GUID in ToolDef and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
> -                    if "PATH" not in ToolDef:
> -                        EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
> -                    BPDGToolName = ToolDef["PATH"]
> -                    break
> -            # Call third party GUID BPDG tool.
> -            if BPDGToolName is not None:
> -                VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFilePath)
> -            else:
> -                EdkLogger.error("Build", FILE_NOT_FOUND, "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
> -
> -    ## Return the platform build data object
> -    @cached_property
> -    def Platform(self):
> -        return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
> -
> -    ## Return platform name
> -    @cached_property
> -    def Name(self):
> -        return self.Platform.PlatformName
> -
> -    ## Return the meta file GUID
> -    @cached_property
> -    def Guid(self):
> -        return self.Platform.Guid
> -
> -    ## Return the platform version
> -    @cached_property
> -    def Version(self):
> -        return self.Platform.Version
> -
> -    ## Return the FDF file name
> -    @cached_property
> -    def FdfFile(self):
> -        if self.Workspace.FdfFile:
> -            RetVal = mws.join(self.WorkspaceDir, self.Workspace.FdfFile)
> -        else:
> -            RetVal = ''
> -        return RetVal
> -
> -    ## Return the build output directory platform specifies
> -    @cached_property
> -    def OutputDir(self):
> -        return self.Platform.OutputDirectory
> -
> -    ## Return the directory to store all intermediate and final files built
> -    @cached_property
> -    def BuildDir(self):
> -        if os.path.isabs(self.OutputDir):
> -            GlobalData.gBuildDirectory = RetVal = path.join(
> -                path.abspath(self.OutputDir),
> -                self.BuildTarget + "_" + self.ToolChain,
> -            )
> -        else:
> -            GlobalData.gBuildDirectory = RetVal = path.join(
> -                self.WorkspaceDir,
> -                self.OutputDir,
> -                self.BuildTarget + "_" + self.ToolChain,
> -            )
> -        return RetVal
> -
> -    ## Return directory of platform makefile
> -    #
> -    #   @retval     string  Makefile directory
> -    #
> -    @cached_property
> -    def MakeFileDir(self):
> -        return path.join(self.BuildDir, self.Arch)
> -
> -    ## Return build command string
> -    #
> -    #   @retval     string  Build command string
> -    #
> -    @cached_property
> -    def BuildCommand(self):
> -        RetVal = []
> -        if "MAKE" in self.ToolDefinition and "PATH" in self.ToolDefinition["MAKE"]:
> -            RetVal += _SplitOption(self.ToolDefinition["MAKE"]["PATH"])
> -            if "FLAGS" in self.ToolDefinition["MAKE"]:
> -                NewOption = self.ToolDefinition["MAKE"]["FLAGS"].strip()
> -                if NewOption != '':
> -                    RetVal += _SplitOption(NewOption)
> -        if "MAKE" in self.EdkIIBuildOption:
> -            if "FLAGS" in self.EdkIIBuildOption["MAKE"]:
> -                Flags = self.EdkIIBuildOption["MAKE"]["FLAGS"]
> -                if Flags.startswith('='):
> -                    RetVal = [RetVal[0]] + [Flags[1:]]
> -                else:
> -                    RetVal.append(Flags)
> -        return RetVal
> -
> -    ## Get tool chain definition
> -    #
> -    #  Get each tool definition for given tool chain from tools_def.txt and platform
> -    #
> -    @cached_property
> -    def ToolDefinition(self):
> -        ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDictionary
> -        if TAB_TOD_DEFINES_COMMAND_TYPE not in self.Workspace.ToolDef.ToolsDefTxtDatabase:
> -            EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No tools found in configuration",
> -                            ExtraData="[%s]" % self.MetaFile)
> -        RetVal = OrderedDict()
> -        DllPathList = set()
> -        for Def in ToolDefinition:
> -            Target, Tag, Arch, Tool, Attr = Def.split("_")
> -            if Target != self.BuildTarget or Tag != self.ToolChain or Arch != self.Arch:
> -                continue
> -
> -            Value = ToolDefinition[Def]
> -            # don't record the DLL
> -            if Attr == "DLL":
> -                DllPathList.add(Value)
> -                continue
> -
> -            if Tool not in RetVal:
> -                RetVal[Tool] = OrderedDict()
> -            RetVal[Tool][Attr] = Value
> -
> -        ToolsDef = ''
> -        if GlobalData.gOptions.SilentMode and "MAKE" in RetVal:
> -            if "FLAGS" not in RetVal["MAKE"]:
> -                RetVal["MAKE"]["FLAGS"] = ""
> -            RetVal["MAKE"]["FLAGS"] += " -s"
> -        MakeFlags = ''
> -        for Tool in RetVal:
> -            for Attr in RetVal[Tool]:
> -                Value = RetVal[Tool][Attr]
> -                if Tool in self._BuildOptionWithToolDef(RetVal) and Attr in self._BuildOptionWithToolDef(RetVal)[Tool]:
> -                    # check if override is indicated
> -                    if self._BuildOptionWithToolDef(RetVal)[Tool][Attr].startswith('='):
> -                        Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr][1:]
> -                    else:
> -                        if Attr != 'PATH':
> -                            Value += " " + self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
> -                        else:
> -                            Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
> -
> -                if Attr == "PATH":
> -                    # Don't put MAKE definition in the file
> -                    if Tool != "MAKE":
> -                        ToolsDef += "%s = %s\n" % (Tool, Value)
> -                elif Attr != "DLL":
> -                    # Don't put MAKE definition in the file
> -                    if Tool == "MAKE":
> -                        if Attr == "FLAGS":
> -                            MakeFlags = Value
> -                    else:
> -                        ToolsDef += "%s_%s = %s\n" % (Tool, Attr, Value)
> -            ToolsDef += "\n"
> -        tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
> -        SaveFileOnChange(tool_def_file, ToolsDef, False)
> -        for DllPath in DllPathList:
> -            os.environ["PATH"] = DllPath + os.pathsep + os.environ["PATH"]
> -        os.environ["MAKE_FLAGS"] = MakeFlags
> -
> -        return RetVal
> -
> -    ## Return the paths of tools
> -    @cached_property
> -    def ToolDefinitionFile(self):
> -        tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
> -        if not os.path.exists(tool_def_file):
> -            self.ToolDefinition
> -        return tool_def_file
> -
> -    ## Retrieve the toolchain family of given toolchain tag. Default to 'MSFT'.
> -    @cached_property
> -    def ToolChainFamily(self):
> -        ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
> -        if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \
> -           or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
> -           or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]:
> -            EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
> -                              % self.ToolChain)
> -            RetVal = TAB_COMPILER_MSFT
> -        else:
> -            RetVal = ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]
> -        return RetVal
> -
> -    @cached_property
> -    def BuildRuleFamily(self):
> -        ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
> -        if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \
> -           or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY] \
> -           or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]:
> -            EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
> -                              % self.ToolChain)
> -            return TAB_COMPILER_MSFT
> -
> -        return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]
> -
> -    ## Return the build options specific for all modules in this platform
> -    @cached_property
> -    def BuildOption(self):
> -        return self._ExpandBuildOption(self.Platform.BuildOptions)
> -
> -    def _BuildOptionWithToolDef(self, ToolDef):
> -        return self._ExpandBuildOption(self.Platform.BuildOptions, ToolDef=ToolDef)
> -
> -    ## Return the build options specific for EDK modules in this platform
> -    @cached_property
> -    def EdkBuildOption(self):
> -        return self._ExpandBuildOption(self.Platform.BuildOptions, EDK_NAME)
> -
> -    ## Return the build options specific for EDKII modules in this platform
> -    @cached_property
> -    def EdkIIBuildOption(self):
> -        return self._ExpandBuildOption(self.Platform.BuildOptions, EDKII_NAME)
> -
> -    ## Summarize the packages used by modules in this platform
> -    @cached_property
> -    def PackageList(self):
> -        RetVal = set()
> -        for La in self.LibraryAutoGenList:
> -            RetVal.update(La.DependentPackageList)
> -        for Ma in self.ModuleAutoGenList:
> -            RetVal.update(Ma.DependentPackageList)
> -        #Collect package set information from INF of FDF
> -        for ModuleFile in self._AsBuildModuleList:
> -            if ModuleFile in self.Platform.Modules:
> -                continue
> -            ModuleData = self.BuildDatabase[ModuleFile, self.Arch, self.BuildTarget, self.ToolChain]
> -            RetVal.update(ModuleData.Packages)
> -        return list(RetVal)
> -
> -    @cached_property
> -    def NonDynamicPcdDict(self):
> -        return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd in self.NonDynamicPcdList}
> -
> -    ## Get list of non-dynamic PCDs
> -    @property
> -    def NonDynamicPcdList(self):
> -        if not self._NonDynamicPcdList:
> -            self.CollectPlatformDynamicPcds()
> -        return self._NonDynamicPcdList
> -
> -    ## Get list of dynamic PCDs
> -    @property
> -    def DynamicPcdList(self):
> -        if not self._DynamicPcdList:
> -            self.CollectPlatformDynamicPcds()
> -        return self._DynamicPcdList
> -
> -    ## Generate Token Number for all PCD
> -    @cached_property
> -    def PcdTokenNumber(self):
> -        RetVal = OrderedDict()
> -        TokenNumber = 1
> -        #
> -        # Make the Dynamic and DynamicEx PCD use within different TokenNumber area.
> -        # Such as:
> -        #
> -        # Dynamic PCD:
> -        # TokenNumber 0 ~ 10
> -        # DynamicEx PCD:
> -        # TokeNumber 11 ~ 20
> -        #
> -        for Pcd in self.DynamicPcdList:
> -            if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
> -                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
> -                RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
> -                TokenNumber += 1
> -
> -        for Pcd in self.DynamicPcdList:
> -            if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
> -                RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
> -                TokenNumber += 1
> -
> -        for Pcd in self.DynamicPcdList:
> -            if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
> -                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
> -                RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
> -                TokenNumber += 1
> -
> -        for Pcd in self.DynamicPcdList:
> -            if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
> -                RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
> -                TokenNumber += 1
> -
> -        for Pcd in self.NonDynamicPcdList:
> -            RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
> -            TokenNumber += 1
> -        return RetVal
> -
> -    @cached_property
> -    def _MaList(self):
> -        for ModuleFile in self.Platform.Modules:
> -            Ma = ModuleAutoGen(
> -                  self.Workspace,
> -                  ModuleFile,
> -                  self.BuildTarget,
> -                  self.ToolChain,
> -                  self.Arch,
> -                  self.MetaFile
> -                  )
> -            self.Platform.Modules[ModuleFile].M = Ma
> -        return [x.M for x in self.Platform.Modules.values()]
> -
> -    ## Summarize ModuleAutoGen objects of all modules to be built for this platform
> -    @cached_property
> -    def ModuleAutoGenList(self):
> -        RetVal = []
> -        for Ma in self._MaList:
> -            if Ma not in RetVal:
> -                RetVal.append(Ma)
> -        return RetVal
> -
> -    ## Summarize ModuleAutoGen objects of all libraries to be built for this platform
> -    @cached_property
> -    def LibraryAutoGenList(self):
> -        RetVal = []
> -        for Ma in self._MaList:
> -            for La in Ma.LibraryAutoGenList:
> -                if La not in RetVal:
> -                    RetVal.append(La)
> -                if Ma not in La.ReferenceModules:
> -                    La.ReferenceModules.append(Ma)
> -        return RetVal
> -
> -    ## Test if a module is supported by the platform
> -    #
> -    #  An error will be raised directly if the module or its arch is not supported
> -    #  by the platform or current configuration
> -    #
> -    def ValidModule(self, Module):
> -        return Module in self.Platform.Modules or Module in self.Platform.LibraryInstances \
> -            or Module in self._AsBuildModuleList
> -
> -    ## Resolve the library classes in a module to library instances
> -    #
> -    #  This method will not only resolve library classes but also sort the library
> -    #  instances according to the dependency-ship.
> -    #
> -    #  @param  Module      The module from which the library classes will be resolved
> -    #
> -    #  @retval library_list    List of library instances sorted
> -    #
> -    def ApplyLibraryInstance(self, Module):
> -        # Cover the case that the binary INF file is list in the FDF file but not DSC file, return empty list directly
> -        if str(Module) not in self.Platform.Modules:
> -            return []
> -
> -        return GetModuleLibInstances(Module,
> -                                     self.Platform,
> -                                     self.BuildDatabase,
> -                                     self.Arch,
> -                                     self.BuildTarget,
> -                                     self.ToolChain,
> -                                     self.MetaFile,
> -                                     EdkLogger)
> -
> -    ## Override PCD setting (type, value, ...)
> -    #
> -    #  @param  ToPcd       The PCD to be overridden
> -    #  @param  FromPcd     The PCD overriding from
> -    #
> -    def _OverridePcd(self, ToPcd, FromPcd, Module="", Msg="", Library=""):
> -        #
> -        # in case there's PCDs coming from FDF file, which have no type given.
> -        # at this point, ToPcd.Type has the type found from dependent
> -        # package
> -        #
> -        TokenCName = ToPcd.TokenCName
> -        for PcdItem in GlobalData.MixedPcd:
> -            if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
> -                TokenCName = PcdItem[0]
> -                break
> -        if FromPcd is not None:
> -            if ToPcd.Pending and FromPcd.Type:
> -                ToPcd.Type = FromPcd.Type
> -            elif ToPcd.Type and FromPcd.Type\
> -                    and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
> -                if ToPcd.Type.strip() == TAB_PCDS_DYNAMIC_EX:
> -                    ToPcd.Type = FromPcd.Type
> -            elif ToPcd.Type and FromPcd.Type \
> -                    and ToPcd.Type != FromPcd.Type:
> -                if Library:
> -                    Module = str(Module) + " 's library file (" + str(Library) + ")"
> -                EdkLogger.error("build", OPTION_CONFLICT, "Mismatched PCD type",
> -                                ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."\
> -                                          % (ToPcd.TokenSpaceGuidCName, TokenCName,
> -                                             ToPcd.Type, Module, FromPcd.Type, Msg),
> -                                File=self.MetaFile)
> -
> -            if FromPcd.MaxDatumSize:
> -                ToPcd.MaxDatumSize = FromPcd.MaxDatumSize
> -                ToPcd.MaxSizeUserSet = FromPcd.MaxDatumSize
> -            if FromPcd.DefaultValue:
> -                ToPcd.DefaultValue = FromPcd.DefaultValue
> -            if FromPcd.TokenValue:
> -                ToPcd.TokenValue = FromPcd.TokenValue
> -            if FromPcd.DatumType:
> -                ToPcd.DatumType = FromPcd.DatumType
> -            if FromPcd.SkuInfoList:
> -                ToPcd.SkuInfoList = FromPcd.SkuInfoList
> -            if FromPcd.UserDefinedDefaultStoresFlag:
> -                ToPcd.UserDefinedDefaultStoresFlag = FromPcd.UserDefinedDefaultStoresFlag
> -            # Add Flexible PCD format parse
> -            if ToPcd.DefaultValue:
> -                try:
> -                    ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self.Workspace._GuidDict)(True)
> -                except BadExpression as Value:
> -                    EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
> -                                    File=self.MetaFile)
> -
> -            # check the validation of datum
> -            IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
> -            if not IsValid:
> -                EdkLogger.error('build', FORMAT_INVALID, Cause, File=self.MetaFile,
> -                                ExtraData="%s.%s" % (ToPcd.TokenSpaceGuidCName, TokenCName))
> -            ToPcd.validateranges = FromPcd.validateranges
> -            ToPcd.validlists = FromPcd.validlists
> -            ToPcd.expressions = FromPcd.expressions
> -            ToPcd.CustomAttribute = FromPcd.CustomAttribute
> -
> -        if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
> -            EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
> -                            % (ToPcd.TokenSpaceGuidCName, TokenCName))
> -            Value = ToPcd.DefaultValue
> -            if not Value:
> -                ToPcd.MaxDatumSize = '1'
> -            elif Value[0] == 'L':
> -                ToPcd.MaxDatumSize = str((len(Value) - 2) * 2)
> -            elif Value[0] == '{':
> -                ToPcd.MaxDatumSize = str(len(Value.split(',')))
> -            else:
> -                ToPcd.MaxDatumSize = str(len(Value) - 1)
> -
> -        # apply default SKU for dynamic PCDS if specified one is not available
> -        if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
> -                and not ToPcd.SkuInfoList:
> -            if self.Platform.SkuName in self.Platform.SkuIds:
> -                SkuName = self.Platform.SkuName
> -            else:
> -                SkuName = TAB_DEFAULT
> -            ToPcd.SkuInfoList = {
> -                SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
> -            }
> -
> -    ## Apply PCD setting defined platform to a module
> -    #
> -    #  @param  Module  The module from which the PCD setting will be overridden
> -    #
> -    #  @retval PCD_list    The list PCDs with settings from platform
> -    #
> -    def ApplyPcdSetting(self, Module, Pcds, Library=""):
> -        # for each PCD in module
> -        for Name, Guid in Pcds:
> -            PcdInModule = Pcds[Name, Guid]
> -            # find out the PCD setting in platform
> -            if (Name, Guid) in self.Platform.Pcds:
> -                PcdInPlatform = self.Platform.Pcds[Name, Guid]
> -            else:
> -                PcdInPlatform = None
> -            # then override the settings if any
> -            self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
> -            # resolve the VariableGuid value
> -            for SkuId in PcdInModule.SkuInfoList:
> -                Sku = PcdInModule.SkuInfoList[SkuId]
> -                if Sku.VariableGuid == '': continue
> -                Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
> -                if Sku.VariableGuidValue is None:
> -                    PackageList = "\n\t".join(str(P) for P in self.PackageList)
> -                    EdkLogger.error(
> -                        'build',
> -                        RESOURCE_NOT_AVAILABLE,
> -                        "Value of GUID [%s] is not found in" % Sku.VariableGuid,
> -                        ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
> -                                  % (Guid, Name, str(Module)),
> -                        File=self.MetaFile
> -                    )
> -
> -        # override PCD settings with module specific setting
> -        if Module in self.Platform.Modules:
> -            PlatformModule = self.Platform.Modules[str(Module)]
> -            for Key in PlatformModule.Pcds:
> -                if GlobalData.BuildOptionPcd:
> -                    for pcd in GlobalData.BuildOptionPcd:
> -                        (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
> -                        if (TokenCName, TokenSpaceGuidCName) == Key and FieldName == "":
> -                            PlatformModule.Pcds[Key].DefaultValue = pcdvalue
> -                            PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
> -                            break
> -                Flag = False
> -                if Key in Pcds:
> -                    ToPcd = Pcds[Key]
> -                    Flag = True
> -                elif Key in GlobalData.MixedPcd:
> -                    for PcdItem in GlobalData.MixedPcd[Key]:
> -                        if PcdItem in Pcds:
> -                            ToPcd = Pcds[PcdItem]
> -                            Flag = True
> -                            break
> -                if Flag:
> -                    self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
> -        # use PCD value to calculate the MaxDatumSize when it is not specified
> -        for Name, Guid in Pcds:
> -            Pcd = Pcds[Name, Guid]
> -            if Pcd.DatumType == TAB_VOID and not Pcd.MaxDatumSize:
> -                Pcd.MaxSizeUserSet = None
> -                Value = Pcd.DefaultValue
> -                if not Value:
> -                    Pcd.MaxDatumSize = '1'
> -                elif Value[0] == 'L':
> -                    Pcd.MaxDatumSize = str((len(Value) - 2) * 2)
> -                elif Value[0] == '{':
> -                    Pcd.MaxDatumSize = str(len(Value.split(',')))
> -                else:
> -                    Pcd.MaxDatumSize = str(len(Value) - 1)
> -        return list(Pcds.values())
> -
> -
> -
> -    ## Calculate the priority value of the build option
> -    #
> -    # @param    Key    Build option definition contain: TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
> -    #
> -    # @retval   Value  Priority value based on the priority list.
> -    #
> -    def CalculatePriorityValue(self, Key):
> -        Target, ToolChain, Arch, CommandType, Attr = Key.split('_')
> -        PriorityValue = 0x11111
> -        if Target == TAB_STAR:
> -            PriorityValue &= 0x01111
> -        if ToolChain == TAB_STAR:
> -            PriorityValue &= 0x10111
> -        if Arch == TAB_STAR:
> -            PriorityValue &= 0x11011
> -        if CommandType == TAB_STAR:
> -            PriorityValue &= 0x11101
> -        if Attr == TAB_STAR:
> -            PriorityValue &= 0x11110
> -
> -        return self.PrioList["0x%0.5x" % PriorityValue]
> -
> -
> -    ## Expand * in build option key
> -    #
> -    # @param  Options     Options to be expanded
> -    # @param  ToolDef     Use specified ToolDef instead of full version.
> -    #                     This is needed during initialization to prevent
> -    #                     infinite recursion betweeh BuildOptions,
> -    #                     ToolDefinition, and this function.
> -    #
> -    # @retval options     Options expanded
> -    #
> -    def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
> -        if not ToolDef:
> -            ToolDef = self.ToolDefinition
> -        BuildOptions = {}
> -        FamilyMatch  = False
> -        FamilyIsNull = True
> -
> -        OverrideList = {}
> -        #
> -        # Construct a list contain the build options which need override.
> -        #
> -        for Key in Options:
> -            #
> -            # Key[0] -- tool family
> -            # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
> -            #
> -            if (Key[0] == self.BuildRuleFamily and
> -                (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
> -                Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
> -                if (Target == self.BuildTarget or Target == TAB_STAR) and\
> -                    (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
> -                    (Arch == self.Arch or Arch == TAB_STAR) and\
> -                    Options[Key].startswith("="):
> -
> -                    if OverrideList.get(Key[1]) is not None:
> -                        OverrideList.pop(Key[1])
> -                    OverrideList[Key[1]] = Options[Key]
> -
> -        #
> -        # Use the highest priority value.
> -        #
> -        if (len(OverrideList) >= 2):
> -            KeyList = list(OverrideList.keys())
> -            for Index in range(len(KeyList)):
> -                NowKey = KeyList[Index]
> -                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
> -                for Index1 in range(len(KeyList) - Index - 1):
> -                    NextKey = KeyList[Index1 + Index + 1]
> -                    #
> -                    # Compare two Key, if one is included by another, choose the higher priority one
> -                    #
> -                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
> -                    if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
> -                        (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
> -                        (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
> -                        (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
> -                        (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
> -
> -                        if self.CalculatePriorityValue(NowKey) > self.CalculatePriorityValue(NextKey):
> -                            if Options.get((self.BuildRuleFamily, NextKey)) is not None:
> -                                Options.pop((self.BuildRuleFamily, NextKey))
> -                        else:
> -                            if Options.get((self.BuildRuleFamily, NowKey)) is not None:
> -                                Options.pop((self.BuildRuleFamily, NowKey))
> -
> -        for Key in Options:
> -            if ModuleStyle is not None and len (Key) > 2:
> -                # Check Module style is EDK or EDKII.
> -                # Only append build option for the matched style module.
> -                if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
> -                    continue
> -                elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
> -                    continue
> -            Family = Key[0]
> -            Target, Tag, Arch, Tool, Attr = Key[1].split("_")
> -            # if tool chain family doesn't match, skip it
> -            if Tool in ToolDef and Family != "":
> -                FamilyIsNull = False
> -                if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY, "") != "":
> -                    if Family != ToolDef[Tool][TAB_TOD_DEFINES_BUILDRULEFAMILY]:
> -                        continue
> -                elif Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
> -                    continue
> -                FamilyMatch = True
> -            # expand any wildcard
> -            if Target == TAB_STAR or Target == self.BuildTarget:
> -                if Tag == TAB_STAR or Tag == self.ToolChain:
> -                    if Arch == TAB_STAR or Arch == self.Arch:
> -                        if Tool not in BuildOptions:
> -                            BuildOptions[Tool] = {}
> -                        if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
> -                            BuildOptions[Tool][Attr] = Options[Key]
> -                        else:
> -                            # append options for the same tool except PATH
> -                            if Attr != 'PATH':
> -                                BuildOptions[Tool][Attr] += " " + Options[Key]
> -                            else:
> -                                BuildOptions[Tool][Attr] = Options[Key]
> -        # Build Option Family has been checked, which need't to be checked again for family.
> -        if FamilyMatch or FamilyIsNull:
> -            return BuildOptions
> -
> -        for Key in Options:
> -            if ModuleStyle is not None and len (Key) > 2:
> -                # Check Module style is EDK or EDKII.
> -                # Only append build option for the matched style module.
> -                if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
> -                    continue
> -                elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
> -                    continue
> -            Family = Key[0]
> -            Target, Tag, Arch, Tool, Attr = Key[1].split("_")
> -            # if tool chain family doesn't match, skip it
> -            if Tool not in ToolDef or Family == "":
> -                continue
> -            # option has been added before
> -            if Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
> -                continue
> -
> -            # expand any wildcard
> -            if Target == TAB_STAR or Target == self.BuildTarget:
> -                if Tag == TAB_STAR or Tag == self.ToolChain:
> -                    if Arch == TAB_STAR or Arch == self.Arch:
> -                        if Tool not in BuildOptions:
> -                            BuildOptions[Tool] = {}
> -                        if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
> -                            BuildOptions[Tool][Attr] = Options[Key]
> -                        else:
> -                            # append options for the same tool except PATH
> -                            if Attr != 'PATH':
> -                                BuildOptions[Tool][Attr] += " " + Options[Key]
> -                            else:
> -                                BuildOptions[Tool][Attr] = Options[Key]
> -        return BuildOptions
> -    def GetGlobalBuildOptions(self,Module):
> -        ModuleTypeOptions = self.Platform.GetBuildOptionsByPkg(Module, Module.ModuleType)
> -        ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
> -        if Module in self.Platform.Modules:
> -            PlatformModule = self.Platform.Modules[str(Module)]
> -            PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
> -        else:
> -            PlatformModuleOptions = {}
> -        return ModuleTypeOptions, PlatformModuleOptions
> -    ## Append build options in platform to a module
> -    #
> -    # @param  Module  The module to which the build options will be appended
> -    #
> -    # @retval options     The options appended with build options in platform
> -    #
> -    def ApplyBuildOption(self, Module):
> -        # Get the different options for the different style module
> -        PlatformOptions = self.EdkIIBuildOption
> -        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
> -        ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
> -        ModuleOptions = self._ExpandBuildOption(Module.BuildOptions)
> -        if Module in self.Platform.Modules:
> -            PlatformModule = self.Platform.Modules[str(Module)]
> -            PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
> -        else:
> -            PlatformModuleOptions = {}
> -
> -        BuildRuleOrder = None
> -        for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
> -            for Tool in Options:
> -                for Attr in Options[Tool]:
> -                    if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
> -                        BuildRuleOrder = Options[Tool][Attr]
> -
> -        AllTools = set(list(ModuleOptions.keys()) + list(PlatformOptions.keys()) +
> -                       list(PlatformModuleOptions.keys()) + list(ModuleTypeOptions.keys()) +
> -                       list(self.ToolDefinition.keys()))
> -        BuildOptions = defaultdict(lambda: defaultdict(str))
> -        for Tool in AllTools:
> -            for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
> -                if Tool not in Options:
> -                    continue
> -                for Attr in Options[Tool]:
> -                    #
> -                    # Do not generate it in Makefile
> -                    #
> -                    if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
> -                        continue
> -                    Value = Options[Tool][Attr]
> -                    # check if override is indicated
> -                    if Value.startswith('='):
> -                        BuildOptions[Tool][Attr] = mws.handleWsMacro(Value[1:])
> -                    else:
> -                        if Attr != 'PATH':
> -                            BuildOptions[Tool][Attr] += " " + mws.handleWsMacro(Value)
> -                        else:
> -                            BuildOptions[Tool][Attr] = mws.handleWsMacro(Value)
> -
> -        return BuildOptions, BuildRuleOrder
> -
> -#
> -# extend lists contained in a dictionary with lists stored in another dictionary
> -# if CopyToDict is not derived from DefaultDict(list) then this may raise exception
> -#
> -def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
> -    for Key in CopyFromDict:
> -        CopyToDict[Key].extend(CopyFromDict[Key])
> -
> -# Create a directory specified by a set of path elements and return the full path
> -def _MakeDir(PathList):
> -    RetVal = path.join(*PathList)
> -    CreateDirectory(RetVal)
> -    return RetVal
> -
> -## ModuleAutoGen class
> -#
> -# This class encapsules the AutoGen behaviors for the build tools. In addition to
> -# the generation of AutoGen.h and AutoGen.c, it will generate *.depex file according
> -# to the [depex] section in module's inf file.
> -#
> -class ModuleAutoGen(AutoGen):
> -    # call super().__init__ then call the worker function with different parameter count
> -    def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> -        if not hasattr(self, "_Init"):
> -            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args)
> -            self._Init = True
> -
> -    ## Cache the timestamps of metafiles of every module in a class attribute
> -    #
> -    TimeDict = {}
> -
> -    def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> -        # check if this module is employed by active platform
> -        if not PlatformAutoGen(Workspace, args[0], Target, Toolchain, Arch).ValidModule(MetaFile):
> -            EdkLogger.verbose("Module [%s] for [%s] is not employed by active platform\n" \
> -                              % (MetaFile, Arch))
> -            return None
> -        return super(ModuleAutoGen, cls).__new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
> -
> -    ## Initialize ModuleAutoGen
> -    #
> -    # @param      Workspace           EdkIIWorkspaceBuild object
> -    # @param      ModuleFile          The path of module file
> -    # @param      Target              Build target (DEBUG, RELEASE)
> -    # @param      Toolchain           Name of tool chain
> -    # @param      Arch                The arch the module supports
> -    # @param      PlatformFile        Platform meta-file
> -    #
> -    def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, Arch, PlatformFile):
> -        EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]" % (ModuleFile, Arch))
> -        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (ModuleFile, Arch, Toolchain, Target)
> -
> -        self.Workspace = Workspace
> -        self.WorkspaceDir = Workspace.WorkspaceDir
> -        self.MetaFile = ModuleFile
> -        self.PlatformInfo = PlatformAutoGen(Workspace, PlatformFile, Target, Toolchain, Arch)
> -
> -        self.SourceDir = self.MetaFile.SubDir
> -        self.SourceDir = mws.relpath(self.SourceDir, self.WorkspaceDir)
> -
> -        self.ToolChain = Toolchain
> -        self.BuildTarget = Target
> -        self.Arch = Arch
> -        self.ToolChainFamily = self.PlatformInfo.ToolChainFamily
> -        self.BuildRuleFamily = self.PlatformInfo.BuildRuleFamily
> -
> -        self.IsCodeFileCreated = False
> -        self.IsAsBuiltInfCreated = False
> -        self.DepexGenerated = False
> -
> -        self.BuildDatabase = self.Workspace.BuildDatabase
> -        self.BuildRuleOrder = None
> -        self.BuildTime = 0
> -
> -        self._PcdComments = OrderedListDict()
> -        self._GuidComments = OrderedListDict()
> -        self._ProtocolComments = OrderedListDict()
> -        self._PpiComments = OrderedListDict()
> -        self._BuildTargets = None
> -        self._IntroBuildTargetList = None
> -        self._FinalBuildTargetList = None
> -        self._FileTypes = None
> -
> -        self.AutoGenDepSet = set()
> -        self.ReferenceModules = []
> -        self.ConstPcd = {}
> -
> -    ## hash() operator of ModuleAutoGen
> -    #
> -    #  The module file path and arch string will be used to represent
> -    #  hash value of this object
> -    #
> -    #   @retval   int Hash value of the module file path and arch
> -    #
> -    @cached_class_function
> -    def __hash__(self):
> -        return hash((self.MetaFile, self.Arch))
> -
> -    def __repr__(self):
> -        return "%s [%s]" % (self.MetaFile, self.Arch)
> -
> -    # Get FixedAtBuild Pcds of this Module
> -    @cached_property
> -    def FixedAtBuildPcds(self):
> -        RetVal = []
> -        for Pcd in self.ModulePcdList:
> -            if Pcd.Type != TAB_PCDS_FIXED_AT_BUILD:
> -                continue
> -            if Pcd not in RetVal:
> -                RetVal.append(Pcd)
> -        return RetVal
> -
> -    @cached_property
> -    def FixedVoidTypePcds(self):
> -        RetVal = {}
> -        for Pcd in self.FixedAtBuildPcds:
> -            if Pcd.DatumType == TAB_VOID:
> -                if '{}.{}'.format(Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in RetVal:
> -                    RetVal['{}.{}'.format(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)] = Pcd.DefaultValue
> -        return RetVal
> -
> -    @property
> -    def UniqueBaseName(self):
> -        BaseName = self.Name
> -        for Module in self.PlatformInfo.ModuleAutoGenList:
> -            if Module.MetaFile == self.MetaFile:
> -                continue
> -            if Module.Name == self.Name:
> -                if uuid.UUID(Module.Guid) == uuid.UUID(self.Guid):
> -                    EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
> -                                    '  %s\n  %s' % (Module.MetaFile, self.MetaFile))
> -                BaseName = '%s_%s' % (self.Name, self.Guid)
> -        return BaseName
> -
> -    # Macros could be used in build_rule.txt (also Makefile)
> -    @cached_property
> -    def Macros(self):
> -        return OrderedDict((
> -            ("WORKSPACE" ,self.WorkspaceDir),
> -            ("MODULE_NAME" ,self.Name),
> -            ("MODULE_NAME_GUID" ,self.UniqueBaseName),
> -            ("MODULE_GUID" ,self.Guid),
> -            ("MODULE_VERSION" ,self.Version),
> -            ("MODULE_TYPE" ,self.ModuleType),
> -            ("MODULE_FILE" ,str(self.MetaFile)),
> -            ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName),
> -            ("MODULE_RELATIVE_DIR" ,self.SourceDir),
> -            ("MODULE_DIR" ,self.SourceDir),
> -            ("BASE_NAME" ,self.Name),
> -            ("ARCH" ,self.Arch),
> -            ("TOOLCHAIN" ,self.ToolChain),
> -            ("TOOLCHAIN_TAG" ,self.ToolChain),
> -            ("TOOL_CHAIN_TAG" ,self.ToolChain),
> -            ("TARGET" ,self.BuildTarget),
> -            ("BUILD_DIR" ,self.PlatformInfo.BuildDir),
> -            ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
> -            ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
> -            ("MODULE_BUILD_DIR" ,self.BuildDir),
> -            ("OUTPUT_DIR" ,self.OutputDir),
> -            ("DEBUG_DIR" ,self.DebugDir),
> -            ("DEST_DIR_OUTPUT" ,self.OutputDir),
> -            ("DEST_DIR_DEBUG" ,self.DebugDir),
> -            ("PLATFORM_NAME" ,self.PlatformInfo.Name),
> -            ("PLATFORM_GUID" ,self.PlatformInfo.Guid),
> -            ("PLATFORM_VERSION" ,self.PlatformInfo.Version),
> -            ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir),
> -            ("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, self.PlatformInfo.SourceDir)),
> -            ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir),
> -            ("FFS_OUTPUT_DIR" ,self.FfsOutputDir)
> -        ))
> -
> -    ## Return the module build data object
> -    @cached_property
> -    def Module(self):
> -        return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
> -
> -    ## Return the module name
> -    @cached_property
> -    def Name(self):
> -        return self.Module.BaseName
> -
> -    ## Return the module DxsFile if exist
> -    @cached_property
> -    def DxsFile(self):
> -        return self.Module.DxsFile
> -
> -    ## Return the module meta-file GUID
> -    @cached_property
> -    def Guid(self):
> -        #
> -        # To build same module more than once, the module path with FILE_GUID overridden has
> -        # the file name FILE_GUIDmodule.inf, but the relative path (self.MetaFile.File) is the real path
> -        # in DSC. The overridden GUID can be retrieved from file name
> -        #
> -        if os.path.basename(self.MetaFile.File) != os.path.basename(self.MetaFile.Path):
> -            #
> -            # Length of GUID is 36
> -            #
> -            return os.path.basename(self.MetaFile.Path)[:36]
> -        return self.Module.Guid
> -
> -    ## Return the module version
> -    @cached_property
> -    def Version(self):
> -        return self.Module.Version
> -
> -    ## Return the module type
> -    @cached_property
> -    def ModuleType(self):
> -        return self.Module.ModuleType
> -
> -    ## Return the component type (for Edk.x style of module)
> -    @cached_property
> -    def ComponentType(self):
> -        return self.Module.ComponentType
> -
> -    ## Return the build type
> -    @cached_property
> -    def BuildType(self):
> -        return self.Module.BuildType
> -
> -    ## Return the PCD_IS_DRIVER setting
> -    @cached_property
> -    def PcdIsDriver(self):
> -        return self.Module.PcdIsDriver
> -
> -    ## Return the autogen version, i.e. module meta-file version
> -    @cached_property
> -    def AutoGenVersion(self):
> -        return self.Module.AutoGenVersion
> -
> -    ## Check if the module is library or not
> -    @cached_property
> -    def IsLibrary(self):
> -        return bool(self.Module.LibraryClass)
> -
> -    ## Check if the module is binary module or not
> -    @cached_property
> -    def IsBinaryModule(self):
> -        return self.Module.IsBinaryModule
> -
> -    ## Return the directory to store intermediate files of the module
> -    @cached_property
> -    def BuildDir(self):
> -        return _MakeDir((
> -            self.PlatformInfo.BuildDir,
> -            self.Arch,
> -            self.SourceDir,
> -            self.MetaFile.BaseName
> -            ))
> -
> -    ## Return the directory to store the intermediate object files of the module
> -    @cached_property
> -    def OutputDir(self):
> -        return _MakeDir((self.BuildDir, "OUTPUT"))
> -
> -    ## Return the directory path to store ffs file
> -    @cached_property
> -    def FfsOutputDir(self):
> -        if GlobalData.gFdfParser:
> -            return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
> -        return ''
> -
> -    ## Return the directory to store auto-gened source files of the module
> -    @cached_property
> -    def DebugDir(self):
> -        return _MakeDir((self.BuildDir, "DEBUG"))
> -
> -    ## Return the path of custom file
> -    @cached_property
> -    def CustomMakefile(self):
> -        RetVal = {}
> -        for Type in self.Module.CustomMakefile:
> -            MakeType = gMakeTypeMap[Type] if Type in gMakeTypeMap else 'nmake'
> -            File = os.path.join(self.SourceDir, self.Module.CustomMakefile[Type])
> -            RetVal[MakeType] = File
> -        return RetVal
> -
> -    ## Return the directory of the makefile
> -    #
> -    #   @retval     string  The directory string of module's makefile
> -    #
> -    @cached_property
> -    def MakeFileDir(self):
> -        return self.BuildDir
> -
> -    ## Return build command string
> -    #
> -    #   @retval     string  Build command string
> -    #
> -    @cached_property
> -    def BuildCommand(self):
> -        return self.PlatformInfo.BuildCommand
> -
> -    ## Get object list of all packages the module and its dependent libraries belong to
> -    #
> -    #   @retval     list    The list of package object
> -    #
> -    @cached_property
> -    def DerivedPackageList(self):
> -        PackageList = []
> -        for M in [self.Module] + self.DependentLibraryList:
> -            for Package in M.Packages:
> -                if Package in PackageList:
> -                    continue
> -                PackageList.append(Package)
> -        return PackageList
> -
> -    ## Get the depex string
> -    #
> -    # @return : a string contain all depex expression.
> -    def _GetDepexExpresionString(self):
> -        DepexStr = ''
> -        DepexList = []
> -        ## DPX_SOURCE IN Define section.
> -        if self.Module.DxsFile:
> -            return DepexStr
> -        for M in [self.Module] + self.DependentLibraryList:
> -            Filename = M.MetaFile.Path
> -            InfObj = InfSectionParser.InfSectionParser(Filename)
> -            DepexExpressionList = InfObj.GetDepexExpresionList()
> -            for DepexExpression in DepexExpressionList:
> -                for key in DepexExpression:
> -                    Arch, ModuleType = key
> -                    DepexExpr = [x for x in DepexExpression[key] if not str(x).startswith('#')]
> -                    # the type of build module is USER_DEFINED.
> -                    # All different DEPEX section tags would be copied into the As Built INF file
> -                    # and there would be separate DEPEX section tags
> -                    if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
> -                        if (Arch.upper() == self.Arch.upper()) and (ModuleType.upper() != TAB_ARCH_COMMON):
> -                            DepexList.append({(Arch, ModuleType): DepexExpr})
> -                    else:
> -                        if Arch.upper() == TAB_ARCH_COMMON or \
> -                          (Arch.upper() == self.Arch.upper() and \
> -                          ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
> -                            DepexList.append({(Arch, ModuleType): DepexExpr})
> -
> -        #the type of build module is USER_DEFINED.
> -        if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
> -            for Depex in DepexList:
> -                for key in Depex:
> -                    DepexStr += '[Depex.%s.%s]\n' % key
> -                    DepexStr += '\n'.join('# '+ val for val in Depex[key])
> -                    DepexStr += '\n\n'
> -            if not DepexStr:
> -                return '[Depex.%s]\n' % self.Arch
> -            return DepexStr
> -
> -        #the type of build module not is USER_DEFINED.
> -        Count = 0
> -        for Depex in DepexList:
> -            Count += 1
> -            if DepexStr != '':
> -                DepexStr += ' AND '
> -            DepexStr += '('
> -            for D in Depex.values():
> -                DepexStr += ' '.join(val for val in D)
> -            Index = DepexStr.find('END')
> -            if Index > -1 and Index == len(DepexStr) - 3:
> -                DepexStr = DepexStr[:-3]
> -            DepexStr = DepexStr.strip()
> -            DepexStr += ')'
> -        if Count == 1:
> -            DepexStr = DepexStr.lstrip('(').rstrip(')').strip()
> -        if not DepexStr:
> -            return '[Depex.%s]\n' % self.Arch
> -        return '[Depex.%s]\n# ' % self.Arch + DepexStr
> -
> -    ## Merge dependency expression
> -    #
> -    #   @retval     list    The token list of the dependency expression after parsed
> -    #
> -    @cached_property
> -    def DepexList(self):
> -        if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
> -            return {}
> -
> -        DepexList = []
> -        #
> -        # Append depex from dependent libraries, if not "BEFORE", "AFTER" expression
> -        #
> -        for M in [self.Module] + self.DependentLibraryList:
> -            Inherited = False
> -            for D in M.Depex[self.Arch, self.ModuleType]:
> -                if DepexList != []:
> -                    DepexList.append('AND')
> -                DepexList.append('(')
> -                #replace D with value if D is FixedAtBuild PCD
> -                NewList = []
> -                for item in D:
> -                    if '.' not in item:
> -                        NewList.append(item)
> -                    else:
> -                        FixedVoidTypePcds = {}
> -                        if item in self.FixedVoidTypePcds:
> -                            FixedVoidTypePcds = self.FixedVoidTypePcds
> -                        elif M in self.PlatformInfo.LibraryAutoGenList:
> -                            Index = self.PlatformInfo.LibraryAutoGenList.index(M)
> -                            FixedVoidTypePcds = self.PlatformInfo.LibraryAutoGenList[Index].FixedVoidTypePcds
> -                        if item not in FixedVoidTypePcds:
> -                            EdkLogger.error("build", FORMAT_INVALID, "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type in the module.".format(item))
> -                        else:
> -                            Value = FixedVoidTypePcds[item]
> -                            if len(Value.split(',')) != 16:
> -                                EdkLogger.error("build", FORMAT_INVALID,
> -                                                "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type and 16 bytes in the module.".format(item))
> -                            NewList.append(Value)
> -                DepexList.extend(NewList)
> -                if DepexList[-1] == 'END': # no need of a END at this time
> -                    DepexList.pop()
> -                DepexList.append(')')
> -                Inherited = True
> -            if Inherited:
> -                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexList))
> -            if 'BEFORE' in DepexList or 'AFTER' in DepexList:
> -                break
> -        if len(DepexList) > 0:
> -            EdkLogger.verbose('')
> -        return {self.ModuleType:DepexList}
> -
> -    ## Merge dependency expression
> -    #
> -    #   @retval     list    The token list of the dependency expression after parsed
> -    #
> -    @cached_property
> -    def DepexExpressionDict(self):
> -        if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
> -            return {}
> -
> -        DepexExpressionString = ''
> -        #
> -        # Append depex from dependent libraries, if not "BEFORE", "AFTER" expresion
> -        #
> -        for M in [self.Module] + self.DependentLibraryList:
> -            Inherited = False
> -            for D in M.DepexExpression[self.Arch, self.ModuleType]:
> -                if DepexExpressionString != '':
> -                    DepexExpressionString += ' AND '
> -                DepexExpressionString += '('
> -                DepexExpressionString += D
> -                DepexExpressionString = DepexExpressionString.rstrip('END').strip()
> -                DepexExpressionString += ')'
> -                Inherited = True
> -            if Inherited:
> -                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexExpressionString))
> -            if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexExpressionString:
> -                break
> -        if len(DepexExpressionString) > 0:
> -            EdkLogger.verbose('')
> -
> -        return {self.ModuleType:DepexExpressionString}
> -
> -    # Get the tiano core user extension, it is contain dependent library.
> -    # @retval: a list contain tiano core userextension.
> -    #
> -    def _GetTianoCoreUserExtensionList(self):
> -        TianoCoreUserExtentionList = []
> -        for M in [self.Module] + self.DependentLibraryList:
> -            Filename = M.MetaFile.Path
> -            InfObj = InfSectionParser.InfSectionParser(Filename)
> -            TianoCoreUserExtenList = InfObj.GetUserExtensionTianoCore()
> -            for TianoCoreUserExtent in TianoCoreUserExtenList:
> -                for Section in TianoCoreUserExtent:
> -                    ItemList = Section.split(TAB_SPLIT)
> -                    Arch = self.Arch
> -                    if len(ItemList) == 4:
> -                        Arch = ItemList[3]
> -                    if Arch.upper() == TAB_ARCH_COMMON or Arch.upper() == self.Arch.upper():
> -                        TianoCoreList = []
> -                        TianoCoreList.extend([TAB_SECTION_START + Section + TAB_SECTION_END])
> -                        TianoCoreList.extend(TianoCoreUserExtent[Section][:])
> -                        TianoCoreList.append('\n')
> -                        TianoCoreUserExtentionList.append(TianoCoreList)
> -
> -        return TianoCoreUserExtentionList
> -
> -    ## Return the list of specification version required for the module
> -    #
> -    #   @retval     list    The list of specification defined in module file
> -    #
> -    @cached_property
> -    def Specification(self):
> -        return self.Module.Specification
> -
> -    ## Tool option for the module build
> -    #
> -    #   @param      PlatformInfo    The object of PlatformBuildInfo
> -    #   @retval     dict            The dict containing valid options
> -    #
> -    @cached_property
> -    def BuildOption(self):
> -        RetVal, self.BuildRuleOrder = self.PlatformInfo.ApplyBuildOption(self.Module)
> -        if self.BuildRuleOrder:
> -            self.BuildRuleOrder = ['.%s' % Ext for Ext in self.BuildRuleOrder.split()]
> -        return RetVal
> -
> -    ## Get include path list from tool option for the module build
> -    #
> -    #   @retval     list            The include path list
> -    #
> -    @cached_property
> -    def BuildOptionIncPathList(self):
> -        #
> -        # Regular expression for finding Include Directories, the difference between MSFT and INTEL/GCC/RVCT
> -        # is the former use /I , the Latter used -I to specify include directories
> -        #
> -        if self.PlatformInfo.ToolChainFamily in (TAB_COMPILER_MSFT):
> -            BuildOptIncludeRegEx = gBuildOptIncludePatternMsft
> -        elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'RVCT'):
> -            BuildOptIncludeRegEx = gBuildOptIncludePatternOther
> -        else:
> -            #
> -            # New ToolChainFamily, don't known whether there is option to specify include directories
> -            #
> -            return []
> -
> -        RetVal = []
> -        for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
> -            try:
> -                FlagOption = self.BuildOption[Tool]['FLAGS']
> -            except KeyError:
> -                FlagOption = ''
> -
> -            if self.ToolChainFamily != 'RVCT':
> -                IncPathList = [NormPath(Path, self.Macros) for Path in BuildOptIncludeRegEx.findall(FlagOption)]
> -            else:
> -                #
> -                # RVCT may specify a list of directory separated by commas
> -                #
> -                IncPathList = []
> -                for Path in BuildOptIncludeRegEx.findall(FlagOption):
> -                    PathList = GetSplitList(Path, TAB_COMMA_SPLIT)
> -                    IncPathList.extend(NormPath(PathEntry, self.Macros) for PathEntry in PathList)
> -
> -            #
> -            # EDK II modules must not reference header files outside of the packages they depend on or
> -            # within the module's directory tree. Report error if violation.
> -            #
> -            if GlobalData.gDisableIncludePathCheck == False:
> -                for Path in IncPathList:
> -                    if (Path not in self.IncludePathList) and (CommonPath([Path, self.MetaFile.Dir]) != self.MetaFile.Dir):
> -                        ErrMsg = "The include directory for the EDK II module in this line is invalid %s specified in %s FLAGS '%s'" % (Path, Tool, FlagOption)
> -                        EdkLogger.error("build",
> -                                        PARAMETER_INVALID,
> -                                        ExtraData=ErrMsg,
> -                                        File=str(self.MetaFile))
> -            RetVal += IncPathList
> -        return RetVal
> -
> -    ## Return a list of files which can be built from source
> -    #
> -    #  What kind of files can be built is determined by build rules in
> -    #  $(CONF_DIRECTORY)/build_rule.txt and toolchain family.
> -    #
> -    @cached_property
> -    def SourceFileList(self):
> -        RetVal = []
> -        ToolChainTagSet = {"", TAB_STAR, self.ToolChain}
> -        ToolChainFamilySet = {"", TAB_STAR, self.ToolChainFamily, self.BuildRuleFamily}
> -        for F in self.Module.Sources:
> -            # match tool chain
> -            if F.TagName not in ToolChainTagSet:
> -                EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s] for processing file [%s] is found, "
> -                                "but [%s] is currently used" % (F.TagName, str(F), self.ToolChain))
> -                continue
> -            # match tool chain family or build rule family
> -            if F.ToolChainFamily not in ToolChainFamilySet:
> -                EdkLogger.debug(
> -                            EdkLogger.DEBUG_0,
> -                            "The file [%s] must be built by tools of [%s], " \
> -                            "but current toolchain family is [%s], buildrule family is [%s]" \
> -                                % (str(F), F.ToolChainFamily, self.ToolChainFamily, self.BuildRuleFamily))
> -                continue
> -
> -            # add the file path into search path list for file including
> -            if F.Dir not in self.IncludePathList:
> -                self.IncludePathList.insert(0, F.Dir)
> -            RetVal.append(F)
> -
> -        self._MatchBuildRuleOrder(RetVal)
> -
> -        for F in RetVal:
> -            self._ApplyBuildRule(F, TAB_UNKNOWN_FILE)
> -        return RetVal
> -
> -    def _MatchBuildRuleOrder(self, FileList):
> -        Order_Dict = {}
> -        self.BuildOption
> -        for SingleFile in FileList:
> -            if self.BuildRuleOrder and SingleFile.Ext in self.BuildRuleOrder and SingleFile.Ext in self.BuildRules:
> -                key = SingleFile.Path.rsplit(SingleFile.Ext,1)[0]
> -                if key in Order_Dict:
> -                    Order_Dict[key].append(SingleFile.Ext)
> -                else:
> -                    Order_Dict[key] = [SingleFile.Ext]
> -
> -        RemoveList = []
> -        for F in Order_Dict:
> -            if len(Order_Dict[F]) > 1:
> -                Order_Dict[F].sort(key=lambda i: self.BuildRuleOrder.index(i))
> -                for Ext in Order_Dict[F][1:]:
> -                    RemoveList.append(F + Ext)
> -
> -        for item in RemoveList:
> -            FileList.remove(item)
> -
> -        return FileList
> -
> -    ## Return the list of unicode files
> -    @cached_property
> -    def UnicodeFileList(self):
> -        return self.FileTypes.get(TAB_UNICODE_FILE,[])
> -
> -    ## Return the list of vfr files
> -    @cached_property
> -    def VfrFileList(self):
> -        return self.FileTypes.get(TAB_VFR_FILE, [])
> -
> -    ## Return the list of Image Definition files
> -    @cached_property
> -    def IdfFileList(self):
> -        return self.FileTypes.get(TAB_IMAGE_FILE,[])
> -
> -    ## Return a list of files which can be built from binary
> -    #
> -    #  "Build" binary files are just to copy them to build directory.
> -    #
> -    #   @retval     list            The list of files which can be built later
> -    #
> -    @cached_property
> -    def BinaryFileList(self):
> -        RetVal = []
> -        for F in self.Module.Binaries:
> -            if F.Target not in [TAB_ARCH_COMMON, TAB_STAR] and F.Target != self.BuildTarget:
> -                continue
> -            RetVal.append(F)
> -            self._ApplyBuildRule(F, F.Type, BinaryFileList=RetVal)
> -        return RetVal
> -
> -    @cached_property
> -    def BuildRules(self):
> -        RetVal = {}
> -        BuildRuleDatabase = BuildRule
> -        for Type in BuildRuleDatabase.FileTypeList:
> -            #first try getting build rule by BuildRuleFamily
> -            RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.BuildRuleFamily]
> -            if not RuleObject:
> -                # build type is always module type, but ...
> -                if self.ModuleType != self.BuildType:
> -                    RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.BuildRuleFamily]
> -            #second try getting build rule by ToolChainFamily
> -            if not RuleObject:
> -                RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.ToolChainFamily]
> -                if not RuleObject:
> -                    # build type is always module type, but ...
> -                    if self.ModuleType != self.BuildType:
> -                        RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.ToolChainFamily]
> -            if not RuleObject:
> -                continue
> -            RuleObject = RuleObject.Instantiate(self.Macros)
> -            RetVal[Type] = RuleObject
> -            for Ext in RuleObject.SourceFileExtList:
> -                RetVal[Ext] = RuleObject
> -        return RetVal
> -
> -    def _ApplyBuildRule(self, File, FileType, BinaryFileList=None):
> -        if self._BuildTargets is None:
> -            self._IntroBuildTargetList = set()
> -            self._FinalBuildTargetList = set()
> -            self._BuildTargets = defaultdict(set)
> -            self._FileTypes = defaultdict(set)
> -
> -        if not BinaryFileList:
> -            BinaryFileList = self.BinaryFileList
> -
> -        SubDirectory = os.path.join(self.OutputDir, File.SubDir)
> -        if not os.path.exists(SubDirectory):
> -            CreateDirectory(SubDirectory)
> -        LastTarget = None
> -        RuleChain = set()
> -        SourceList = [File]
> -        Index = 0
> -        #
> -        # Make sure to get build rule order value
> -        #
> -        self.BuildOption
> -
> -        while Index < len(SourceList):
> -            Source = SourceList[Index]
> -            Index = Index + 1
> -
> -            if Source != File:
> -                CreateDirectory(Source.Dir)
> -
> -            if File.IsBinary and File == Source and File in BinaryFileList:
> -                # Skip all files that are not binary libraries
> -                if not self.IsLibrary:
> -                    continue
> -                RuleObject = self.BuildRules[TAB_DEFAULT_BINARY_FILE]
> -            elif FileType in self.BuildRules:
> -                RuleObject = self.BuildRules[FileType]
> -            elif Source.Ext in self.BuildRules:
> -                RuleObject = self.BuildRules[Source.Ext]
> -            else:
> -                # stop at no more rules
> -                if LastTarget:
> -                    self._FinalBuildTargetList.add(LastTarget)
> -                break
> -
> -            FileType = RuleObject.SourceFileType
> -            self._FileTypes[FileType].add(Source)
> -
> -            # stop at STATIC_LIBRARY for library
> -            if self.IsLibrary and FileType == TAB_STATIC_LIBRARY:
> -                if LastTarget:
> -                    self._FinalBuildTargetList.add(LastTarget)
> -                break
> -
> -            Target = RuleObject.Apply(Source, self.BuildRuleOrder)
> -            if not Target:
> -                if LastTarget:
> -                    self._FinalBuildTargetList.add(LastTarget)
> -                break
> -            elif not Target.Outputs:
> -                # Only do build for target with outputs
> -                self._FinalBuildTargetList.add(Target)
> -
> -            self._BuildTargets[FileType].add(Target)
> -
> -            if not Source.IsBinary and Source == File:
> -                self._IntroBuildTargetList.add(Target)
> -
> -            # to avoid cyclic rule
> -            if FileType in RuleChain:
> -                break
> -
> -            RuleChain.add(FileType)
> -            SourceList.extend(Target.Outputs)
> -            LastTarget = Target
> -            FileType = TAB_UNKNOWN_FILE
> -
> -    @cached_property
> -    def Targets(self):
> -        if self._BuildTargets is None:
> -            self._IntroBuildTargetList = set()
> -            self._FinalBuildTargetList = set()
> -            self._BuildTargets = defaultdict(set)
> -            self._FileTypes = defaultdict(set)
> -
> -        #TRICK: call SourceFileList property to apply build rule for source files
> -        self.SourceFileList
> -
> -        #TRICK: call _GetBinaryFileList to apply build rule for binary files
> -        self.BinaryFileList
> -
> -        return self._BuildTargets
> -
> -    @cached_property
> -    def IntroTargetList(self):
> -        self.Targets
> -        return self._IntroBuildTargetList
> -
> -    @cached_property
> -    def CodaTargetList(self):
> -        self.Targets
> -        return self._FinalBuildTargetList
> -
> -    @cached_property
> -    def FileTypes(self):
> -        self.Targets
> -        return self._FileTypes
> -
> -    ## Get the list of package object the module depends on
> -    #
> -    #   @retval     list    The package object list
> -    #
> -    @cached_property
> -    def DependentPackageList(self):
> -        return self.Module.Packages
> -
> -    ## Return the list of auto-generated code file
> -    #
> -    #   @retval     list        The list of auto-generated file
> -    #
> -    @cached_property
> -    def AutoGenFileList(self):
> -        AutoGenUniIdf = self.BuildType != 'UEFI_HII'
> -        UniStringBinBuffer = BytesIO()
> -        IdfGenBinBuffer = BytesIO()
> -        RetVal = {}
> -        AutoGenC = TemplateString()
> -        AutoGenH = TemplateString()
> -        StringH = TemplateString()
> -        StringIdf = TemplateString()
> -        GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUniIdf, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer)
> -        #
> -        # AutoGen.c is generated if there are library classes in inf, or there are object files
> -        #
> -        if str(AutoGenC) != "" and (len(self.Module.LibraryClasses) > 0
> -                                    or TAB_OBJECT_FILE in self.FileTypes):
> -            AutoFile = PathClass(gAutoGenCodeFileName, self.DebugDir)
> -            RetVal[AutoFile] = str(AutoGenC)
> -            self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
> -        if str(AutoGenH) != "":
> -            AutoFile = PathClass(gAutoGenHeaderFileName, self.DebugDir)
> -            RetVal[AutoFile] = str(AutoGenH)
> -            self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
> -        if str(StringH) != "":
> -            AutoFile = PathClass(gAutoGenStringFileName % {"module_name":self.Name}, self.DebugDir)
> -            RetVal[AutoFile] = str(StringH)
> -            self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
> -        if UniStringBinBuffer is not None and UniStringBinBuffer.getvalue() != b"":
> -            AutoFile = PathClass(gAutoGenStringFormFileName % {"module_name":self.Name}, self.OutputDir)
> -            RetVal[AutoFile] = UniStringBinBuffer.getvalue()
> -            AutoFile.IsBinary = True
> -            self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
> -        if UniStringBinBuffer is not None:
> -            UniStringBinBuffer.close()
> -        if str(StringIdf) != "":
> -            AutoFile = PathClass(gAutoGenImageDefFileName % {"module_name":self.Name}, self.DebugDir)
> -            RetVal[AutoFile] = str(StringIdf)
> -            self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
> -        if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue() != b"":
> -            AutoFile = PathClass(gAutoGenIdfFileName % {"module_name":self.Name}, self.OutputDir)
> -            RetVal[AutoFile] = IdfGenBinBuffer.getvalue()
> -            AutoFile.IsBinary = True
> -            self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
> -        if IdfGenBinBuffer is not None:
> -            IdfGenBinBuffer.close()
> -        return RetVal
> -
> -    ## Return the list of library modules explicitly or implicitly used by this module
> -    @cached_property
> -    def DependentLibraryList(self):
> -        # only merge library classes and PCD for non-library module
> -        if self.IsLibrary:
> -            return []
> -        return self.PlatformInfo.ApplyLibraryInstance(self.Module)
> -
> -    ## Get the list of PCDs from current module
> -    #
> -    #   @retval     list                    The list of PCD
> -    #
> -    @cached_property
> -    def ModulePcdList(self):
> -        # apply PCD settings from platform
> -        RetVal = self.PlatformInfo.ApplyPcdSetting(self.Module, self.Module.Pcds)
> -        ExtendCopyDictionaryLists(self._PcdComments, self.Module.PcdComments)
> -        return RetVal
> -
> -    ## Get the list of PCDs from dependent libraries
> -    #
> -    #   @retval     list                    The list of PCD
> -    #
> -    @cached_property
> -    def LibraryPcdList(self):
> -        if self.IsLibrary:
> -            return []
> -        RetVal = []
> -        Pcds = set()
> -        # get PCDs from dependent libraries
> -        for Library in self.DependentLibraryList:
> -            PcdsInLibrary = OrderedDict()
> -            ExtendCopyDictionaryLists(self._PcdComments, Library.PcdComments)
> -            for Key in Library.Pcds:
> -                # skip duplicated PCDs
> -                if Key in self.Module.Pcds or Key in Pcds:
> -                    continue
> -                Pcds.add(Key)
> -                PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
> -            RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self.Module, PcdsInLibrary, Library=Library))
> -        return RetVal
> -
> -    ## Get the GUID value mapping
> -    #
> -    #   @retval     dict    The mapping between GUID cname and its value
> -    #
> -    @cached_property
> -    def GuidList(self):
> -        RetVal = OrderedDict(self.Module.Guids)
> -        for Library in self.DependentLibraryList:
> -            RetVal.update(Library.Guids)
> -            ExtendCopyDictionaryLists(self._GuidComments, Library.GuidComments)
> -        ExtendCopyDictionaryLists(self._GuidComments, self.Module.GuidComments)
> -        return RetVal
> -
> -    @cached_property
> -    def GetGuidsUsedByPcd(self):
> -        RetVal = OrderedDict(self.Module.GetGuidsUsedByPcd())
> -        for Library in self.DependentLibraryList:
> -            RetVal.update(Library.GetGuidsUsedByPcd())
> -        return RetVal
> -    ## Get the protocol value mapping
> -    #
> -    #   @retval     dict    The mapping between protocol cname and its value
> -    #
> -    @cached_property
> -    def ProtocolList(self):
> -        RetVal = OrderedDict(self.Module.Protocols)
> -        for Library in self.DependentLibraryList:
> -            RetVal.update(Library.Protocols)
> -            ExtendCopyDictionaryLists(self._ProtocolComments, Library.ProtocolComments)
> -        ExtendCopyDictionaryLists(self._ProtocolComments, self.Module.ProtocolComments)
> -        return RetVal
> -
> -    ## Get the PPI value mapping
> -    #
> -    #   @retval     dict    The mapping between PPI cname and its value
> -    #
> -    @cached_property
> -    def PpiList(self):
> -        RetVal = OrderedDict(self.Module.Ppis)
> -        for Library in self.DependentLibraryList:
> -            RetVal.update(Library.Ppis)
> -            ExtendCopyDictionaryLists(self._PpiComments, Library.PpiComments)
> -        ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiComments)
> -        return RetVal
> -
> -    ## Get the list of include search path
> -    #
> -    #   @retval     list                    The list path
> -    #
> -    @cached_property
> -    def IncludePathList(self):
> -        RetVal = []
> -        RetVal.append(self.MetaFile.Dir)
> -        RetVal.append(self.DebugDir)
> -
> -        for Package in self.Module.Packages:
> -            PackageDir = mws.join(self.WorkspaceDir, Package.MetaFile.Dir)
> -            if PackageDir not in RetVal:
> -                RetVal.append(PackageDir)
> -            IncludesList = Package.Includes
> -            if Package._PrivateIncludes:
> -                if not self.MetaFile.OriginalPath.Path.startswith(PackageDir):
> -                    IncludesList = list(set(Package.Includes).difference(set(Package._PrivateIncludes)))
> -            for Inc in IncludesList:
> -                if Inc not in RetVal:
> -                    RetVal.append(str(Inc))
> -        return RetVal
> -
> -    @cached_property
> -    def IncludePathLength(self):
> -        return sum(len(inc)+1 for inc in self.IncludePathList)
> -
> -    ## Get HII EX PCDs which maybe used by VFR
> -    #
> -    #  efivarstore used by VFR may relate with HII EX PCDs
> -    #  Get the variable name and GUID from efivarstore and HII EX PCD
> -    #  List the HII EX PCDs in As Built INF if both name and GUID match.
> -    #
> -    #  @retval    list    HII EX PCDs
> -    #
> -    def _GetPcdsMaybeUsedByVfr(self):
> -        if not self.SourceFileList:
> -            return []
> -
> -        NameGuids = set()
> -        for SrcFile in self.SourceFileList:
> -            if SrcFile.Ext.lower() != '.vfr':
> -                continue
> -            Vfri = os.path.join(self.OutputDir, SrcFile.BaseName + '.i')
> -            if not os.path.exists(Vfri):
> -                continue
> -            VfriFile = open(Vfri, 'r')
> -            Content = VfriFile.read()
> -            VfriFile.close()
> -            Pos = Content.find('efivarstore')
> -            while Pos != -1:
> -                #
> -                # Make sure 'efivarstore' is the start of efivarstore statement
> -                # In case of the value of 'name' (name = efivarstore) is equal to 'efivarstore'
> -                #
> -                Index = Pos - 1
> -                while Index >= 0 and Content[Index] in ' \t\r\n':
> -                    Index -= 1
> -                if Index >= 0 and Content[Index] != ';':
> -                    Pos = Content.find('efivarstore', Pos + len('efivarstore'))
> -                    continue
> -                #
> -                # 'efivarstore' must be followed by name and guid
> -                #
> -                Name = gEfiVarStoreNamePattern.search(Content, Pos)
> -                if not Name:
> -                    break
> -                Guid = gEfiVarStoreGuidPattern.search(Content, Pos)
> -                if not Guid:
> -                    break
> -                NameArray = _ConvertStringToByteArray('L"' + Name.group(1) + '"')
> -                NameGuids.add((NameArray, GuidStructureStringToGuidString(Guid.group(1))))
> -                Pos = Content.find('efivarstore', Name.end())
> -        if not NameGuids:
> -            return []
> -        HiiExPcds = []
> -        for Pcd in self.PlatformInfo.Platform.Pcds.values():
> -            if Pcd.Type != TAB_PCDS_DYNAMIC_EX_HII:
> -                continue
> -            for SkuInfo in Pcd.SkuInfoList.values():
> -                Value = GuidValue(SkuInfo.VariableGuid, self.PlatformInfo.PackageList, self.MetaFile.Path)
> -                if not Value:
> -                    continue
> -                Name = _ConvertStringToByteArray(SkuInfo.VariableName)
> -                Guid = GuidStructureStringToGuidString(Value)
> -                if (Name, Guid) in NameGuids and Pcd not in HiiExPcds:
> -                    HiiExPcds.append(Pcd)
> -                    break
> -
> -        return HiiExPcds
> -
> -    def _GenOffsetBin(self):
> -        VfrUniBaseName = {}
> -        for SourceFile in self.Module.Sources:
> -            if SourceFile.Type.upper() == ".VFR" :
> -                #
> -                # search the .map file to find the offset of vfr binary in the PE32+/TE file.
> -                #
> -                VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
> -            elif SourceFile.Type.upper() == ".UNI" :
> -                #
> -                # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
> -                #
> -                VfrUniBaseName["UniOffsetName"] = (self.Name + "Strings")
> -
> -        if not VfrUniBaseName:
> -            return None
> -        MapFileName = os.path.join(self.OutputDir, self.Name + ".map")
> -        EfiFileName = os.path.join(self.OutputDir, self.Name + ".efi")
> -        VfrUniOffsetList = GetVariableOffset(MapFileName, EfiFileName, list(VfrUniBaseName.values()))
> -        if not VfrUniOffsetList:
> -            return None
> -
> -        OutputName = '%sOffset.bin' % self.Name
> -        UniVfrOffsetFileName    =  os.path.join( self.OutputDir, OutputName)
> -
> -        try:
> -            fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
> -        except:
> -            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
> -
> -        # Use a instance of BytesIO to cache data
> -        fStringIO = BytesIO()
> -
> -        for Item in VfrUniOffsetList:
> -            if (Item[0].find("Strings") != -1):
> -                #
> -                # UNI offset in image.
> -                # GUID + Offset
> -                # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
> -                #
> -                UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
> -                fStringIO.write(UniGuid)
> -                UniValue = pack ('Q', int (Item[1], 16))
> -                fStringIO.write (UniValue)
> -            else:
> -                #
> -                # VFR binary offset in image.
> -                # GUID + Offset
> -                # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
> -                #
> -                VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
> -                fStringIO.write(VfrGuid)
> -                VfrValue = pack ('Q', int (Item[1], 16))
> -                fStringIO.write (VfrValue)
> -        #
> -        # write data into file.
> -        #
> -        try :
> -            fInputfile.write (fStringIO.getvalue())
> -        except:
> -            EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
> -                            "file been locked or using by other applications." %UniVfrOffsetFileName, None)
> -
> -        fStringIO.close ()
> -        fInputfile.close ()
> -        return OutputName
> -
> -    @cached_property
> -    def OutputFile(self):
> -        retVal = set()
> -        OutputDir = self.OutputDir.replace('\\', '/').strip('/')
> -        DebugDir = self.DebugDir.replace('\\', '/').strip('/')
> -        for Item in self.CodaTargetList:
> -            File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
> -            retVal.add(File)
> -        if self.DepexGenerated:
> -            retVal.add(self.Name + '.depex')
> -
> -        Bin = self._GenOffsetBin()
> -        if Bin:
> -            retVal.add(Bin)
> -
> -        for Root, Dirs, Files in os.walk(OutputDir):
> -            for File in Files:
> -                if File.lower().endswith('.pdb'):
> -                    retVal.add(File)
> -
> -        return retVal
> -
> -    ## Create AsBuilt INF file the module
> -    #
> -    def CreateAsBuiltInf(self):
> -
> -        if self.IsAsBuiltInfCreated:
> -            return
> -
> -        # Skip INF file generation for libraries
> -        if self.IsLibrary:
> -            return
> -
> -        # Skip the following code for modules with no source files
> -        if not self.SourceFileList:
> -            return
> -
> -        # Skip the following code for modules without any binary files
> -        if self.BinaryFileList:
> -            return
> -
> -        ### TODO: How to handles mixed source and binary modules
> -
> -        # Find all DynamicEx and PatchableInModule PCDs used by this module and dependent libraries
> -        # Also find all packages that the DynamicEx PCDs depend on
> -        Pcds = []
> -        PatchablePcds = []
> -        Packages = []
> -        PcdCheckList = []
> -        PcdTokenSpaceList = []
> -        for Pcd in self.ModulePcdList + self.LibraryPcdList:
> -            if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
> -                PatchablePcds.append(Pcd)
> -                PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_PATCHABLE_IN_MODULE))
> -            elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
> -                if Pcd not in Pcds:
> -                    Pcds.append(Pcd)
> -                    PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX))
> -                    PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC))
> -                    PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName)
> -        GuidList = OrderedDict(self.GuidList)
> -        for TokenSpace in self.GetGuidsUsedByPcd:
> -            # If token space is not referred by patch PCD or Ex PCD, remove the GUID from GUID list
> -            # The GUIDs in GUIDs section should really be the GUIDs in source INF or referred by Ex an patch PCDs
> -            if TokenSpace not in PcdTokenSpaceList and TokenSpace in GuidList:
> -                GuidList.pop(TokenSpace)
> -        CheckList = (GuidList, self.PpiList, self.ProtocolList, PcdCheckList)
> -        for Package in self.DerivedPackageList:
> -            if Package in Packages:
> -                continue
> -            BeChecked = (Package.Guids, Package.Ppis, Package.Protocols, Package.Pcds)
> -            Found = False
> -            for Index in range(len(BeChecked)):
> -                for Item in CheckList[Index]:
> -                    if Item in BeChecked[Index]:
> -                        Packages.append(Package)
> -                        Found = True
> -                        break
> -                if Found:
> -                    break
> -
> -        VfrPcds = self._GetPcdsMaybeUsedByVfr()
> -        for Pkg in self.PlatformInfo.PackageList:
> -            if Pkg in Packages:
> -                continue
> -            for VfrPcd in VfrPcds:
> -                if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX) in Pkg.Pcds or
> -                    (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC) in Pkg.Pcds):
> -                    Packages.append(Pkg)
> -                    break
> -
> -        ModuleType = SUP_MODULE_DXE_DRIVER if self.ModuleType == SUP_MODULE_UEFI_DRIVER and self.DepexGenerated else self.ModuleType
> -        DriverType = self.PcdIsDriver if self.PcdIsDriver else ''
> -        Guid = self.Guid
> -        MDefs = self.Module.Defines
> -
> -        AsBuiltInfDict = {
> -          'module_name'                       : self.Name,
> -          'module_guid'                       : Guid,
> -          'module_module_type'                : ModuleType,
> -          'module_version_string'             : [MDefs['VERSION_STRING']] if 'VERSION_STRING' in MDefs else [],
> -          'pcd_is_driver_string'              : [],
> -          'module_uefi_specification_version' : [],
> -          'module_pi_specification_version'   : [],
> -          'module_entry_point'                : self.Module.ModuleEntryPointList,
> -          'module_unload_image'               : self.Module.ModuleUnloadImageList,
> -          'module_constructor'                : self.Module.ConstructorList,
> -          'module_destructor'                 : self.Module.DestructorList,
> -          'module_shadow'                     : [MDefs['SHADOW']] if 'SHADOW' in MDefs else [],
> -          'module_pci_vendor_id'              : [MDefs['PCI_VENDOR_ID']] if 'PCI_VENDOR_ID' in MDefs else [],
> -          'module_pci_device_id'              : [MDefs['PCI_DEVICE_ID']] if 'PCI_DEVICE_ID' in MDefs else [],
> -          'module_pci_class_code'             : [MDefs['PCI_CLASS_CODE']] if 'PCI_CLASS_CODE' in MDefs else [],
> -          'module_pci_revision'               : [MDefs['PCI_REVISION']] if 'PCI_REVISION' in MDefs else [],
> -          'module_build_number'               : [MDefs['BUILD_NUMBER']] if 'BUILD_NUMBER' in MDefs else [],
> -          'module_spec'                       : [MDefs['SPEC']] if 'SPEC' in MDefs else [],
> -          'module_uefi_hii_resource_section'  : [MDefs['UEFI_HII_RESOURCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [],
> -          'module_uni_file'                   : [MDefs['MODULE_UNI_FILE']] if 'MODULE_UNI_FILE' in MDefs else [],
> -          'module_arch'                       : self.Arch,
> -          'package_item'                      : [Package.MetaFile.File.replace('\\', '/') for Package in Packages],
> -          'binary_item'                       : [],
> -          'patchablepcd_item'                 : [],
> -          'pcd_item'                          : [],
> -          'protocol_item'                     : [],
> -          'ppi_item'                          : [],
> -          'guid_item'                         : [],
> -          'flags_item'                        : [],
> -          'libraryclasses_item'               : []
> -        }
> -
> -        if 'MODULE_UNI_FILE' in MDefs:
> -            UNIFile = os.path.join(self.MetaFile.Dir, MDefs['MODULE_UNI_FILE'])
> -            if os.path.isfile(UNIFile):
> -                shutil.copy2(UNIFile, self.OutputDir)
> -
> -        if self.AutoGenVersion > int(gInfSpecVersion, 0):
> -            AsBuiltInfDict['module_inf_version'] = '0x%08x' % self.AutoGenVersion
> -        else:
> -            AsBuiltInfDict['module_inf_version'] = gInfSpecVersion
> -
> -        if DriverType:
> -            AsBuiltInfDict['pcd_is_driver_string'].append(DriverType)
> -
> -        if 'UEFI_SPECIFICATION_VERSION' in self.Specification:
> -            AsBuiltInfDict['module_uefi_specification_version'].append(self.Specification['UEFI_SPECIFICATION_VERSION'])
> -        if 'PI_SPECIFICATION_VERSION' in self.Specification:
> -            AsBuiltInfDict['module_pi_specification_version'].append(self.Specification['PI_SPECIFICATION_VERSION'])
> -
> -        OutputDir = self.OutputDir.replace('\\', '/').strip('/')
> -        DebugDir = self.DebugDir.replace('\\', '/').strip('/')
> -        for Item in self.CodaTargetList:
> -            File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
> -            if os.path.isabs(File):
> -                File = File.replace('\\', '/').strip('/').replace(OutputDir, '').strip('/')
> -            if Item.Target.Ext.lower() == '.aml':
> -                AsBuiltInfDict['binary_item'].append('ASL|' + File)
> -            elif Item.Target.Ext.lower() == '.acpi':
> -                AsBuiltInfDict['binary_item'].append('ACPI|' + File)
> -            elif Item.Target.Ext.lower() == '.efi':
> -                AsBuiltInfDict['binary_item'].append('PE32|' + self.Name + '.efi')
> -            else:
> -                AsBuiltInfDict['binary_item'].append('BIN|' + File)
> -        if not self.DepexGenerated:
> -            DepexFile = os.path.join(self.OutputDir, self.Name + '.depex')
> -            if os.path.exists(DepexFile):
> -                self.DepexGenerated = True
> -        if self.DepexGenerated:
> -            if self.ModuleType in [SUP_MODULE_PEIM]:
> -                AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + self.Name + '.depex')
> -            elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
> -                AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + self.Name + '.depex')
> -            elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
> -                AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + self.Name + '.depex')
> -
> -        Bin = self._GenOffsetBin()
> -        if Bin:
> -            AsBuiltInfDict['binary_item'].append('BIN|%s' % Bin)
> -
> -        for Root, Dirs, Files in os.walk(OutputDir):
> -            for File in Files:
> -                if File.lower().endswith('.pdb'):
> -                    AsBuiltInfDict['binary_item'].append('DISPOSABLE|' + File)
> -        HeaderComments = self.Module.HeaderComments
> -        StartPos = 0
> -        for Index in range(len(HeaderComments)):
> -            if HeaderComments[Index].find('@BinaryHeader') != -1:
> -                HeaderComments[Index] = HeaderComments[Index].replace('@BinaryHeader', '@file')
> -                StartPos = Index
> -                break
> -        AsBuiltInfDict['header_comments'] = '\n'.join(HeaderComments[StartPos:]).replace(':#', '://')
> -        AsBuiltInfDict['tail_comments'] = '\n'.join(self.Module.TailComments)
> -
> -        GenList = [
> -            (self.ProtocolList, self._ProtocolComments, 'protocol_item'),
> -            (self.PpiList, self._PpiComments, 'ppi_item'),
> -            (GuidList, self._GuidComments, 'guid_item')
> -        ]
> -        for Item in GenList:
> -            for CName in Item[0]:
> -                Comments = '\n  '.join(Item[1][CName]) if CName in Item[1] else ''
> -                Entry = Comments + '\n  ' + CName if Comments else CName
> -                AsBuiltInfDict[Item[2]].append(Entry)
> -        PatchList = parsePcdInfoFromMapFile(
> -                            os.path.join(self.OutputDir, self.Name + '.map'),
> -                            os.path.join(self.OutputDir, self.Name + '.efi')
> -                        )
> -        if PatchList:
> -            for Pcd in PatchablePcds:
> -                TokenCName = Pcd.TokenCName
> -                for PcdItem in GlobalData.MixedPcd:
> -                    if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
> -                        TokenCName = PcdItem[0]
> -                        break
> -                for PatchPcd in PatchList:
> -                    if TokenCName == PatchPcd[0]:
> -                        break
> -                else:
> -                    continue
> -                PcdValue = ''
> -                if Pcd.DatumType == 'BOOLEAN':
> -                    BoolValue = Pcd.DefaultValue.upper()
> -                    if BoolValue == 'TRUE':
> -                        Pcd.DefaultValue = '1'
> -                    elif BoolValue == 'FALSE':
> -                        Pcd.DefaultValue = '0'
> -
> -                if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
> -                    HexFormat = '0x%02x'
> -                    if Pcd.DatumType == TAB_UINT16:
> -                        HexFormat = '0x%04x'
> -                    elif Pcd.DatumType == TAB_UINT32:
> -                        HexFormat = '0x%08x'
> -                    elif Pcd.DatumType == TAB_UINT64:
> -                        HexFormat = '0x%016x'
> -                    PcdValue = HexFormat % int(Pcd.DefaultValue, 0)
> -                else:
> -                    if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize == '':
> -                        EdkLogger.error("build", AUTOGEN_ERROR,
> -                                        "Unknown [MaxDatumSize] of PCD [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName)
> -                                        )
> -                    ArraySize = int(Pcd.MaxDatumSize, 0)
> -                    PcdValue = Pcd.DefaultValue
> -                    if PcdValue[0] != '{':
> -                        Unicode = False
> -                        if PcdValue[0] == 'L':
> -                            Unicode = True
> -                        PcdValue = PcdValue.lstrip('L')
> -                        PcdValue = eval(PcdValue)
> -                        NewValue = '{'
> -                        for Index in range(0, len(PcdValue)):
> -                            if Unicode:
> -                                CharVal = ord(PcdValue[Index])
> -                                NewValue = NewValue + '0x%02x' % (CharVal & 0x00FF) + ', ' \
> -                                        + '0x%02x' % (CharVal >> 8) + ', '
> -                            else:
> -                                NewValue = NewValue + '0x%02x' % (ord(PcdValue[Index]) % 0x100) + ', '
> -                        Padding = '0x00, '
> -                        if Unicode:
> -                            Padding = Padding * 2
> -                            ArraySize = ArraySize // 2
> -                        if ArraySize < (len(PcdValue) + 1):
> -                            if Pcd.MaxSizeUserSet:
> -                                EdkLogger.error("build", AUTOGEN_ERROR,
> -                                                "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
> -                                                )
> -                            else:
> -                                ArraySize = len(PcdValue) + 1
> -                        if ArraySize > len(PcdValue) + 1:
> -                            NewValue = NewValue + Padding * (ArraySize - len(PcdValue) - 1)
> -                        PcdValue = NewValue + Padding.strip().rstrip(',') + '}'
> -                    elif len(PcdValue.split(',')) <= ArraySize:
> -                        PcdValue = PcdValue.rstrip('}') + ', 0x00' * (ArraySize - len(PcdValue.split(',')))
> -                        PcdValue += '}'
> -                    else:
> -                        if Pcd.MaxSizeUserSet:
> -                            EdkLogger.error("build", AUTOGEN_ERROR,
> -                                            "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
> -                                            )
> -                        else:
> -                            ArraySize = len(PcdValue) + 1
> -                PcdItem = '%s.%s|%s|0x%X' % \
> -                    (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, PatchPcd[1])
> -                PcdComments = ''
> -                if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
> -                    PcdComments = '\n  '.join(self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName])
> -                if PcdComments:
> -                    PcdItem = PcdComments + '\n  ' + PcdItem
> -                AsBuiltInfDict['patchablepcd_item'].append(PcdItem)
> -
> -        for Pcd in Pcds + VfrPcds:
> -            PcdCommentList = []
> -            HiiInfo = ''
> -            TokenCName = Pcd.TokenCName
> -            for PcdItem in GlobalData.MixedPcd:
> -                if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
> -                    TokenCName = PcdItem[0]
> -                    break
> -            if Pcd.Type == TAB_PCDS_DYNAMIC_EX_HII:
> -                for SkuName in Pcd.SkuInfoList:
> -                    SkuInfo = Pcd.SkuInfoList[SkuName]
> -                    HiiInfo = '## %s|%s|%s' % (SkuInfo.VariableName, SkuInfo.VariableGuid, SkuInfo.VariableOffset)
> -                    break
> -            if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
> -                PcdCommentList = self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName][:]
> -            if HiiInfo:
> -                UsageIndex = -1
> -                UsageStr = ''
> -                for Index, Comment in enumerate(PcdCommentList):
> -                    for Usage in UsageList:
> -                        if Comment.find(Usage) != -1:
> -                            UsageStr = Usage
> -                            UsageIndex = Index
> -                            break
> -                if UsageIndex != -1:
> -                    PcdCommentList[UsageIndex] = '## %s %s %s' % (UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
> -                else:
> -                    PcdCommentList.append('## UNDEFINED ' + HiiInfo)
> -            PcdComments = '\n  '.join(PcdCommentList)
> -            PcdEntry = Pcd.TokenSpaceGuidCName + '.' + TokenCName
> -            if PcdComments:
> -                PcdEntry = PcdComments + '\n  ' + PcdEntry
> -            AsBuiltInfDict['pcd_item'].append(PcdEntry)
> -        for Item in self.BuildOption:
> -            if 'FLAGS' in self.BuildOption[Item]:
> -                AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_FLAGS = %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, self.Arch, Item, self.BuildOption[Item]['FLAGS'].strip()))
> -
> -        # Generated LibraryClasses section in comments.
> -        for Library in self.LibraryAutoGenList:
> -            AsBuiltInfDict['libraryclasses_item'].append(Library.MetaFile.File.replace('\\', '/'))
> -
> -        # Generated UserExtensions TianoCore section.
> -        # All tianocore user extensions are copied.
> -        UserExtStr = ''
> -        for TianoCore in self._GetTianoCoreUserExtensionList():
> -            UserExtStr += '\n'.join(TianoCore)
> -            ExtensionFile = os.path.join(self.MetaFile.Dir, TianoCore[1])
> -            if os.path.isfile(ExtensionFile):
> -                shutil.copy2(ExtensionFile, self.OutputDir)
> -        AsBuiltInfDict['userextension_tianocore_item'] = UserExtStr
> -
> -        # Generated depex expression section in comments.
> -        DepexExpression = self._GetDepexExpresionString()
> -        AsBuiltInfDict['depexsection_item'] = DepexExpression if DepexExpression else ''
> -
> -        AsBuiltInf = TemplateString()
> -        AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfDict))
> -
> -        SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
> -
> -        self.IsAsBuiltInfCreated = True
> -
> -    def CopyModuleToCache(self):
> -        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
> -        CreateDirectory (FileDir)
> -        HashFile = path.join(self.BuildDir, self.Name + '.hash')
> -        if os.path.exists(HashFile):
> -            CopyFileOnChange(HashFile, FileDir)
> -        ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
> -        if os.path.exists(ModuleFile):
> -            CopyFileOnChange(ModuleFile, FileDir)
> -
> -        if not self.OutputFile:
> -            Ma = self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
> -            self.OutputFile = Ma.Binaries
> -
> -        for File in self.OutputFile:
> -            File = str(File)
> -            if not os.path.isabs(File):
> -                File = os.path.join(self.OutputDir, File)
> -            if os.path.exists(File):
> -                sub_dir = os.path.relpath(File, self.OutputDir)
> -                destination_file = os.path.join(FileDir, sub_dir)
> -                destination_dir = os.path.dirname(destination_file)
> -                CreateDirectory(destination_dir)
> -                CopyFileOnChange(File, destination_dir)
> -
> -    def AttemptModuleCacheCopy(self):
> -        # If library or Module is binary do not skip by hash
> -        if self.IsBinaryModule:
> -            return False
> -        # .inc is contains binary information so do not skip by hash as well
> -        for f_ext in self.SourceFileList:
> -            if '.inc' in str(f_ext):
> -                return False
> -        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
> -        HashFile = path.join(FileDir, self.Name + '.hash')
> -        if os.path.exists(HashFile):
> -            f = open(HashFile, 'r')
> -            CacheHash = f.read()
> -            f.close()
> -            self.GenModuleHash()
> -            if GlobalData.gModuleHash[self.Arch][self.Name]:
> -                if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
> -                    for root, dir, files in os.walk(FileDir):
> -                        for f in files:
> -                            if self.Name + '.hash' in f:
> -                                CopyFileOnChange(HashFile, self.BuildDir)
> -                            else:
> -                                File = path.join(root, f)
> -                                sub_dir = os.path.relpath(File, FileDir)
> -                                destination_file = os.path.join(self.OutputDir, sub_dir)
> -                                destination_dir = os.path.dirname(destination_file)
> -                                CreateDirectory(destination_dir)
> -                                CopyFileOnChange(File, destination_dir)
> -                    if self.Name == "PcdPeim" or self.Name == "PcdDxe":
> -                        CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
> -                    return True
> -        return False
> -
> -    ## Create makefile for the module and its dependent libraries
> -    #
> -    #   @param      CreateLibraryMakeFile   Flag indicating if or not the makefiles of
> -    #                                       dependent libraries will be created
> -    #
> -    @cached_class_function
> -    def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
> -        # nest this function inside its only caller.
> - def CreateTimeStamp(): > - FileSet =3D {self.MetaFile.Path} > - > - for SourceFile in self.Module.Sources: > - FileSet.add (SourceFile.Path) > - > - for Lib in self.DependentLibraryList: > - FileSet.add (Lib.MetaFile.Path) > - > - for f in self.AutoGenDepSet: > - FileSet.add (f.Path) > - > - if os.path.exists (self.TimeStampPath): > - os.remove (self.TimeStampPath) > - with open(self.TimeStampPath, 'w+') as file: > - for f in FileSet: > - print(f, file=3Dfile) > - > - # Ignore generating makefile when it is a binary module > - if self.IsBinaryModule: > - return > - > - self.GenFfsList =3D GenFfsList > - if not self.IsLibrary and CreateLibraryMakeFile: > - for LibraryAutoGen in self.LibraryAutoGenList: > - LibraryAutoGen.CreateMakeFile() > - > - # Don't enable if hash feature enabled, CanSkip uses timestamps= to determine build skipping > - if not GlobalData.gUseHashCache and self.CanSkip(): > - return > - > - if len(self.CustomMakefile) =3D=3D 0: > - Makefile =3D GenMake.ModuleMakefile(self) > - else: > - Makefile =3D GenMake.CustomMakefile(self) > - if Makefile.Generate(): > - EdkLogger.debug(EdkLogger.DEBUG_9, "Generated makefile for = module %s [%s]" % > - (self.Name, self.Arch)) > - else: > - EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation = of makefile for module %s [%s]" % > - (self.Name, self.Arch)) > - > - CreateTimeStamp() > - > - def CopyBinaryFiles(self): > - for File in self.Module.Binaries: > - SrcPath =3D File.Path > - DstPath =3D os.path.join(self.OutputDir, os.path.basename(S= rcPath)) > - CopyLongFilePath(SrcPath, DstPath) > - ## Create autogen code for the module and its dependent libraries > - # > - # @param CreateLibraryCodeFile Flag indicating if or not t= he code of > - # dependent libraries will be= created > - # > - def CreateCodeFile(self, CreateLibraryCodeFile=3DTrue): > - if self.IsCodeFileCreated: > - return > - > - # Need to generate PcdDatabase even PcdDriver is binarymodule > - if self.IsBinaryModule and 
self.PcdIsDriver !=3D '': > - CreatePcdDatabaseCode(self, TemplateString(), TemplateStrin= g()) > - return > - if self.IsBinaryModule: > - if self.IsLibrary: > - self.CopyBinaryFiles() > - return > - > - if not self.IsLibrary and CreateLibraryCodeFile: > - for LibraryAutoGen in self.LibraryAutoGenList: > - LibraryAutoGen.CreateCodeFile() > - > - # Don't enable if hash feature enabled, CanSkip uses timestamps= to determine build skipping > - if not GlobalData.gUseHashCache and self.CanSkip(): > - return > - > - AutoGenList =3D [] > - IgoredAutoGenList =3D [] > - > - for File in self.AutoGenFileList: > - if GenC.Generate(File.Path, self.AutoGenFileList[File], Fil= e.IsBinary): > - AutoGenList.append(str(File)) > - else: > - IgoredAutoGenList.append(str(File)) > - > - > - for ModuleType in self.DepexList: > - # Ignore empty [depex] section or [depex] section for SUP_M= ODULE_USER_DEFINED module > - if len(self.DepexList[ModuleType]) =3D=3D 0 or ModuleType = =3D=3D SUP_MODULE_USER_DEFINED or ModuleType =3D=3D SUP_MODULE_HOST_APPLIC= ATION: > - continue > - > - Dpx =3D GenDepex.DependencyExpression(self.DepexList[Module= Type], ModuleType, True) > - DpxFile =3D gAutoGenDepexFileName % {"module_name" : self.N= ame} > - > - if len(Dpx.PostfixNotation) !=3D 0: > - self.DepexGenerated =3D True > - > - if Dpx.Generate(path.join(self.OutputDir, DpxFile)): > - AutoGenList.append(str(DpxFile)) > - else: > - IgoredAutoGenList.append(str(DpxFile)) > - > - if IgoredAutoGenList =3D=3D []: > - EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] files fo= r module %s [%s]" % > - (" ".join(AutoGenList), self.Name, self.Arc= h)) > - elif AutoGenList =3D=3D []: > - EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation = of [%s] files for module %s [%s]" % > - (" ".join(IgoredAutoGenList), self.Name, se= lf.Arch)) > - else: > - EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] (skipped= %s) files for module %s [%s]" % > - (" ".join(AutoGenList), " ".join(IgoredAuto= GenList), 
self.Name, self.Arch)) > - > - self.IsCodeFileCreated =3D True > - return AutoGenList > - > - ## Summarize the ModuleAutoGen objects of all libraries used by thi= s module > - @cached_property > - def LibraryAutoGenList(self): > - RetVal =3D [] > - for Library in self.DependentLibraryList: > - La =3D ModuleAutoGen( > - self.Workspace, > - Library.MetaFile, > - self.BuildTarget, > - self.ToolChain, > - self.Arch, > - self.PlatformInfo.MetaFile > - ) > - if La not in RetVal: > - RetVal.append(La) > - for Lib in La.CodaTargetList: > - self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE) > - return RetVal > - > - def GenModuleHash(self): > - # Initialize a dictionary for each arch type > - if self.Arch not in GlobalData.gModuleHash: > - GlobalData.gModuleHash[self.Arch] =3D {} > - > - # Early exit if module or library has been hashed and is in mem= ory > - if self.Name in GlobalData.gModuleHash[self.Arch]: > - return GlobalData.gModuleHash[self.Arch][self.Name].encode(= 'utf-8') > - > - # Initialize hash object > - m =3D hashlib.md5() > - > - # Add Platform level hash > - m.update(GlobalData.gPlatformHash.encode('utf-8')) > - > - # Add Package level hash > - if self.DependentPackageList: > - for Pkg in sorted(self.DependentPackageList, key=3Dlambda x= : x.PackageName): > - if Pkg.PackageName in GlobalData.gPackageHash: > - m.update(GlobalData.gPackageHash[Pkg.PackageName].e= ncode('utf-8')) > - > - # Add Library hash > - if self.LibraryAutoGenList: > - for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: = x.Name): > - if Lib.Name not in GlobalData.gModuleHash[self.Arch]: > - Lib.GenModuleHash() > - m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].en= code('utf-8')) > - > - # Add Module self > - f =3D open(str(self.MetaFile), 'rb') > - Content =3D f.read() > - f.close() > - m.update(Content) > - > - # Add Module's source files > - if self.SourceFileList: > - for File in sorted(self.SourceFileList, key=3Dlambda x: str= (x)): > - f =3D open(str(File), 'rb') > - 
Content =3D f.read() > - f.close() > - m.update(Content) > - > - GlobalData.gModuleHash[self.Arch][self.Name] =3D m.hexdigest() > - > - return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf= -8') > - > - ## Decide whether we can skip the ModuleAutoGen process > - def CanSkipbyHash(self): > - # Hashing feature is off > - if not GlobalData.gUseHashCache: > - return False > - > - # Initialize a dictionary for each arch type > - if self.Arch not in GlobalData.gBuildHashSkipTracking: > - GlobalData.gBuildHashSkipTracking[self.Arch] =3D dict() > - > - # If library or Module is binary do not skip by hash > - if self.IsBinaryModule: > - return False > - > - # A .inc file contains binary information, so do not skip by hash either > - for f_ext in self.SourceFileList: > - if '.inc' in str(f_ext): > - return False > - > - # Use Cache, if exists and if Module has a copy in cache > - if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy()= : > - return True > - > - # Early exit for libraries that haven't yet finished building > - HashFile =3D path.join(self.BuildDir, self.Name + ".hash") > - if self.IsLibrary and not os.path.exists(HashFile): > - return False > - > - # Return a Boolean based on whether we can skip by hash, either from memory or from IO. > - if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch= ]: > - # If hashes are the same, SaveFileOnChange() will return False.
> - GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] =3D= not SaveFileOnChange(HashFile, self.GenModuleHash(), True) > - return GlobalData.gBuildHashSkipTracking[self.Arch][self.Na= me] > - else: > - return GlobalData.gBuildHashSkipTracking[self.Arch][self.Na= me] > - > - ## Decide whether we can skip the ModuleAutoGen process > - # If any source file is newer than the module than we cannot skip > - # > - def CanSkip(self): > - if self.MakeFileDir in GlobalData.gSikpAutoGenCache: > - return True > - if not os.path.exists(self.TimeStampPath): > - return False > - #last creation time of the module > - DstTimeStamp =3D os.stat(self.TimeStampPath)[8] > - > - SrcTimeStamp =3D self.Workspace._SrcTimeStamp > - if SrcTimeStamp > DstTimeStamp: > - return False > - > - with open(self.TimeStampPath,'r') as f: > - for source in f: > - source =3D source.rstrip('\n') > - if not os.path.exists(source): > - return False > - if source not in ModuleAutoGen.TimeDict : > - ModuleAutoGen.TimeDict[source] =3D os.stat(source)[= 8] > - if ModuleAutoGen.TimeDict[source] > DstTimeStamp: > - return False > - GlobalData.gSikpAutoGenCache.add(self.MakeFileDir) > - return True > - > - @cached_property > - def TimeStampPath(self): > - return os.path.join(self.MakeFileDir, 'AutoGenTimeStamp') > + @classmethod > + def Cache(cls): > + return cls.__ObjectCache > + > +# > +# The priority list while override build option > +# > +PrioList =3D {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE= _ATTRIBUTE (Highest) > + "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE_A= TTRIBUTE > + "0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE_A= TTRIBUTE > + "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE_A= TTRIBUTE > + "0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMANDTYPE_A= TTRIBUTE > + "0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE_A= TTRIBUTE > + "0x10011" : 10, # TARGET_*********_****_COMMANDTYPE_A= TTRIBUTE > + "0x00011" : 9, # ******_*********_****_COMMANDTYPE_A= TTRIBUTE > + "0x11101" 
: 8, # TARGET_TOOLCHAIN_ARCH_***********_A= TTRIBUTE > + "0x01101" : 7, # ******_TOOLCHAIN_ARCH_***********_A= TTRIBUTE > + "0x10101" : 6, # TARGET_*********_ARCH_***********_A= TTRIBUTE > + "0x00101" : 5, # ******_*********_ARCH_***********_A= TTRIBUTE > + "0x11001" : 4, # TARGET_TOOLCHAIN_****_***********_A= TTRIBUTE > + "0x01001" : 3, # ******_TOOLCHAIN_****_***********_A= TTRIBUTE > + "0x10001" : 2, # TARGET_*********_****_***********_A= TTRIBUTE > + "0x00001" : 1} # ******_*********_****_***********_A= TTRIBUTE (Lowest) > +## Calculate the priority value of the build option > +# > +# @param Key Build option definition contain: TARGET_TOOLCHAIN_AR= CH_COMMANDTYPE_ATTRIBUTE > +# > +# @retval Value Priority value based on the priority list. > +# > +def CalculatePriorityValue(Key): > + Target, ToolChain, Arch, CommandType, Attr =3D Key.split('_') > + PriorityValue =3D 0x11111 > + if Target =3D=3D TAB_STAR: > + PriorityValue &=3D 0x01111 > + if ToolChain =3D=3D TAB_STAR: > + PriorityValue &=3D 0x10111 > + if Arch =3D=3D TAB_STAR: > + PriorityValue &=3D 0x11011 > + if CommandType =3D=3D TAB_STAR: > + PriorityValue &=3D 0x11101 > + if Attr =3D=3D TAB_STAR: > + PriorityValue &=3D 0x11110 > + > + return PrioList["0x%0.5x" % PriorityValue] > diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Sou= rce/Python/AutoGen/DataPipe.py > new file mode 100644 > index 000000000000..5bcc39bd380d > --- /dev/null > +++ b/BaseTools/Source/Python/AutoGen/DataPipe.py > @@ -0,0 +1,147 @@ > +## @file > +# Create makefile for MS nmake and GNU make > +# > +# Copyright (c) 2019, Intel Corporation. All rights reserved.
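[For reviewers: the TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE priority scheme added above can be exercised standalone. The sketch below mirrors PrioList and CalculatePriorityValue from the patch; it assumes TAB_STAR is "*", its value in BaseTools Common/DataType.py.]

```python
# Standalone sketch of CalculatePriorityValue()/PrioList from the patch.
# Assumption: TAB_STAR is "*" (as defined in BaseTools Common/DataType.py).
TAB_STAR = "*"

# Each hex digit stands for TARGET, TOOLCHAIN, ARCH, COMMANDTYPE, ATTRIBUTE;
# a 0 digit marks a wildcard (*) field, so fewer wildcards mean higher priority.
PRIO_LIST = {
    "0x11111": 16, "0x01111": 15, "0x10111": 14, "0x00111": 13,
    "0x11011": 12, "0x01011": 11, "0x10011": 10, "0x00011": 9,
    "0x11101": 8,  "0x01101": 7,  "0x10101": 6,  "0x00101": 5,
    "0x11001": 4,  "0x01001": 3,  "0x10001": 2,  "0x00001": 1,
}

def calculate_priority_value(key):
    """Map a TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE key to 1..16."""
    target, toolchain, arch, command_type, attr = key.split("_")
    value = 0x11111
    # Clear the digit for every field that is given as the wildcard.
    for field, mask in ((target, 0x01111), (toolchain, 0x10111),
                        (arch, 0x11011), (command_type, 0x11101),
                        (attr, 0x11110)):
        if field == TAB_STAR:
            value &= mask
    # As in the patch, a wildcard ATTRIBUTE has no table entry.
    return PRIO_LIST["0x%05x" % value]
```

A fully specified option (no wildcards) wins over any wildcarded one, which is the whole point of the table.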
> +# SPDX-License-Identifier: BSD-2-Clause-Patent > +# > +from __future__ import absolute_import > +from Workspace.WorkspaceDatabase import BuildDB > +from Workspace.WorkspaceCommon import GetModuleLibInstances > +import Common.GlobalData as GlobalData > +import os > +import pickle > +from pickle import HIGHEST_PROTOCOL > + > +class PCD_DATA(): > + def __init__(self,TokenCName,TokenSpaceGuidCName,Type,DatumType,Sku= InfoList,DefaultValue, > + MaxDatumSize,UserDefinedDefaultStoresFlag,validaterang= es, > + validlists,expressions,CustomAttribute,TokenValue): > + self.TokenCName =3D TokenCName > + self.TokenSpaceGuidCName =3D TokenSpaceGuidCName > + self.Type =3D Type > + self.DatumType =3D DatumType > + self.SkuInfoList =3D SkuInfoList > + self.DefaultValue =3D DefaultValue > + self.MaxDatumSize =3D MaxDatumSize > + self.UserDefinedDefaultStoresFlag =3D UserDefinedDefaultStoresF= lag > + self.validateranges =3D validateranges > + self.validlists =3D validlists > + self.expressions =3D expressions > + self.CustomAttribute =3D CustomAttribute > + self.TokenValue =3D TokenValue > + > +class DataPipe(object): > + def __init__(self, BuildDir=3DNone): > + self.data_container =3D {} > + self.BuildDir =3D BuildDir > + > +class MemoryDataPipe(DataPipe): > + > + def Get(self,key): > + return self.data_container.get(key) > + > + def dump(self,file_path): > + with open(file_path,'wb') as fd: > + pickle.dump(self.data_container,fd,pickle.HIGHEST_PROTOCOL) > + > + def load(self,file_path): > + with open(file_path,'rb') as fd: > + self.data_container =3D pickle.load(fd) > + > + @property > + def DataContainer(self): > + return self.data_container > + @DataContainer.setter > + def DataContainer(self,data): > + self.data_container.update(data) > + > + def FillData(self,PlatformInfo): > + #Platform Pcds > + self.DataContainer =3D { > + "PLA_PCD" : [PCD_DATA( > + pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type, > + pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue, > + 
pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.valid= ateranges, > + pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd= .TokenValue) > + for pcd in PlatformInfo.Platform.Pcds.values()] > + } > + > + #Platform Module Pcds > + ModulePcds =3D {} > + for m in PlatformInfo.Platform.Modules: > + m_pcds =3D PlatformInfo.Platform.Modules[m].Pcds > + if m_pcds: > + ModulePcds[(m.File,m.Root)] =3D [PCD_DATA( > + pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type, > + pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue, > + pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.valid= ateranges, > + pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd= .TokenValue) > + for pcd in PlatformInfo.Platform.Modules[m].Pcds.values()] > + > + > + self.DataContainer =3D {"MOL_PCDS":ModulePcds} > + > + #Module's Library Instance > + ModuleLibs =3D {} > + for m in PlatformInfo.Platform.Modules: > + module_obj =3D BuildDB.BuildObject[m,PlatformInfo.Arch,Plat= formInfo.BuildTarget,PlatformInfo.ToolChain] > + Libs =3D GetModuleLibInstances(module_obj, PlatformInfo.Pla= tform, BuildDB.BuildObject, PlatformInfo.Arch,PlatformInfo.BuildTarget,Plat= formInfo.ToolChain) > + ModuleLibs[(m.File,m.Root,module_obj.Arch)] =3D [(l.MetaFil= e.File,l.MetaFile.Root,l.Arch) for l in Libs] > + self.DataContainer =3D {"DEPS":ModuleLibs} > + > + #Platform BuildOptions > + > + platform_build_opt =3D PlatformInfo.EdkIIBuildOption > + > + ToolDefinition =3D PlatformInfo.ToolDefinition > + module_build_opt =3D {} > + for m in PlatformInfo.Platform.Modules: > + ModuleTypeOptions, PlatformModuleOptions =3D PlatformInfo.G= etGlobalBuildOptions(BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo.B= uildTarget,PlatformInfo.ToolChain]) > + if ModuleTypeOptions or PlatformModuleOptions: > + module_build_opt.update({(m.File,m.Root): {"ModuleTypeO= ptions":ModuleTypeOptions, "PlatformModuleOptions":PlatformModuleOptions}}) > + > + self.DataContainer =3D {"PLA_BO":platform_build_opt, > + "TOOLDEF":ToolDefinition, > 
+ "MOL_BO":module_build_opt > + } > + > + > + > + #Platform Info > + PInfo =3D { > + "WorkspaceDir":PlatformInfo.Workspace.WorkspaceDir, > + "Target":PlatformInfo.BuildTarget, > + "ToolChain":PlatformInfo.Workspace.ToolChain, > + "BuildRuleFile":PlatformInfo.BuildRule, > + "Arch": PlatformInfo.Arch, > + "ArchList":PlatformInfo.Workspace.ArchList, > + "ActivePlatform":PlatformInfo.MetaFile > + } > + self.DataContainer =3D {'P_Info':PInfo} > + > + self.DataContainer =3D {'M_Name':PlatformInfo.UniqueBaseName} > + > + self.DataContainer =3D {"ToolChainFamily": PlatformInfo.ToolCha= inFamily} > + > + self.DataContainer =3D {"BuildRuleFamily": PlatformInfo.BuildRu= leFamily} > + > + self.DataContainer =3D {"MixedPcd":GlobalData.MixedPcd} > + > + self.DataContainer =3D {"BuildOptPcd":GlobalData.BuildOptionPcd= } > + > + self.DataContainer =3D {"BuildCommand": PlatformInfo.BuildComma= nd} > + > + self.DataContainer =3D {"AsBuildModuleList": PlatformInfo._AsBu= ildModuleList} > + > + self.DataContainer =3D {"G_defines": GlobalData.gGlobalDefines} > + > + self.DataContainer =3D {"CL_defines": GlobalData.gCommandLineDe= fines} > + > + self.DataContainer =3D {"Env_Var": {k:v for k, v in os.environ.= items()}} > + > + self.DataContainer =3D {"PackageList": [(dec.MetaFile,dec.Arch)= for dec in PlatformInfo.PackageList]} > + > + self.DataContainer =3D {"GuidDict": PlatformInfo.Platform._Guid= Dict} > + > + self.DataContainer =3D {"FdfParser": True if GlobalData.gFdfPar= ser else False} > + > diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/= Python/AutoGen/GenC.py > index 4cb776206e90..4c3f4e3e55ae 100644 > --- a/BaseTools/Source/Python/AutoGen/GenC.py > +++ b/BaseTools/Source/Python/AutoGen/GenC.py > @@ -1627,11 +1627,11 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH): > TokenSpaceList =3D [] > for Pcd in Info.ModulePcdList: > if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET and Pcd.TokenSpaceGuidCN= ame not in TokenSpaceList: > 
TokenSpaceList.append(Pcd.TokenSpaceGuidCName) > > - SkuMgr =3D Info.Workspace.Platform.SkuIdMgr > + SkuMgr =3D Info.PlatformInfo.Platform.SkuIdMgr > AutoGenH.Append("\n// Definition of SkuId Array\n") > AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n") > # Add extern declarations to AutoGen.h if one or more Token Space G= UIDs were found > if TokenSpaceList: > AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used = in this module\n\n") > diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTool= s/Source/Python/AutoGen/ModuleAutoGen.py > new file mode 100644 > index 000000000000..d19c03862094 > --- /dev/null > +++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py > @@ -0,0 +1,1908 @@ > +## @file > +# Create makefile for MS nmake and GNU make > +# > +# Copyright (c) 2019, Intel Corporation. All rights reserved.
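[For context on the new DataPipe.py above: platform-scope settings are batched into one dictionary and pickled so that other processes can rebuild PlatformInfo/WorkspaceInfo without a PlatformAutoGen instance. Below is a trimmed, self-contained sketch of that round trip; the class names mirror the patch, but the body is simplified. Note the DataContainer setter merges rather than replaces, which is why FillData can assign to it repeatedly.]

```python
import os
import pickle
import tempfile

class MemoryDataPipe:
    """Simplified sketch of the patch's DataPipe/MemoryDataPipe pair."""
    def __init__(self, build_dir=None):
        self.data_container = {}
        self.BuildDir = build_dir

    def Get(self, key):
        return self.data_container.get(key)

    @property
    def DataContainer(self):
        return self.data_container

    @DataContainer.setter
    def DataContainer(self, data):
        # Assignment merges into the container; it does not replace it.
        self.data_container.update(data)

    def dump(self, file_path):
        with open(file_path, "wb") as fd:
            pickle.dump(self.data_container, fd, pickle.HIGHEST_PROTOCOL)

    def load(self, file_path):
        with open(file_path, "rb") as fd:
            self.data_container = pickle.load(fd)

# Round trip: the producer dumps, a consumer (e.g. another process) loads.
pipe = MemoryDataPipe()
pipe.DataContainer = {"P_Info": {"Arch": "X64"}}
pipe.DataContainer = {"ToolChainFamily": "GCC"}  # merged, not overwritten

pipe_file = os.path.join(tempfile.mkdtemp(), "GlobalVar.bin")
pipe.dump(pipe_file)

consumer = MemoryDataPipe()
consumer.load(pipe_file)
```

The merge-on-assign setter is a deliberate convenience: each `self.DataContainer =3D {...}` line in FillData adds one more keyed record to the same pickled payload.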
> +# SPDX-License-Identifier: BSD-2-Clause-Patent > +# > +from __future__ import absolute_import > +from AutoGen.AutoGen import AutoGen > +from Common.LongFilePathSupport import CopyLongFilePath > +from Common.BuildToolError import * > +from Common.DataType import * > +from Common.Misc import * > +from Common.StringUtils import NormPath,GetSplitList > +from collections import defaultdict > +from Workspace.WorkspaceCommon import OrderedListDict > +import os.path as path > +import copy > +import hashlib > +from . import InfSectionParser > +from . import GenC > +from . import GenMake > +from . import GenDepex > +from io import BytesIO > +from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile > +from Workspace.MetaFileCommentParser import UsageList > +from .GenPcdDb import CreatePcdDatabaseCode > +from Common.caching import cached_class_function > +from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo > + > +## Mapping Makefile type > +gMakeTypeMap =3D {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"} > +# > +# Regular expression for finding Include Directories; the difference be= tween MSFT and INTEL/GCC/RVCT > +# is that the former uses /I while the latter uses -I to specify include directories > +# > +gBuildOptIncludePatternMsft =3D re.compile(r"(?:.*?)/I[ \t]*([^ ]*)", r= e.MULTILINE | re.DOTALL) > +gBuildOptIncludePatternOther =3D re.compile(r"(?:.*?)-I[ \t]*([^ ]*)", = re.MULTILINE | re.DOTALL) > + > +## default file name for AutoGen > +gAutoGenCodeFileName =3D "AutoGen.c" > +gAutoGenHeaderFileName =3D "AutoGen.h" > +gAutoGenStringFileName =3D "%(module_name)sStrDefs.h" > +gAutoGenStringFormFileName =3D "%(module_name)sStrDefs.hpk" > +gAutoGenDepexFileName =3D "%(module_name)s.depex" > +gAutoGenImageDefFileName =3D "%(module_name)sImgDefs.h" > +gAutoGenIdfFileName =3D "%(module_name)sIdf.hpk" > +gInfSpecVersion =3D "0x00010017" > + > +# > +# Match name =3D variable > +# > +gEfiVarStoreNamePattern =3D re.compile("\s*name\s*=3D\s*(\w+)") > +# > +# The
format of guid in an efivarstore statement is like the following and must be correct: > +# guid =3D {0xA04A27f4, 0xDF00, 0x4D42, {0xB5, 0x52, 0x39, 0x51, 0x13, = 0x02, 0x11, 0x3D}} > +# > +gEfiVarStoreGuidPattern =3D re.compile("\s*guid\s*=3D\s*({.*?{.*?}\s*})= ") > + > +# > +# Template string to generate the AsBuilt INF > +# > +gAsBuiltInfHeaderString =3D TemplateString("""${header_comments} > + > +# DO NOT EDIT > +# FILE auto-generated > + > +[Defines] > + INF_VERSION =3D ${module_inf_version} > + BASE_NAME =3D ${module_name} > + FILE_GUID =3D ${module_guid} > + MODULE_TYPE =3D ${module_module_type}${BEGIN} > + VERSION_STRING =3D ${module_version_string}${END}${BEGIN} > + PCD_IS_DRIVER =3D ${pcd_is_driver_string}${END}${BEGIN} > + UEFI_SPECIFICATION_VERSION =3D ${module_uefi_specification_version}${= END}${BEGIN} > + PI_SPECIFICATION_VERSION =3D ${module_pi_specification_version}${EN= D}${BEGIN} > + ENTRY_POINT =3D ${module_entry_point}${END}${BEGIN} > + UNLOAD_IMAGE =3D ${module_unload_image}${END}${BEGIN} > + CONSTRUCTOR =3D ${module_constructor}${END}${BEGIN} > + DESTRUCTOR =3D ${module_destructor}${END}${BEGIN} > + SHADOW =3D ${module_shadow}${END}${BEGIN} > + PCI_VENDOR_ID =3D ${module_pci_vendor_id}${END}${BEGIN} > + PCI_DEVICE_ID =3D ${module_pci_device_id}${END}${BEGIN} > + PCI_CLASS_CODE =3D ${module_pci_class_code}${END}${BEGIN} > + PCI_REVISION =3D ${module_pci_revision}${END}${BEGIN} > + BUILD_NUMBER =3D ${module_build_number}${END}${BEGIN} > + SPEC =3D ${module_spec}${END}${BEGIN} > + UEFI_HII_RESOURCE_SECTION =3D ${module_uefi_hii_resource_section}${E= ND}${BEGIN} > + MODULE_UNI_FILE =3D ${module_uni_file}${END} > + > +[Packages.${module_arch}]${BEGIN} > + ${package_item}${END} > + > +[Binaries.${module_arch}]${BEGIN} > + ${binary_item}${END} > + > +[PatchPcd.${module_arch}]${BEGIN} > + ${patchablepcd_item} > +${END} > + > +[Protocols.${module_arch}]${BEGIN} > + ${protocol_item} > +${END} > + > +[Ppis.${module_arch}]${BEGIN} > + ${ppi_item} > +${END} > + > 
+[Guids.${module_arch}]${BEGIN} > + ${guid_item} > +${END} > + > +[PcdEx.${module_arch}]${BEGIN} > + ${pcd_item} > +${END} > + > +[LibraryClasses.${module_arch}] > +## @LIB_INSTANCES${BEGIN} > +# ${libraryclasses_item}${END} > + > +${depexsection_item} > + > +${userextension_tianocore_item} > + > +${tail_comments} > + > +[BuildOptions.${module_arch}] > +## @AsBuilt${BEGIN} > +## ${flags_item}${END} > +""") > +# > +# extend lists contained in a dictionary with lists stored in another d= ictionary > +# if CopyToDict is not derived from DefaultDict(list) then this may rai= se exception > +# > +def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict): > + for Key in CopyFromDict: > + CopyToDict[Key].extend(CopyFromDict[Key]) > + > +# Create a directory specified by a set of path elements and return the= full path > +def _MakeDir(PathList): > + RetVal =3D path.join(*PathList) > + CreateDirectory(RetVal) > + return RetVal > + > +# > +# Convert string to C format array > +# > +def _ConvertStringToByteArray(Value): > + Value =3D Value.strip() > + if not Value: > + return None > + if Value[0] =3D=3D '{': > + if not Value.endswith('}'): > + return None > + Value =3D Value.replace(' ', '').replace('{', '').replace('}', = '') > + ValFields =3D Value.split(',') > + try: > + for Index in range(len(ValFields)): > + ValFields[Index] =3D str(int(ValFields[Index], 0)) > + except ValueError: > + return None > + Value =3D '{' + ','.join(ValFields) + '}' > + return Value > + > + Unicode =3D False > + if Value.startswith('L"'): > + if not Value.endswith('"'): > + return None > + Value =3D Value[1:] > + Unicode =3D True > + elif not Value.startswith('"') or not Value.endswith('"'): > + return None > + > + Value =3D eval(Value) # translate escape character > + NewValue =3D '{' > + for Index in range(0, len(Value)): > + if Unicode: > + NewValue =3D NewValue + str(ord(Value[Index]) % 0x10000) + = ',' > + else: > + NewValue =3D NewValue + str(ord(Value[Index]) % 0x100) + ',= ' > + Value =3D 
NewValue + '0}' > + return Value > + > +## ModuleAutoGen class > +# > +# This class encapsulates the AutoGen behaviors for the build tools. In a= ddition to > +# the generation of AutoGen.h and AutoGen.c, it will generate *.depex f= ile according > +# to the [depex] section in module's inf file. > +# > +class ModuleAutoGen(AutoGen): > + # call super().__init__ then call the worker function with differen= t parameter count > + def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *a= rgs, **kwargs): > + if not hasattr(self, "_Init"): > + self._InitWorker(Workspace, MetaFile, Target, Toolchain, Ar= ch, *args) > + self._Init =3D True > + > + ## Cache the timestamps of metafiles of every module in a class att= ribute > + # > + TimeDict =3D {} > + > + def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *arg= s, **kwargs): > +# check if this module is employed by active platform > + if not PlatformInfo(Workspace, args[0], Target, Toolchain, Arch= ,args[-1]).ValidModule(MetaFile): > + EdkLogger.verbose("Module [%s] for [%s] is not employed by = active platform\n" \ > + % (MetaFile, Arch)) > + return None > + return super(ModuleAutoGen, cls).__new__(cls, Workspace, MetaFi= le, Target, Toolchain, Arch, *args, **kwargs) > + > + ## Initialize ModuleAutoGen > + # > + # @param Workspace EdkIIWorkspaceBuild object > + # @param ModuleFile The path of module file > + # @param Target Build target (DEBUG, RELEASE) > + # @param Toolchain Name of tool chain > + # @param Arch The arch the module supports > + # @param PlatformFile Platform meta-file > + # > + def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, Arc= h, PlatformFile,DataPipe): > + EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]" %= (ModuleFile, Arch)) > + GlobalData.gProcessingFile =3D "%s [%s, %s, %s]" % (ModuleFile,= Arch, Toolchain, Target) > + > + self.Workspace =3D None > + self.WorkspaceDir =3D "" > + self.PlatformInfo =3D None > + self.DataPipe =3D DataPipe > + 
self.__init_platform_info__() > + self.MetaFile =3D ModuleFile > + self.SourceDir =3D self.MetaFile.SubDir > + self.SourceDir =3D mws.relpath(self.SourceDir, self.WorkspaceDi= r) > + > + self.ToolChain =3D Toolchain > + self.BuildTarget =3D Target > + self.Arch =3D Arch > + self.ToolChainFamily =3D self.PlatformInfo.ToolChainFamily > + self.BuildRuleFamily =3D self.PlatformInfo.BuildRuleFamily > + > + self.IsCodeFileCreated =3D False > + self.IsAsBuiltInfCreated =3D False > + self.DepexGenerated =3D False > + > + self.BuildDatabase =3D self.Workspace.BuildDatabase > + self.BuildRuleOrder =3D None > + self.BuildTime =3D 0 > + > + self._GuidComments =3D OrderedListDict() > + self._ProtocolComments =3D OrderedListDict() > + self._PpiComments =3D OrderedListDict() > + self._BuildTargets =3D None > + self._IntroBuildTargetList =3D None > + self._FinalBuildTargetList =3D None > + self._FileTypes =3D None > + > + self.AutoGenDepSet =3D set() > + self.ReferenceModules =3D [] > + self.ConstPcd =3D {} > + > + def __init_platform_info__(self): > + pinfo =3D self.DataPipe.Get("P_Info") > + self.Workspace =3D WorkSpaceInfo(pinfo.get("WorkspaceDir"),pinf= o.get("ActivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.ge= t("ArchList")) > + self.WorkspaceDir =3D pinfo.get("WorkspaceDir") > + self.PlatformInfo =3D PlatformInfo(self.Workspace,pinfo.get("Ac= tivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.get("Arch")= ,self.DataPipe) > + ## hash() operator of ModuleAutoGen > + # > + # The module file path and arch string will be used to represent > + # hash value of this object > + # > + # @retval int Hash value of the module file path and arch > + # > + @cached_class_function > + def __hash__(self): > + return hash((self.MetaFile, self.Arch)) > + def __repr__(self): > + return "%s [%s]" % (self.MetaFile, self.Arch) > + > + # Get FixedAtBuild Pcds of this Module > + @cached_property > + def FixedAtBuildPcds(self): > + RetVal =3D [] > + for Pcd in 
self.ModulePcdList: > + if Pcd.Type !=3D TAB_PCDS_FIXED_AT_BUILD: > + continue > + if Pcd not in RetVal: > + RetVal.append(Pcd) > + return RetVal > + > + @cached_property > + def FixedVoidTypePcds(self): > + RetVal =3D {} > + for Pcd in self.FixedAtBuildPcds: > + if Pcd.DatumType =3D=3D TAB_VOID: > + if '.'.join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName)) = not in RetVal: > + RetVal['.'.join((Pcd.TokenSpaceGuidCName, Pcd.Token= CName))] =3D Pcd.DefaultValue > + return RetVal > + > + @property > + def UniqueBaseName(self): > + ModuleNames =3D self.DataPipe.Get("M_Name") > + if not ModuleNames: > + return self.Name > + return ModuleNames.get(self.Name,self.Name) > + > + # Macros could be used in build_rule.txt (also Makefile) > + @cached_property > + def Macros(self): > + return OrderedDict(( > + ("WORKSPACE" ,self.WorkspaceDir), > + ("MODULE_NAME" ,self.Name), > + ("MODULE_NAME_GUID" ,self.UniqueBaseName), > + ("MODULE_GUID" ,self.Guid), > + ("MODULE_VERSION" ,self.Version), > + ("MODULE_TYPE" ,self.ModuleType), > + ("MODULE_FILE" ,str(self.MetaFile)), > + ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName), > + ("MODULE_RELATIVE_DIR" ,self.SourceDir), > + ("MODULE_DIR" ,self.SourceDir), > + ("BASE_NAME" ,self.Name), > + ("ARCH" ,self.Arch), > + ("TOOLCHAIN" ,self.ToolChain), > + ("TOOLCHAIN_TAG" ,self.ToolChain), > + ("TOOL_CHAIN_TAG" ,self.ToolChain), > + ("TARGET" ,self.BuildTarget), > + ("BUILD_DIR" ,self.PlatformInfo.BuildDir), > + ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.A= rch)), > + ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.A= rch)), > + ("MODULE_BUILD_DIR" ,self.BuildDir), > + ("OUTPUT_DIR" ,self.OutputDir), > + ("DEBUG_DIR" ,self.DebugDir), > + ("DEST_DIR_OUTPUT" ,self.OutputDir), > + ("DEST_DIR_DEBUG" ,self.DebugDir), > + ("PLATFORM_NAME" ,self.PlatformInfo.Name), > + ("PLATFORM_GUID" ,self.PlatformInfo.Guid), > + ("PLATFORM_VERSION" ,self.PlatformInfo.Version), > + ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir), > + 
("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, self.PlatformI= nfo.SourceDir)), > + ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir), > + ("FFS_OUTPUT_DIR" ,self.FfsOutputDir) > + )) > + > + ## Return the module build data object > + @cached_property > + def Module(self): > + return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildT= arget, self.ToolChain] > + > + ## Return the module name > + @cached_property > + def Name(self): > + return self.Module.BaseName > + > + ## Return the module DxsFile if exist > + @cached_property > + def DxsFile(self): > + return self.Module.DxsFile > + > + ## Return the module meta-file GUID > + @cached_property > + def Guid(self): > + # > + # To build same module more than once, the module path with FIL= E_GUID overridden has > + # the file name FILE_GUIDmodule.inf, but the relative path (sel= f.MetaFile.File) is the real path > + # in DSC. The overridden GUID can be retrieved from file name > + # > + if os.path.basename(self.MetaFile.File) !=3D os.path.basename(s= elf.MetaFile.Path): > + # > + # Length of GUID is 36 > + # > + return os.path.basename(self.MetaFile.Path)[:36] > + return self.Module.Guid > + > + ## Return the module version > + @cached_property > + def Version(self): > + return self.Module.Version > + > + ## Return the module type > + @cached_property > + def ModuleType(self): > + return self.Module.ModuleType > + > + ## Return the component type (for Edk.x style of module) > + @cached_property > + def ComponentType(self): > + return self.Module.ComponentType > + > + ## Return the build type > + @cached_property > + def BuildType(self): > + return self.Module.BuildType > + > + ## Return the PCD_IS_DRIVER setting > + @cached_property > + def PcdIsDriver(self): > + return self.Module.PcdIsDriver > + > + ## Return the autogen version, i.e. 
module meta-file version
> +    @cached_property
> +    def AutoGenVersion(self):
> +        return self.Module.AutoGenVersion
> +
> +    ## Check if the module is library or not
> +    @cached_property
> +    def IsLibrary(self):
> +        return bool(self.Module.LibraryClass)
> +
> +    ## Check if the module is binary module or not
> +    @cached_property
> +    def IsBinaryModule(self):
> +        return self.Module.IsBinaryModule
> +
> +    ## Return the directory to store intermediate files of the module
> +    @cached_property
> +    def BuildDir(self):
> +        return _MakeDir((
> +            self.PlatformInfo.BuildDir,
> +            self.Arch,
> +            self.SourceDir,
> +            self.MetaFile.BaseName
> +        ))
> +
> +    ## Return the directory to store the intermediate object files of the module
> +    @cached_property
> +    def OutputDir(self):
> +        return _MakeDir((self.BuildDir, "OUTPUT"))
> +
> +    ## Return the directory path to store ffs file
> +    @cached_property
> +    def FfsOutputDir(self):
> +        if GlobalData.gFdfParser:
> +            return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
> +        return ''
> +
> +    ## Return the directory to store auto-gened source files of the module
> +    @cached_property
> +    def DebugDir(self):
> +        return _MakeDir((self.BuildDir, "DEBUG"))
> +
> +    ## Return the path of custom file
> +    @cached_property
> +    def CustomMakefile(self):
> +        RetVal = {}
> +        for Type in self.Module.CustomMakefile:
> +            MakeType = gMakeTypeMap[Type] if Type in gMakeTypeMap else 'nmake'
> +            File = os.path.join(self.SourceDir, self.Module.CustomMakefile[Type])
> +            RetVal[MakeType] = File
> +        return RetVal
> +
> +    ## Return the directory of the makefile
> +    #
> +    #   @retval     string  The directory string of module's makefile
> +    #
> +    @cached_property
> +    def MakeFileDir(self):
> +        return self.BuildDir
> +
> +    ## Return build command string
> +    #
> +    #   @retval     string  Build command string
> +    #
> +    @cached_property
> +    def BuildCommand(self):
> +        return self.PlatformInfo.BuildCommand
> +
> +    ## Get object list of all packages the module and its dependent libraries belong to
> +    #
> +    #   @retval     list    The list of package object
> +    #
> +    @cached_property
> +    def DerivedPackageList(self):
> +        PackageList = []
> +        for M in [self.Module] + self.DependentLibraryList:
> +            for Package in M.Packages:
> +                if Package in PackageList:
> +                    continue
> +                PackageList.append(Package)
> +        return PackageList
> +
> +    ## Get the depex string
> +    #
> +    # @return : a string containing all depex expressions.
> +    def _GetDepexExpresionString(self):
> +        DepexStr = ''
> +        DepexList = []
> +        ## DPX_SOURCE IN Define section.
> +        if self.Module.DxsFile:
> +            return DepexStr
> +        for M in [self.Module] + self.DependentLibraryList:
> +            Filename = M.MetaFile.Path
> +            InfObj = InfSectionParser.InfSectionParser(Filename)
> +            DepexExpressionList = InfObj.GetDepexExpresionList()
> +            for DepexExpression in DepexExpressionList:
> +                for key in DepexExpression:
> +                    Arch, ModuleType = key
> +                    DepexExpr = [x for x in DepexExpression[key] if not str(x).startswith('#')]
> +                    # if the type of the build module is USER_DEFINED,
> +                    # all different DEPEX section tags would be copied into the As Built INF file
> +                    # and there would be separate DEPEX section tags
> +                    if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
> +                        if (Arch.upper() == self.Arch.upper()) and (ModuleType.upper() != TAB_ARCH_COMMON):
> +                            DepexList.append({(Arch, ModuleType): DepexExpr})
> +                    else:
> +                        if Arch.upper() == TAB_ARCH_COMMON or \
> +                                (Arch.upper() == self.Arch.upper() and \
> +                                ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
> +                            DepexList.append({(Arch, ModuleType): DepexExpr})
> +
> +        # the type of the build module is USER_DEFINED.
> +        if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
> +            for Depex in DepexList:
> +                for key in Depex:
> +                    DepexStr += '[Depex.%s.%s]\n' % key
> +                    DepexStr += '\n'.join('# '+ val for val in Depex[key])
> +                    DepexStr += '\n\n'
> +            if not DepexStr:
> +                return '[Depex.%s]\n' % self.Arch
> +            return DepexStr
> +
> +        # the type of the build module is not USER_DEFINED.
> +        Count = 0
> +        for Depex in DepexList:
> +            Count += 1
> +            if DepexStr != '':
> +                DepexStr += ' AND '
> +            DepexStr += '('
> +            for D in Depex.values():
> +                DepexStr += ' '.join(val for val in D)
> +            Index = DepexStr.find('END')
> +            if Index > -1 and Index == len(DepexStr) - 3:
> +                DepexStr = DepexStr[:-3]
> +            DepexStr = DepexStr.strip()
> +            DepexStr += ')'
> +        if Count == 1:
> +            DepexStr = DepexStr.lstrip('(').rstrip(')').strip()
> +        if not DepexStr:
> +            return '[Depex.%s]\n' % self.Arch
> +        return '[Depex.%s]\n# ' % self.Arch + DepexStr
> +
> +    ## Merge dependency expression
> +    #
> +    #   @retval     list    The token list of the dependency expression after parsing
> +    #
> +    @cached_property
> +    def DepexList(self):
> +        if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
> +            return {}
> +
> +        DepexList = []
> +        #
> +        # Append depex from dependent libraries, if not "BEFORE", "AFTER" expression
> +        #
> +        FixedVoidTypePcds = {}
> +        for M in [self] + self.LibraryAutoGenList:
> +            FixedVoidTypePcds.update(M.FixedVoidTypePcds)
> +        for M in [self] + self.LibraryAutoGenList:
> +            Inherited = False
> +            for D in M.Module.Depex[self.Arch, self.ModuleType]:
> +                if DepexList != []:
> +                    DepexList.append('AND')
> +                DepexList.append('(')
> +                # replace D with its value if D is a FixedAtBuild PCD
> +                NewList = []
> +                for item in D:
> +                    if '.' not in item:
> +                        NewList.append(item)
> +                    else:
> +                        try:
> +                            Value = FixedVoidTypePcds[item]
> +                            if len(Value.split(',')) != 16:
> +                                EdkLogger.error("build", FORMAT_INVALID,
> +                                                "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type and 16 bytes in the module.".format(item))
> +                            NewList.append(Value)
> +                        except:
> +                            EdkLogger.error("build", FORMAT_INVALID, "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type in the module.".format(item))
> +
> +                DepexList.extend(NewList)
> +                if DepexList[-1] == 'END':  # no need of an END at this time
> +                    DepexList.pop()
> +                DepexList.append(')')
> +                Inherited = True
> +            if Inherited:
> +                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.Module.BaseName, DepexList))
> +            if 'BEFORE' in DepexList or 'AFTER' in DepexList:
> +                break
> +        if len(DepexList) > 0:
> +            EdkLogger.verbose('')
> +        return {self.ModuleType:DepexList}
> +
> +    ## Merge dependency expression
> +    #
> +    #   @retval     list    The token list of the dependency expression after parsing
> +    #
> +    @cached_property
> +    def DepexExpressionDict(self):
> +        if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSION_FILE in self.FileTypes:
> +            return {}
> +
> +        DepexExpressionString = ''
> +        #
> +        # Append depex from dependent libraries, if not "BEFORE", "AFTER" expression
> +        #
> +        for M in [self.Module] + self.DependentLibraryList:
> +            Inherited = False
> +            for D in M.DepexExpression[self.Arch, self.ModuleType]:
> +                if DepexExpressionString != '':
> +                    DepexExpressionString += ' AND '
> +                DepexExpressionString += '('
> +                DepexExpressionString += D
> +                DepexExpressionString = DepexExpressionString.rstrip('END').strip()
> +                DepexExpressionString += ')'
> +                Inherited = True
> +            if Inherited:
> +                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexExpressionString))
> +            if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexExpressionString:
> +                break
> +        if len(DepexExpressionString) > 0:
> +            EdkLogger.verbose('')
> +
> +        return {self.ModuleType:DepexExpressionString}
> +
> +    # Get the TianoCore user extension; it contains the dependent libraries.
> +    # @retval: a list containing the TianoCore user extensions.
> +    #
> +    def _GetTianoCoreUserExtensionList(self):
> +        TianoCoreUserExtentionList = []
> +        for M in [self.Module] + self.DependentLibraryList:
> +            Filename = M.MetaFile.Path
> +            InfObj = InfSectionParser.InfSectionParser(Filename)
> +            TianoCoreUserExtenList = InfObj.GetUserExtensionTianoCore()
> +            for TianoCoreUserExtent in TianoCoreUserExtenList:
> +                for Section in TianoCoreUserExtent:
> +                    ItemList = Section.split(TAB_SPLIT)
> +                    Arch = self.Arch
> +                    if len(ItemList) == 4:
> +                        Arch = ItemList[3]
> +                    if Arch.upper() == TAB_ARCH_COMMON or Arch.upper() == self.Arch.upper():
> +                        TianoCoreList = []
> +                        TianoCoreList.extend([TAB_SECTION_START + Section + TAB_SECTION_END])
> +                        TianoCoreList.extend(TianoCoreUserExtent[Section][:])
> +                        TianoCoreList.append('\n')
> +                        TianoCoreUserExtentionList.append(TianoCoreList)
> +
> +        return TianoCoreUserExtentionList
> +
> +    ## Return the list of specification version required for the module
> +    #
> +    #   @retval     list    The list of specification defined in module file
> +    #
> +    @cached_property
> +    def Specification(self):
> +        return self.Module.Specification
> +
> +    ## Tool option for the module build
> +    #
> +    #   @param PlatformInfo   The object of PlatformBuildInfo
> +    #   @retval dict          The dict containing valid options
> +    #
> +    @cached_property
> +    def BuildOption(self):
> +        RetVal, self.BuildRuleOrder = self.PlatformInfo.ApplyBuildOption(self.Module)
> +        if self.BuildRuleOrder:
> +            self.BuildRuleOrder = ['.%s' % Ext for Ext in self.BuildRuleOrder.split()]
> +        return RetVal
> +
> +    ## Get include path list from tool option for the module build
> +    #
> +    #   @retval     list            The include path list
> +    #
> +    @cached_property
> +    def BuildOptionIncPathList(self):
> +        #
> +        # Regular expression for finding Include Directories; the difference between MSFT and INTEL/GCC/RVCT
> +        # is that the former uses /I while the latter uses -I to specify include directories
> +        #
> +        if self.PlatformInfo.ToolChainFamily in (TAB_COMPILER_MSFT):
> +            BuildOptIncludeRegEx = gBuildOptIncludePatternMsft
> +        elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'RVCT'):
> +            BuildOptIncludeRegEx = gBuildOptIncludePatternOther
> +        else:
> +            #
> +            # New ToolChainFamily; don't know whether there is an option to specify include directories
> +            #
> +            return []
> +
> +        RetVal = []
> +        for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
> +            try:
> +                FlagOption = self.BuildOption[Tool]['FLAGS']
> +            except KeyError:
> +                FlagOption = ''
> +
> +            if self.ToolChainFamily != 'RVCT':
> +                IncPathList = [NormPath(Path, self.Macros) for Path in BuildOptIncludeRegEx.findall(FlagOption)]
> +            else:
> +                #
> +                # RVCT may specify a list of directories separated by commas
> +                #
> +                IncPathList = []
> +                for Path in BuildOptIncludeRegEx.findall(FlagOption):
> +                    PathList = GetSplitList(Path, TAB_COMMA_SPLIT)
> +                    IncPathList.extend(NormPath(PathEntry, self.Macros) for PathEntry in PathList)
> +
> +            #
> +            # EDK II modules must not reference header files outside of the packages they depend on or
> +            # within the module's directory tree. Report error if violation.
> +            #
> +            if GlobalData.gDisableIncludePathCheck == False:
> +                for Path in IncPathList:
> +                    if (Path not in self.IncludePathList) and (CommonPath([Path, self.MetaFile.Dir]) != self.MetaFile.Dir):
> +                        ErrMsg = "The include directory for the EDK II module in this line is invalid %s specified in %s FLAGS '%s'" % (Path, Tool, FlagOption)
> +                        EdkLogger.error("build",
> +                                        PARAMETER_INVALID,
> +                                        ExtraData=ErrMsg,
> +                                        File=str(self.MetaFile))
> +            RetVal += IncPathList
> +        return RetVal
> +
> +    ## Return a list of files which can be built from source
> +    #
> +    #   What kind of files can be built is determined by build rules in
> +    #   $(CONF_DIRECTORY)/build_rule.txt and toolchain family.
> +    #
> +    @cached_property
> +    def SourceFileList(self):
> +        RetVal = []
> +        ToolChainTagSet = {"", TAB_STAR, self.ToolChain}
> +        ToolChainFamilySet = {"", TAB_STAR, self.ToolChainFamily, self.BuildRuleFamily}
> +        for F in self.Module.Sources:
> +            # match tool chain
> +            if F.TagName not in ToolChainTagSet:
> +                EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s] for processing file [%s] is found, "
> +                                "but [%s] is currently used" % (F.TagName, str(F), self.ToolChain))
> +                continue
> +            # match tool chain family or build rule family
> +            if F.ToolChainFamily not in ToolChainFamilySet:
> +                EdkLogger.debug(
> +                        EdkLogger.DEBUG_0,
> +                        "The file [%s] must be built by tools of [%s], " \
> +                        "but current toolchain family is [%s], buildrule family is [%s]" \
> +                            % (str(F), F.ToolChainFamily, self.ToolChainFamily, self.BuildRuleFamily))
> +                continue
> +
> +            # add the file path into search path list for file including
> +            if F.Dir not in self.IncludePathList:
> +                self.IncludePathList.insert(0, F.Dir)
> +            RetVal.append(F)
> +
> +        self._MatchBuildRuleOrder(RetVal)
> +
> +        for F in RetVal:
> +            self._ApplyBuildRule(F, TAB_UNKNOWN_FILE)
> +        return RetVal
> +
> +    def _MatchBuildRuleOrder(self, FileList):
> +        Order_Dict = {}
> +        self.BuildOption
> +        for SingleFile in FileList:
> +            if self.BuildRuleOrder and SingleFile.Ext in self.BuildRuleOrder and SingleFile.Ext in self.BuildRules:
> +                key = SingleFile.Path.rsplit(SingleFile.Ext,1)[0]
> +                if key in Order_Dict:
> +                    Order_Dict[key].append(SingleFile.Ext)
> +                else:
> +                    Order_Dict[key] = [SingleFile.Ext]
> +
> +        RemoveList = []
> +        for F in Order_Dict:
> +            if len(Order_Dict[F]) > 1:
> +                Order_Dict[F].sort(key=lambda i: self.BuildRuleOrder.index(i))
> +                for Ext in Order_Dict[F][1:]:
> +                    RemoveList.append(F + Ext)
> +
> +        for item in RemoveList:
> +            FileList.remove(item)
> +
> +        return FileList
> +
> +    ## Return the list of unicode files
> +    @cached_property
> +    def UnicodeFileList(self):
> +        return self.FileTypes.get(TAB_UNICODE_FILE,[])
> +
> +    ## Return the list of vfr files
> +    @cached_property
> +    def VfrFileList(self):
> +        return self.FileTypes.get(TAB_VFR_FILE, [])
> +
> +    ## Return the list of Image Definition files
> +    @cached_property
> +    def IdfFileList(self):
> +        return self.FileTypes.get(TAB_IMAGE_FILE,[])
> +
> +    ## Return a list of files which can be built from binary
> +    #
> +    #   "Build" binary files are just to copy them to build directory.
> +    #
> +    #   @retval     list            The list of files which can be built later
> +    #
> +    @cached_property
> +    def BinaryFileList(self):
> +        RetVal = []
> +        for F in self.Module.Binaries:
> +            if F.Target not in [TAB_ARCH_COMMON, TAB_STAR] and F.Target != self.BuildTarget:
> +                continue
> +            RetVal.append(F)
> +            self._ApplyBuildRule(F, F.Type, BinaryFileList=RetVal)
> +        return RetVal
> +
> +    @cached_property
> +    def BuildRules(self):
> +        RetVal = {}
> +        BuildRuleDatabase = self.PlatformInfo.BuildRule
> +        for Type in BuildRuleDatabase.FileTypeList:
> +            # first try getting build rule by BuildRuleFamily
> +            RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.BuildRuleFamily]
> +            if not RuleObject:
> +                # build type is always module type, but ...
> + if self.ModuleType !=3D self.BuildType: > + RuleObject =3D BuildRuleDatabase[Type, self.ModuleT= ype, self.Arch, self.BuildRuleFamily] > + #second try getting build rule by ToolChainFamily > + if not RuleObject: > + RuleObject =3D BuildRuleDatabase[Type, self.BuildType, = self.Arch, self.ToolChainFamily] > + if not RuleObject: > + # build type is always module type, but ... > + if self.ModuleType !=3D self.BuildType: > + RuleObject =3D BuildRuleDatabase[Type, self.Mod= uleType, self.Arch, self.ToolChainFamily] > + if not RuleObject: > + continue > + RuleObject =3D RuleObject.Instantiate(self.Macros) > + RetVal[Type] =3D RuleObject > + for Ext in RuleObject.SourceFileExtList: > + RetVal[Ext] =3D RuleObject > + return RetVal > + > + def _ApplyBuildRule(self, File, FileType, BinaryFileList=3DNone): > + if self._BuildTargets is None: > + self._IntroBuildTargetList =3D set() > + self._FinalBuildTargetList =3D set() > + self._BuildTargets =3D defaultdict(set) > + self._FileTypes =3D defaultdict(set) > + > + if not BinaryFileList: > + BinaryFileList =3D self.BinaryFileList > + > + SubDirectory =3D os.path.join(self.OutputDir, File.SubDir) > + if not os.path.exists(SubDirectory): > + CreateDirectory(SubDirectory) > + LastTarget =3D None > + RuleChain =3D set() > + SourceList =3D [File] > + Index =3D 0 > + # > + # Make sure to get build rule order value > + # > + self.BuildOption > + > + while Index < len(SourceList): > + Source =3D SourceList[Index] > + Index =3D Index + 1 > + > + if Source !=3D File: > + CreateDirectory(Source.Dir) > + > + if File.IsBinary and File =3D=3D Source and File in BinaryF= ileList: > + # Skip all files that are not binary libraries > + if not self.IsLibrary: > + continue > + RuleObject =3D self.BuildRules[TAB_DEFAULT_BINARY_FILE] > + elif FileType in self.BuildRules: > + RuleObject =3D self.BuildRules[FileType] > + elif Source.Ext in self.BuildRules: > + RuleObject =3D self.BuildRules[Source.Ext] > + else: > + # stop at no more rules > + if 
LastTarget: > + self._FinalBuildTargetList.add(LastTarget) > + break > + > + FileType =3D RuleObject.SourceFileType > + self._FileTypes[FileType].add(Source) > + > + # stop at STATIC_LIBRARY for library > + if self.IsLibrary and FileType =3D=3D TAB_STATIC_LIBRARY: > + if LastTarget: > + self._FinalBuildTargetList.add(LastTarget) > + break > + > + Target =3D RuleObject.Apply(Source, self.BuildRuleOrder) > + if not Target: > + if LastTarget: > + self._FinalBuildTargetList.add(LastTarget) > + break > + elif not Target.Outputs: > + # Only do build for target with outputs > + self._FinalBuildTargetList.add(Target) > + > + self._BuildTargets[FileType].add(Target) > + > + if not Source.IsBinary and Source =3D=3D File: > + self._IntroBuildTargetList.add(Target) > + > + # to avoid cyclic rule > + if FileType in RuleChain: > + break > + > + RuleChain.add(FileType) > + SourceList.extend(Target.Outputs) > + LastTarget =3D Target > + FileType =3D TAB_UNKNOWN_FILE > + > + @cached_property > + def Targets(self): > + if self._BuildTargets is None: > + self._IntroBuildTargetList =3D set() > + self._FinalBuildTargetList =3D set() > + self._BuildTargets =3D defaultdict(set) > + self._FileTypes =3D defaultdict(set) > + > + #TRICK: call SourceFileList property to apply build rule for so= urce files > + self.SourceFileList > + > + #TRICK: call _GetBinaryFileList to apply build rule for binary = files > + self.BinaryFileList > + > + return self._BuildTargets > + > + @cached_property > + def IntroTargetList(self): > + self.Targets > + return self._IntroBuildTargetList > + > + @cached_property > + def CodaTargetList(self): > + self.Targets > + return self._FinalBuildTargetList > + > + @cached_property > + def FileTypes(self): > + self.Targets > + return self._FileTypes > + > + ## Get the list of package object the module depends on > + # > + # @retval list The package object list > + # > + @cached_property > + def DependentPackageList(self): > + return self.Module.Packages > + > + ## 
Return the list of auto-generated code file > + # > + # @retval list The list of auto-generated file > + # > + @cached_property > + def AutoGenFileList(self): > + AutoGenUniIdf =3D self.BuildType !=3D 'UEFI_HII' > + UniStringBinBuffer =3D BytesIO() > + IdfGenBinBuffer =3D BytesIO() > + RetVal =3D {} > + AutoGenC =3D TemplateString() > + AutoGenH =3D TemplateString() > + StringH =3D TemplateString() > + StringIdf =3D TemplateString() > + GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUniId= f, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer) > + # > + # AutoGen.c is generated if there are library classes in inf, o= r there are object files > + # > + if str(AutoGenC) !=3D "" and (len(self.Module.LibraryClasses) >= 0 > + or TAB_OBJECT_FILE in self.FileType= s): > + AutoFile =3D PathClass(gAutoGenCodeFileName, self.DebugDir) > + RetVal[AutoFile] =3D str(AutoGenC) > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > + if str(AutoGenH) !=3D "": > + AutoFile =3D PathClass(gAutoGenHeaderFileName, self.DebugDi= r) > + RetVal[AutoFile] =3D str(AutoGenH) > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > + if str(StringH) !=3D "": > + AutoFile =3D PathClass(gAutoGenStringFileName % {"module_na= me":self.Name}, self.DebugDir) > + RetVal[AutoFile] =3D str(StringH) > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > + if UniStringBinBuffer is not None and UniStringBinBuffer.getval= ue() !=3D b"": > + AutoFile =3D PathClass(gAutoGenStringFormFileName % {"modul= e_name":self.Name}, self.OutputDir) > + RetVal[AutoFile] =3D UniStringBinBuffer.getvalue() > + AutoFile.IsBinary =3D True > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > + if UniStringBinBuffer is not None: > + UniStringBinBuffer.close() > + if str(StringIdf) !=3D "": > + AutoFile =3D PathClass(gAutoGenImageDefFileName % {"module_= name":self.Name}, self.DebugDir) > + RetVal[AutoFile] =3D str(StringIdf) > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > + if IdfGenBinBuffer is 
not None and IdfGenBinBuffer.getvalue() != = =3D b"": > + AutoFile =3D PathClass(gAutoGenIdfFileName % {"module_name"= :self.Name}, self.OutputDir) > + RetVal[AutoFile] =3D IdfGenBinBuffer.getvalue() > + AutoFile.IsBinary =3D True > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > + if IdfGenBinBuffer is not None: > + IdfGenBinBuffer.close() > + return RetVal > + > + ## Return the list of library modules explicitly or implicitly used= by this module > + @cached_property > + def DependentLibraryList(self): > + # only merge library classes and PCD for non-library module > + if self.IsLibrary: > + return [] > + return self.PlatformInfo.ApplyLibraryInstance(self.Module) > + > + ## Get the list of PCDs from current module > + # > + # @retval list The list of PCD > + # > + @cached_property > + def ModulePcdList(self): > + # apply PCD settings from platform > + RetVal =3D self.PlatformInfo.ApplyPcdSetting(self.Module, self.= Module.Pcds) > + > + return RetVal > + @cached_property > + def _PcdComments(self): > + ReVal =3D OrderedListDict() > + ExtendCopyDictionaryLists(ReVal, self.Module.PcdComments) > + if not self.IsLibrary: > + for Library in self.DependentLibraryList: > + ExtendCopyDictionaryLists(ReVal, Library.PcdComments) > + return ReVal > + > + ## Get the list of PCDs from dependent libraries > + # > + # @retval list The list of PCD > + # > + @cached_property > + def LibraryPcdList(self): > + if self.IsLibrary: > + return [] > + RetVal =3D [] > + Pcds =3D set() > + # get PCDs from dependent libraries > + for Library in self.DependentLibraryList: > + PcdsInLibrary =3D OrderedDict() > + for Key in Library.Pcds: > + # skip duplicated PCDs > + if Key in self.Module.Pcds or Key in Pcds: > + continue > + Pcds.add(Key) > + PcdsInLibrary[Key] =3D copy.copy(Library.Pcds[Key]) > + RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self.Module= , PcdsInLibrary, Library=3DLibrary)) > + return RetVal > + > + ## Get the GUID value mapping > + # > + # @retval dict The mapping 
between GUID cname and its valu= e > + # > + @cached_property > + def GuidList(self): > + RetVal =3D self.Module.Guids > + for Library in self.DependentLibraryList: > + RetVal.update(Library.Guids) > + ExtendCopyDictionaryLists(self._GuidComments, Library.GuidC= omments) > + ExtendCopyDictionaryLists(self._GuidComments, self.Module.GuidC= omments) > + return RetVal > + > + @cached_property > + def GetGuidsUsedByPcd(self): > + RetVal =3D OrderedDict(self.Module.GetGuidsUsedByPcd()) > + for Library in self.DependentLibraryList: > + RetVal.update(Library.GetGuidsUsedByPcd()) > + return RetVal > + ## Get the protocol value mapping > + # > + # @retval dict The mapping between protocol cname and its = value > + # > + @cached_property > + def ProtocolList(self): > + RetVal =3D OrderedDict(self.Module.Protocols) > + for Library in self.DependentLibraryList: > + RetVal.update(Library.Protocols) > + ExtendCopyDictionaryLists(self._ProtocolComments, Library.P= rotocolComments) > + ExtendCopyDictionaryLists(self._ProtocolComments, self.Module.P= rotocolComments) > + return RetVal > + > + ## Get the PPI value mapping > + # > + # @retval dict The mapping between PPI cname and its value > + # > + @cached_property > + def PpiList(self): > + RetVal =3D OrderedDict(self.Module.Ppis) > + for Library in self.DependentLibraryList: > + RetVal.update(Library.Ppis) > + ExtendCopyDictionaryLists(self._PpiComments, Library.PpiCom= ments) > + ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiCom= ments) > + return RetVal > + > + ## Get the list of include search path > + # > + # @retval list The list path > + # > + @cached_property > + def IncludePathList(self): > + RetVal =3D [] > + RetVal.append(self.MetaFile.Dir) > + RetVal.append(self.DebugDir) > + > + for Package in self.Module.Packages: > + PackageDir =3D mws.join(self.WorkspaceDir, Package.MetaFile= .Dir) > + if PackageDir not in RetVal: > + RetVal.append(PackageDir) > + IncludesList =3D Package.Includes > + if 
Package._PrivateIncludes: > + if not self.MetaFile.OriginalPath.Path.startswith(Packa= geDir): > + IncludesList =3D list(set(Package.Includes).differe= nce(set(Package._PrivateIncludes))) > + for Inc in IncludesList: > + if Inc not in RetVal: > + RetVal.append(str(Inc)) > + return RetVal > + > + @cached_property > + def IncludePathLength(self): > + return sum(len(inc)+1 for inc in self.IncludePathList) > + > + ## Get HII EX PCDs which maybe used by VFR > + # > + # efivarstore used by VFR may relate with HII EX PCDs > + # Get the variable name and GUID from efivarstore and HII EX PCD > + # List the HII EX PCDs in As Built INF if both name and GUID match= . > + # > + # @retval list HII EX PCDs > + # > + def _GetPcdsMaybeUsedByVfr(self): > + if not self.SourceFileList: > + return [] > + > + NameGuids =3D set() > + for SrcFile in self.SourceFileList: > + if SrcFile.Ext.lower() !=3D '.vfr': > + continue > + Vfri =3D os.path.join(self.OutputDir, SrcFile.BaseName + '.= i') > + if not os.path.exists(Vfri): > + continue > + VfriFile =3D open(Vfri, 'r') > + Content =3D VfriFile.read() > + VfriFile.close() > + Pos =3D Content.find('efivarstore') > + while Pos !=3D -1: > + # > + # Make sure 'efivarstore' is the start of efivarstore s= tatement > + # In case of the value of 'name' (name =3D efivarstore)= is equal to 'efivarstore' > + # > + Index =3D Pos - 1 > + while Index >=3D 0 and Content[Index] in ' \t\r\n': > + Index -=3D 1 > + if Index >=3D 0 and Content[Index] !=3D ';': > + Pos =3D Content.find('efivarstore', Pos + len('efiv= arstore')) > + continue > + # > + # 'efivarstore' must be followed by name and guid > + # > + Name =3D gEfiVarStoreNamePattern.search(Content, Pos) > + if not Name: > + break > + Guid =3D gEfiVarStoreGuidPattern.search(Content, Pos) > + if not Guid: > + break > + NameArray =3D _ConvertStringToByteArray('L"' + Name.gro= up(1) + '"') > + NameGuids.add((NameArray, GuidStructureStringToGuidStri= ng(Guid.group(1)))) > + Pos =3D 
Content.find('efivarstore', Name.end()) > + if not NameGuids: > + return [] > + HiiExPcds =3D [] > + for Pcd in self.PlatformInfo.Pcds.values(): > + if Pcd.Type !=3D TAB_PCDS_DYNAMIC_EX_HII: > + continue > + for SkuInfo in Pcd.SkuInfoList.values(): > + Value =3D GuidValue(SkuInfo.VariableGuid, self.Platform= Info.PackageList, self.MetaFile.Path) > + if not Value: > + continue > + Name =3D _ConvertStringToByteArray(SkuInfo.VariableName= ) > + Guid =3D GuidStructureStringToGuidString(Value) > + if (Name, Guid) in NameGuids and Pcd not in HiiExPcds: > + HiiExPcds.append(Pcd) > + break > + > + return HiiExPcds > + > + def _GenOffsetBin(self): > + VfrUniBaseName =3D {} > + for SourceFile in self.Module.Sources: > + if SourceFile.Type.upper() =3D=3D ".VFR" : > + # > + # search the .map file to find the offset of vfr binary= in the PE32+/TE file. > + # > + VfrUniBaseName[SourceFile.BaseName] =3D (SourceFile.Bas= eName + "Bin") > + elif SourceFile.Type.upper() =3D=3D ".UNI" : > + # > + # search the .map file to find the offset of Uni string= s binary in the PE32+/TE file. > + # > + VfrUniBaseName["UniOffsetName"] =3D (self.Name + "Strin= gs") > + > + if not VfrUniBaseName: > + return None > + MapFileName =3D os.path.join(self.OutputDir, self.Name + ".map"= ) > + EfiFileName =3D os.path.join(self.OutputDir, self.Name + ".efi"= ) > + VfrUniOffsetList =3D GetVariableOffset(MapFileName, EfiFileName= , list(VfrUniBaseName.values())) > + if not VfrUniOffsetList: > + return None > + > + OutputName =3D '%sOffset.bin' % self.Name > + UniVfrOffsetFileName =3D os.path.join( self.OutputDir, Outp= utName) > + > + try: > + fInputfile =3D open(UniVfrOffsetFileName, "wb+", 0) > + except: > + EdkLogger.error("build", FILE_OPEN_FAILURE, "File open fail= ed for %s" % UniVfrOffsetFileName, None) > + > + # Use a instance of BytesIO to cache data > + fStringIO =3D BytesIO() > + > + for Item in VfrUniOffsetList: > + if (Item[0].find("Strings") !=3D -1): > + # > + # UNI offset in image. 
> +                # GUID + Offset
> +                # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
> +                #
> +                UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
> +                fStringIO.write(UniGuid)
> +                UniValue = pack ('Q', int (Item[1], 16))
> +                fStringIO.write (UniValue)
> +            else:
> +                #
> +                # VFR binary offset in image.
> +                # GUID + Offset
> +                # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
> +                #
> +                VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
> +                fStringIO.write(VfrGuid)
> +                VfrValue = pack ('Q', int (Item[1], 16))
> +                fStringIO.write (VfrValue)
> +        #
> +        # write data into file.
> +        #
> +        try:
> +            fInputfile.write (fStringIO.getvalue())
> +        except:
> +            EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
> +                            "file is locked or used by other applications." % UniVfrOffsetFileName, None)
> +
> +        fStringIO.close ()
> +        fInputfile.close ()
> +        return OutputName
> +    @cached_property
> +    def OutputFile(self):
> +        retVal = set()
> +        OutputDir = self.OutputDir.replace('\\', '/').strip('/')
> +        DebugDir = self.DebugDir.replace('\\', '/').strip('/')
> +        for Item in self.CodaTargetList:
> +            File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
> +            retVal.add(File)
> +        if self.DepexGenerated:
> +            retVal.add(self.Name + '.depex')
> +
> +        Bin = self._GenOffsetBin()
> +        if Bin:
> +            retVal.add(Bin)
> +
> +        for Root, Dirs, Files in os.walk(OutputDir):
> +            for File in Files:
> +                if File.lower().endswith('.pdb'):
> +                    retVal.add(File)
> +
> +        return retVal
> +
> +    ## Create the AsBuilt INF file for the module
> +    #
> +    def CreateAsBuiltInf(self):
> +
> +        if self.IsAsBuiltInfCreated:
> +            return
> +
> +        # Skip INF file generation for libraries
> +        if self.IsLibrary:
> +            return
> +
> +        # Skip the following code for modules with no source files
> +        if not self.SourceFileList:
> +            return
> +
> +        # Skip the following code for modules without any binary files
> +        if self.BinaryFileList:
> +            return
> +
> +        ### TODO: How to handle mixed source and binary modules
> +
> +        # Find all DynamicEx and PatchableInModule PCDs used by this module and dependent libraries
> +        # Also find all packages that the DynamicEx PCDs depend on
> +        Pcds = []
> +        PatchablePcds = []
> +        Packages = []
> +        PcdCheckList = []
> +        PcdTokenSpaceList = []
> +        for Pcd in self.ModulePcdList + self.LibraryPcdList:
> +            if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
> +                PatchablePcds.append(Pcd)
> +                PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_PATCHABLE_IN_MODULE))
> +            elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
> +                if Pcd not in Pcds:
> +                    Pcds.append(Pcd)
> +                    PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX))
> +                    PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC))
> +                    PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName)
> +        GuidList = OrderedDict(self.GuidList)
> +        for TokenSpace in self.GetGuidsUsedByPcd:
> +            # If the token space is not referred to by a patch PCD or Ex PCD, remove the GUID from the GUID list.
> +            # The GUIDs in the GUIDs section should really be the GUIDs in the source INF or referred to by Ex and patch PCDs
> +            if TokenSpace not in PcdTokenSpaceList and TokenSpace in GuidList:
> +                GuidList.pop(TokenSpace)
> +        CheckList = (GuidList, self.PpiList, self.ProtocolList, PcdCheckList)
> +        for Package in self.DerivedPackageList:
> +            if Package in Packages:
> +                continue
> +            BeChecked = (Package.Guids, Package.Ppis, Package.Protocols, Package.Pcds)
> +            Found = False
> +            for Index in range(len(BeChecked)):
> +                for Item in CheckList[Index]:
> +                    if Item in BeChecked[Index]:
> +                        Packages.append(Package)
> +                        Found = True
> +                        break
> +                if Found:
> +                    break
> +
> +        VfrPcds = self._GetPcdsMaybeUsedByVfr()
> +        for Pkg in self.PlatformInfo.PackageList:
> +            if Pkg in Packages:
> +                continue
> +            for VfrPcd in VfrPcds:
> +                if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX) in Pkg.Pcds or
> +                    (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC) in Pkg.Pcds):
> +                    Packages.append(Pkg)
> +                    break
> +
> +        ModuleType = SUP_MODULE_DXE_DRIVER if self.ModuleType == SUP_MODULE_UEFI_DRIVER and self.DepexGenerated else self.ModuleType
> +        DriverType = self.PcdIsDriver if self.PcdIsDriver else ''
> +        Guid = self.Guid
> +        MDefs = self.Module.Defines
> +
> +        AsBuiltInfDict = {
> +            'module_name'                       : self.Name,
> +            'module_guid'                       : Guid,
> +            'module_module_type'                : ModuleType,
> +            'module_version_string'             : [MDefs['VERSION_STRING']] if 'VERSION_STRING' in MDefs else [],
> +            'pcd_is_driver_string'              : [],
> +            'module_uefi_specification_version' : [],
> +            'module_pi_specification_version'   : [],
> +            'module_entry_point'                : self.Module.ModuleEntryPointList,
> +            'module_unload_image'               : self.Module.ModuleUnloadImageList,
> +            'module_constructor'                : self.Module.ConstructorList,
> +            'module_destructor'                 : self.Module.DestructorList,
> +            'module_shadow'                     : [MDefs['SHADOW']] if 'SHADOW' in MDefs else [],
> +            'module_pci_vendor_id'              : [MDefs['PCI_VENDOR_ID']] if 'PCI_VENDOR_ID' in MDefs else [],
> +            'module_pci_device_id'              : [MDefs['PCI_DEVICE_ID']] if 'PCI_DEVICE_ID' in MDefs else [],
> +            'module_pci_class_code'             : [MDefs['PCI_CLASS_CODE']] if 'PCI_CLASS_CODE' in MDefs else [],
> +            'module_pci_revision'               : [MDefs['PCI_REVISION']] if 'PCI_REVISION' in MDefs else [],
> +            'module_build_number'               : [MDefs['BUILD_NUMBER']] if 'BUILD_NUMBER' in MDefs else [],
> +            'module_spec'                       : [MDefs['SPEC']] if 'SPEC' in MDefs else [],
> +            'module_uefi_hii_resource_section'  : [MDefs['UEFI_HII_RESOURCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [],
> +            'module_uni_file'                   : [MDefs['MODULE_UNI_FILE']] if 'MODULE_UNI_FILE' in MDefs else [],
> +            'module_arch'                       : self.Arch,
> +            'package_item'                      : [Package.MetaFile.File.replace('\\', '/') for Package in Packages],
> +            'binary_item'                       : [],
> +            'patchablepcd_item'                 : [],
> +            'pcd_item'                          : [],
> +            'protocol_item'                     : [],
> +            'ppi_item'                          : [],
> +            'guid_item'                         : [],
> +            'flags_item'                        : [],
> +            'libraryclasses_item'               : []
> +        }
> +
> +        if 'MODULE_UNI_FILE' in MDefs:
> +            UNIFile = os.path.join(self.MetaFile.Dir, MDefs['MODULE_UNI_FILE'])
> +            if os.path.isfile(UNIFile):
> +                shutil.copy2(UNIFile, self.OutputDir)
> +
> +        if self.AutoGenVersion > int(gInfSpecVersion, 0):
> +            AsBuiltInfDict['module_inf_version'] = '0x%08x' % self.AutoGenVersion
> +        else:
> +            AsBuiltInfDict['module_inf_version'] = gInfSpecVersion
> +
> +        if DriverType:
> +            AsBuiltInfDict['pcd_is_driver_string'].append(DriverType)
> +
> +        if 'UEFI_SPECIFICATION_VERSION' in self.Specification:
> +            AsBuiltInfDict['module_uefi_specification_version'].append(self.Specification['UEFI_SPECIFICATION_VERSION'])
> +        if 'PI_SPECIFICATION_VERSION' in self.Specification:
> +            AsBuiltInfDict['module_pi_specification_version'].append(self.Specification['PI_SPECIFICATION_VERSION'])
> +
> +        OutputDir = self.OutputDir.replace('\\', '/').strip('/')
> +        DebugDir = self.DebugDir.replace('\\', '/').strip('/')
> +        for Item in self.CodaTargetList:
> +            File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
> +            if os.path.isabs(File):
> +                File = File.replace('\\', '/').strip('/').replace(OutputDir, '').strip('/')
> +            if Item.Target.Ext.lower() == '.aml':
> +                AsBuiltInfDict['binary_item'].append('ASL|' + File)
> +            elif Item.Target.Ext.lower() == '.acpi':
> +                AsBuiltInfDict['binary_item'].append('ACPI|' + File)
> +            elif Item.Target.Ext.lower() == '.efi':
> +                AsBuiltInfDict['binary_item'].append('PE32|' + self.Name + '.efi')
> +            else:
> +                AsBuiltInfDict['binary_item'].append('BIN|' + File)
> +        if not self.DepexGenerated:
> +            DepexFile = os.path.join(self.OutputDir, self.Name + '.depex')
> +            if os.path.exists(DepexFile):
> +                self.DepexGenerated = True
> +        if self.DepexGenerated:
> +            if self.ModuleType in [SUP_MODULE_PEIM]:
> +                AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + self.Name + '.depex')
> +            elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
> +                AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + self.Name + '.depex')
> +            elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
> +                AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + self.Name + '.depex')
> +
> +        Bin = self._GenOffsetBin()
> +        if Bin:
> +            AsBuiltInfDict['binary_item'].append('BIN|%s' % Bin)
> +
> +        for Root, Dirs, Files in os.walk(OutputDir):
> +            for File in Files:
> +                if File.lower().endswith('.pdb'):
> +                    AsBuiltInfDict['binary_item'].append('DISPOSABLE|' + File)
> +        HeaderComments = self.Module.HeaderComments
> +        StartPos = 0
> +        for Index in range(len(HeaderComments)):
> +            if HeaderComments[Index].find('@BinaryHeader') != -1:
> +                HeaderComments[Index] = HeaderComments[Index].replace('@BinaryHeader', '@file')
> +                StartPos = Index
> +                break
> +        AsBuiltInfDict['header_comments'] = '\n'.join(HeaderComments[StartPos:]).replace(':#', '://')
> +        AsBuiltInfDict['tail_comments'] = '\n'.join(self.Module.TailComments)
> +
> +        GenList = [
> +            (self.ProtocolList, self._ProtocolComments, 'protocol_item'),
> +            (self.PpiList, self._PpiComments, 'ppi_item'),
> +            (GuidList, self._GuidComments, 'guid_item')
> +        ]
> +        for Item in GenList:
> +            for CName in Item[0]:
> +                Comments = '\n  '.join(Item[1][CName]) if CName in Item[1] else ''
> +                Entry = Comments + '\n  ' + CName if Comments else CName
> +                AsBuiltInfDict[Item[2]].append(Entry)
> +        PatchList = parsePcdInfoFromMapFile(
> +                            os.path.join(self.OutputDir, self.Name + '.map'),
> +                            os.path.join(self.OutputDir, self.Name + '.efi')
> +                        )
> +        if PatchList:
> +            for Pcd in PatchablePcds:
> +                TokenCName = Pcd.TokenCName
> +                for PcdItem in
GlobalData.MixedPcd: > + if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in Glo= balData.MixedPcd[PcdItem]: > + TokenCName =3D PcdItem[0] > + break > + for PatchPcd in PatchList: > + if TokenCName =3D=3D PatchPcd[0]: > + break > + else: > + continue > + PcdValue =3D '' > + if Pcd.DatumType =3D=3D 'BOOLEAN': > + BoolValue =3D Pcd.DefaultValue.upper() > + if BoolValue =3D=3D 'TRUE': > + Pcd.DefaultValue =3D '1' > + elif BoolValue =3D=3D 'FALSE': > + Pcd.DefaultValue =3D '0' > + > + if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES: > + HexFormat =3D '0x%02x' > + if Pcd.DatumType =3D=3D TAB_UINT16: > + HexFormat =3D '0x%04x' > + elif Pcd.DatumType =3D=3D TAB_UINT32: > + HexFormat =3D '0x%08x' > + elif Pcd.DatumType =3D=3D TAB_UINT64: > + HexFormat =3D '0x%016x' > + PcdValue =3D HexFormat % int(Pcd.DefaultValue, 0) > + else: > + if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize =3D= = =3D '': > + EdkLogger.error("build", AUTOGEN_ERROR, > + "Unknown [MaxDatumSize] of PCD = [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName) > + ) > + ArraySize =3D int(Pcd.MaxDatumSize, 0) > + PcdValue =3D Pcd.DefaultValue > + if PcdValue[0] !=3D '{': > + Unicode =3D False > + if PcdValue[0] =3D=3D 'L': > + Unicode =3D True > + PcdValue =3D PcdValue.lstrip('L') > + PcdValue =3D eval(PcdValue) > + NewValue =3D '{' > + for Index in range(0, len(PcdValue)): > + if Unicode: > + CharVal =3D ord(PcdValue[Index]) > + NewValue =3D NewValue + '0x%02x' % (Cha= rVal & 0x00FF) + ', ' \ > + + '0x%02x' % (CharVal >> 8) + '= , ' > + else: > + NewValue =3D NewValue + '0x%02x' % (ord= (PcdValue[Index]) % 0x100) + ', ' > + Padding =3D '0x00, ' > + if Unicode: > + Padding =3D Padding * 2 > + ArraySize =3D ArraySize // 2 > + if ArraySize < (len(PcdValue) + 1): > + if Pcd.MaxSizeUserSet: > + EdkLogger.error("build", AUTOGEN_ERROR, > + "The maximum size of VOID* = type PCD '%s.%s' is less than its actual size occupied." 
% (Pcd.TokenSpaceG= uidCName, TokenCName) > + ) > + else: > + ArraySize =3D len(PcdValue) + 1 > + if ArraySize > len(PcdValue) + 1: > + NewValue =3D NewValue + Padding * (ArraySiz= e - len(PcdValue) - 1) > + PcdValue =3D NewValue + Padding.strip().rstrip(= ',') + '}' > + elif len(PcdValue.split(',')) <=3D ArraySize: > + PcdValue =3D PcdValue.rstrip('}') + ', 0x00' * = (ArraySize - len(PcdValue.split(','))) > + PcdValue +=3D '}' > + else: > + if Pcd.MaxSizeUserSet: > + EdkLogger.error("build", AUTOGEN_ERROR, > + "The maximum size of VOID* type= PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidC= Name, TokenCName) > + ) > + else: > + ArraySize =3D len(PcdValue) + 1 > + PcdItem =3D '%s.%s|%s|0x%X' % \ > + (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, Pat= chPcd[1]) > + PcdComments =3D '' > + if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._P= cdComments: > + PcdComments =3D '\n '.join(self._PcdComments[Pcd.T= okenSpaceGuidCName, Pcd.TokenCName]) > + if PcdComments: > + PcdItem =3D PcdComments + '\n ' + PcdItem > + AsBuiltInfDict['patchablepcd_item'].append(PcdItem) > + > + for Pcd in Pcds + VfrPcds: > + PcdCommentList =3D [] > + HiiInfo =3D '' > + TokenCName =3D Pcd.TokenCName > + for PcdItem in GlobalData.MixedPcd: > + if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalD= ata.MixedPcd[PcdItem]: > + TokenCName =3D PcdItem[0] > + break > + if Pcd.Type =3D=3D TAB_PCDS_DYNAMIC_EX_HII: > + for SkuName in Pcd.SkuInfoList: > + SkuInfo =3D Pcd.SkuInfoList[SkuName] > + HiiInfo =3D '## %s|%s|%s' % (SkuInfo.VariableName, = SkuInfo.VariableGuid, SkuInfo.VariableOffset) > + break > + if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdCo= mments: > + PcdCommentList =3D self._PcdComments[Pcd.TokenSpaceGuid= CName, Pcd.TokenCName][:] > + if HiiInfo: > + UsageIndex =3D -1 > + UsageStr =3D '' > + for Index, Comment in enumerate(PcdCommentList): > + for Usage in UsageList: > + if Comment.find(Usage) !=3D -1: > + UsageStr =3D Usage > + UsageIndex 
=3D Index > + break > + if UsageIndex !=3D -1: > + PcdCommentList[UsageIndex] =3D '## %s %s %s' % (Usa= geStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, '')) > + else: > + PcdCommentList.append('## UNDEFINED ' + HiiInfo) > + PcdComments =3D '\n '.join(PcdCommentList) > + PcdEntry =3D Pcd.TokenSpaceGuidCName + '.' + TokenCName > + if PcdComments: > + PcdEntry =3D PcdComments + '\n ' + PcdEntry > + AsBuiltInfDict['pcd_item'].append(PcdEntry) > + for Item in self.BuildOption: > + if 'FLAGS' in self.BuildOption[Item]: > + AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_FLA= GS =3D %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, self.= Arch, Item, self.BuildOption[Item]['FLAGS'].strip())) > + > + # Generated LibraryClasses section in comments. > + for Library in self.LibraryAutoGenList: > + AsBuiltInfDict['libraryclasses_item'].append(Library.MetaFi= le.File.replace('\\', '/')) > + > + # Generated UserExtensions TianoCore section. > + # All tianocore user extensions are copied. > + UserExtStr =3D '' > + for TianoCore in self._GetTianoCoreUserExtensionList(): > + UserExtStr +=3D '\n'.join(TianoCore) > + ExtensionFile =3D os.path.join(self.MetaFile.Dir, TianoCore= [1]) > + if os.path.isfile(ExtensionFile): > + shutil.copy2(ExtensionFile, self.OutputDir) > + AsBuiltInfDict['userextension_tianocore_item'] =3D UserExtStr > + > + # Generated depex expression section in comments. 
> +        DepexExpression = self._GetDepexExpresionString()
> +        AsBuiltInfDict['depexsection_item'] = DepexExpression if DepexExpression else ''
> +
> +        AsBuiltInf = TemplateString()
> +        AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfDict))
> +
> +        SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
> +
> +        self.IsAsBuiltInfCreated = True
> +
> +    def CopyModuleToCache(self):
> +        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
> +        CreateDirectory (FileDir)
> +        HashFile = path.join(self.BuildDir, self.Name + '.hash')
> +        if os.path.exists(HashFile):
> +            CopyFileOnChange(HashFile, FileDir)
> +        ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
> +        if os.path.exists(ModuleFile):
> +            CopyFileOnChange(ModuleFile, FileDir)
> +        if not self.OutputFile:
> +            Ma = self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
> +            self.OutputFile = Ma.Binaries
> +        for File in self.OutputFile:
> +            File = str(File)
> +            if not os.path.isabs(File):
> +                File = os.path.join(self.OutputDir, File)
> +            if os.path.exists(File):
> +                sub_dir = os.path.relpath(File, self.OutputDir)
> +                destination_file = os.path.join(FileDir, sub_dir)
> +                destination_dir = os.path.dirname(destination_file)
> +                CreateDirectory(destination_dir)
> +                CopyFileOnChange(File, destination_dir)
> +
> +    def AttemptModuleCacheCopy(self):
> +        # If library or Module is binary do not skip by hash
> +        if self.IsBinaryModule:
> +            return False
> +        # .inc is contains binary information so do not skip by hash as well
> +        for f_ext in self.SourceFileList:
> +            if '.inc' in str(f_ext):
> +                return False
> +        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
> +        HashFile = path.join(FileDir, self.Name + '.hash')
> +        if os.path.exists(HashFile):
> +            f = open(HashFile, 'r')
> +            CacheHash = f.read()
> +            f.close()
> +            self.GenModuleHash()
> +            if GlobalData.gModuleHash[self.Arch][self.Name]:
> +                if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
> +                    for root, dir, files in os.walk(FileDir):
> +                        for f in files:
> +                            if self.Name + '.hash' in f:
> +                                CopyFileOnChange(HashFile, self.BuildDir)
> +                            else:
> +                                File = path.join(root, f)
> +                                sub_dir = os.path.relpath(File, FileDir)
> +                                destination_file = os.path.join(self.OutputDir, sub_dir)
> +                                destination_dir = os.path.dirname(destination_file)
> +                                CreateDirectory(destination_dir)
> +                                CopyFileOnChange(File, destination_dir)
> +                    if self.Name == "PcdPeim" or self.Name == "PcdDxe":
> +                        CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
> +                    return True
> +        return False
> +
> +    ## Create makefile for the module and its dependent libraries
> +    #
> +    #   @param      CreateLibraryMakeFile   Flag indicating if or not the makefiles of
> +    #                                       dependent libraries will be created
> +    #
> +    @cached_class_function
> +    def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
> +        # nest this function inside it's only caller.
> +        def CreateTimeStamp():
> +            FileSet = {self.MetaFile.Path}
> +
> +            for SourceFile in self.Module.Sources:
> +                FileSet.add (SourceFile.Path)
> +
> +            for Lib in self.DependentLibraryList:
> +                FileSet.add (Lib.MetaFile.Path)
> +
> +            for f in self.AutoGenDepSet:
> +                FileSet.add (f.Path)
> +
> +            if os.path.exists (self.TimeStampPath):
> +                os.remove (self.TimeStampPath)
> +            with open(self.TimeStampPath, 'w+') as fd:
> +                for f in FileSet:
> +                    fd.write(f)
> +                    fd.write("\n")
> +
> +        # Ignore generating makefile when it is a binary module
> +        if self.IsBinaryModule:
> +            return
> +
> +        self.GenFfsList = GenFfsList
> +
> +        if not self.IsLibrary and CreateLibraryMakeFile:
> +            for LibraryAutoGen in self.LibraryAutoGenList:
> +                LibraryAutoGen.CreateMakeFile()
> +        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
> +        if not GlobalData.gUseHashCache and self.CanSkip():
> +            return
> +
> +        if len(self.CustomMakefile) == 0:
> +            Makefile = GenMake.ModuleMakefile(self)
> +        else:
> +            Makefile = GenMake.CustomMakefile(self)
> +        if Makefile.Generate():
> +            EdkLogger.debug(EdkLogger.DEBUG_9, "Generated makefile for module %s [%s]" %
> +                            (self.Name, self.Arch))
> +        else:
> +            EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation of makefile for module %s [%s]" %
> +                            (self.Name, self.Arch))
> +
> +        CreateTimeStamp()
> +
> +    def CopyBinaryFiles(self):
> +        for File in self.Module.Binaries:
> +            SrcPath = File.Path
> +            DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
> +            CopyLongFilePath(SrcPath, DstPath)
> +    ## Create autogen code for the module and its dependent libraries
> +    #
> +    #   @param      CreateLibraryCodeFile   Flag indicating if or not the code of
> +    #                                       dependent libraries will be created
> +    #
> +    def CreateCodeFile(self, CreateLibraryCodeFile=True):
> +        if self.IsCodeFileCreated:
> +            return
> +
> +        # Need to generate PcdDatabase even PcdDriver is binarymodule
> +        if self.IsBinaryModule and self.PcdIsDriver != '':
> +            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
> +            return
> +        if self.IsBinaryModule:
> +            if self.IsLibrary:
> +                self.CopyBinaryFiles()
> +            return
> +
> +        if not self.IsLibrary and CreateLibraryCodeFile:
> +            for LibraryAutoGen in self.LibraryAutoGenList:
> +                LibraryAutoGen.CreateCodeFile()
> +
> +        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
> +        if not GlobalData.gUseHashCache and self.CanSkip():
> +            return
> +
> +        AutoGenList = []
> +        IgoredAutoGenList = []
> +
> +        for File in self.AutoGenFileList:
> +            if GenC.Generate(File.Path, self.AutoGenFileList[File], File.IsBinary):
> +                AutoGenList.append(str(File))
> +            else:
> +                IgoredAutoGenList.append(str(File))
> +
> +
> +        for ModuleType in self.DepexList:
> +            # Ignore empty [depex] section or [depex] section for SUP_MODULE_USER_DEFINED module
> +            if len(self.DepexList[ModuleType]) == 0 or ModuleType == SUP_MODULE_USER_DEFINED or ModuleType == SUP_MODULE_HOST_APPLICATION:
> +                continue
> +
> +            Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
> +            DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
> +
> +            if len(Dpx.PostfixNotation) != 0:
> +                self.DepexGenerated = True
> +
> +            if Dpx.Generate(path.join(self.OutputDir, DpxFile)):
> +                AutoGenList.append(str(DpxFile))
> +            else:
> +                IgoredAutoGenList.append(str(DpxFile))
> +
> +        if IgoredAutoGenList == []:
> +            EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] files for module %s [%s]" %
> +                            (" ".join(AutoGenList), self.Name, self.Arch))
> +        elif AutoGenList == []:
> +            EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generation of [%s] files for module %s [%s]" %
> +                            (" ".join(IgoredAutoGenList), self.Name, self.Arch))
> +        else:
> +            EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] (skipped %s) files for module %s [%s]" %
> +                            (" ".join(AutoGenList), " ".join(IgoredAutoGenList), self.Name, self.Arch))
> +
> +        self.IsCodeFileCreated = True
> +        return AutoGenList
> +
> +    ## Summarize the ModuleAutoGen objects of all libraries used by this module
> +    @cached_property
> +    def LibraryAutoGenList(self):
> +        RetVal = []
> +        for Library in self.DependentLibraryList:
> +            La = ModuleAutoGen(
> +                        self.Workspace,
> +                        Library.MetaFile,
> +                        self.BuildTarget,
> +                        self.ToolChain,
> +                        self.Arch,
> +                        self.PlatformInfo.MetaFile,
> +                        self.DataPipe
> +                        )
> +            La.IsLibrary = True
> +            if La not in RetVal:
> +                RetVal.append(La)
> +                for Lib in La.CodaTargetList:
> +                    self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE)
> +        return RetVal
> +
> +    def GenModuleHash(self):
> +        # Initialize a dictionary for each arch type
> +        if self.Arch not in GlobalData.gModuleHash:
> +            GlobalData.gModuleHash[self.Arch] = {}
> +
> +        # Early exit if module or library has been hashed and is in memory
> +        if self.Name in GlobalData.gModuleHash[self.Arch]:
> +            return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
> +
> +        # Initialze hash object
> +        m = hashlib.md5()
> +
> +        # Add Platform level hash
> +        m.update(GlobalData.gPlatformHash.encode('utf-8'))
> +
> +        # Add Package level hash
> +        if self.DependentPackageList:
> +            for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
> +                if Pkg.PackageName in GlobalData.gPackageHash:
> +                    m.update(GlobalData.gPackageHash[Pkg.PackageName].encode('utf-8'))
> +
> +        # Add Library hash
> +        if self.LibraryAutoGenList:
> +            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
> +                if Lib.Name not in GlobalData.gModuleHash[self.Arch]:
> +                    Lib.GenModuleHash()
> +                m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].encode('utf-8'))
> +
> +        # Add Module self
> +        f = open(str(self.MetaFile), 'rb')
> +        Content = f.read()
> +        f.close()
> +        m.update(Content)
> +
> +        # Add Module's source files
> +        if self.SourceFileList:
> +            for File in sorted(self.SourceFileList, key=lambda x: str(x)):
> +                f = open(str(File), 'rb')
> +                Content = f.read()
> +                f.close()
> +                m.update(Content)
> +
> +        GlobalData.gModuleHash[self.Arch][self.Name] = m.hexdigest()
> +
> +        return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
> +
> +    ## Decide whether we can skip the ModuleAutoGen process
> +    def CanSkipbyHash(self):
> +        # Hashing feature is off
> +        if not GlobalData.gUseHashCache:
> +            return False
> +
> +        # Initialize a dictionary for each arch type
> +        if self.Arch not in GlobalData.gBuildHashSkipTracking:
> +            GlobalData.gBuildHashSkipTracking[self.Arch] = dict()
> +
> +        # If library or Module is binary do not skip by hash
> +        if self.IsBinaryModule:
> +            return False
> +
> +        # .inc is contains binary information so do not skip by hash as well
> +        for f_ext in self.SourceFileList:
> +            if '.inc' in str(f_ext):
> +                return False
> +
> +        # Use Cache, if exists and if Module has a copy in cache
> +        if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy():
> +            return True
> +
> +        # Early exit for libraries that haven't yet finished building
> +        HashFile = path.join(self.BuildDir, self.Name + ".hash")
> +        if self.IsLibrary and not os.path.exists(HashFile):
> +            return False
> +
> +        # Return a Boolean based on if can skip by hash, either from memory or from IO.
> +        if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch]:
> +            # If hashes are the same, SaveFileOnChange() will return False.
> +            GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = not SaveFileOnChange(HashFile, self.GenModuleHash(), True)
> +            return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
> +        else:
> +            return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
> +
> +    ## Decide whether we can skip the ModuleAutoGen process
> +    #  If any source file is newer than the module than we cannot skip
> +    #
> +    def CanSkip(self):
> +        if self.MakeFileDir in GlobalData.gSikpAutoGenCache:
> +            return True
> +        if not os.path.exists(self.TimeStampPath):
> +            return False
> +        #last creation time of the module
> +        DstTimeStamp = os.stat(self.TimeStampPath)[8]
> +
> +        SrcTimeStamp = self.Workspace._SrcTimeStamp
> +        if SrcTimeStamp > DstTimeStamp:
> +            return False
> +
> +        with open(self.TimeStampPath,'r') as f:
> +            for source in f:
> +                source = source.rstrip('\n')
> +                if not os.path.exists(source):
> +                    return False
> +                if source not in ModuleAutoGen.TimeDict :
> +                    ModuleAutoGen.TimeDict[source] = os.stat(source)[8]
> +                if ModuleAutoGen.TimeDict[source] > DstTimeStamp:
> +                    return False
> +        GlobalData.gSikpAutoGenCache.add(self.MakeFileDir)
> +        return True
> +
> +    @cached_property
> +    def TimeStampPath(self):
> +        return os.path.join(self.MakeFileDir, 'AutoGenTimeStamp')
> diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
> new file mode 100644
> index 000000000000..c7591253debd
> --- /dev/null
> +++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
> @@ -0,0 +1,619 @@
> +## @file
> +# Create makefile for MS nmake and GNU make
> +#
> +# Copyright (c) 2019, Intel Corporation. All rights reserved.
> +# SPDX-License-Identifier: BSD-2-Clause-Patent
> +#
> +from __future__ import absolute_import
> +from Workspace.WorkspaceDatabase import WorkspaceDatabase,BuildDB
> +from Common.caching import cached_property
> +from AutoGen.BuildEngine import BuildRule,AutoGenReqBuildRuleVerNum
> +from AutoGen.AutoGen import CalculatePriorityValue
> +from Common.Misc import CheckPcdDatum,GuidValue
> +from Common.Expression import ValueExpressionEx
> +from Common.DataType import *
> +from CommonDataClass.Exceptions import *
> +from CommonDataClass.CommonClass import SkuInfoClass
> +import Common.EdkLogger as EdkLogger
> +from Common.BuildToolError import OPTION_CONFLICT,FORMAT_INVALID,RESOURCE_NOT_AVAILABLE
> +from Common.MultipleWorkspace import MultipleWorkspace as mws
> +from collections import defaultdict
> +from Common.Misc import PathClass
> +import os
> +
> +
> +#
> +# The priority list while override build option
> +#
> +PrioList = {"0x11111"  : 16,     #  TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
> +            "0x01111"  : 15,     #  ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
> +            "0x10111"  : 14,     #  TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
> +            "0x00111"  : 13,     #  ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
> +            "0x11011"  : 12,     #  TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
> +            "0x01011"  : 11,     #  ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
> +            "0x10011"  : 10,     #  TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
> +            "0x00011"  : 9,      #  ******_*********_****_COMMANDTYPE_ATTRIBUTE
> +            "0x11101"  : 8,      #  TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
> +            "0x01101"  : 7,      #  ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
> +            "0x10101"  : 6,      #  TARGET_*********_ARCH_***********_ATTRIBUTE
> +            "0x00101"  : 5,      #  ******_*********_ARCH_***********_ATTRIBUTE
> +            "0x11001"  : 4,      #  TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
> +            "0x01001"  : 3,      #  ******_TOOLCHAIN_****_***********_ATTRIBUTE
> +            "0x10001"  : 2,      #  TARGET_*********_****_***********_ATTRIBUTE
> +            "0x00001"  : 1}      #  ******_*********_****_***********_ATTRIBUTE (Lowest)
> +## Base class for AutoGen
> +#
> +#   This class just implements the cache mechanism of AutoGen objects.
> +#
> +class AutoGenInfo(object):
> +    # database to maintain the objects in each child class
> +    __ObjectCache = {}    # (BuildTarget, ToolChain, ARCH, platform file): AutoGen object
> +
> +    ## Factory method
> +    #
> +    #   @param  Class           class object of real AutoGen class
> +    #                           (WorkspaceAutoGen, ModuleAutoGen or PlatformAutoGen)
> +    #   @param  Workspace       Workspace directory or WorkspaceAutoGen object
> +    #   @param  MetaFile        The path of meta file
> +    #   @param  Target          Build target
> +    #   @param  Toolchain       Tool chain name
> +    #   @param  Arch            Target arch
> +    #   @param  *args           The specific class related parameters
> +    #   @param  **kwargs        The specific class related dict parameters
> +    #
> +    @classmethod
> +    def GetCache(cls):
> +        return cls.__ObjectCache
> +    def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> +        # check if the object has been created
> +        Key = (Target, Toolchain, Arch, MetaFile)
> +        if Key in cls.__ObjectCache:
> +            # if it exists, just return it directly
> +            return cls.__ObjectCache[Key]
> +        # it didnt exist. create it, cache it, then return it
> +        RetVal = cls.__ObjectCache[Key] = super(AutoGenInfo, cls).__new__(cls)
> +        return RetVal
> +
> +
> +    ## hash() operator
> +    #
> +    #  The file path of platform file will be used to represent hash value of this object
> +    #
> +    #   @retval int     Hash value of the file path of platform file
> +    #
> +    def __hash__(self):
> +        return hash(self.MetaFile)
> +
> +    ## str() operator
> +    #
> +    #  The file path of platform file will be used to represent this object
> +    #
> +    #   @retval string  String of platform file path
> +    #
> +    def __str__(self):
> +        return str(self.MetaFile)
> +
> +    ## "==" operator
> +    def __eq__(self, Other):
> +        return Other and self.MetaFile == Other
> +
> +    ## Expand * in build option key
> +    #
> +    #   @param  Options     Options to be expanded
> +    #   @param  ToolDef     Use specified ToolDef instead of full version.
> +    #                       This is needed during initialization to prevent
> +    #                       infinite recursion betweeh BuildOptions,
> +    #                       ToolDefinition, and this function.
> +    #
> +    #   @retval options     Options expanded
> +    #
> +    def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
> +        if not ToolDef:
> +            ToolDef = self.ToolDefinition
> +        BuildOptions = {}
> +        FamilyMatch  = False
> +        FamilyIsNull = True
> +
> +        OverrideList = {}
> +        #
> +        # Construct a list contain the build options which need override.
> +        #
> +        for Key in Options:
> +            #
> +            # Key[0] -- tool family
> +            # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
> +            #
> +            if (Key[0] == self.BuildRuleFamily and
> +                (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
> +                Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
> +                if (Target == self.BuildTarget or Target == TAB_STAR) and\
> +                    (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
> +                    (Arch == self.Arch or Arch == TAB_STAR) and\
> +                    Options[Key].startswith("="):
> +
> +                    if OverrideList.get(Key[1]) is not None:
> +                        OverrideList.pop(Key[1])
> +                    OverrideList[Key[1]] = Options[Key]
> +
> +        #
> +        # Use the highest priority value.
> +        #
> +        if (len(OverrideList) >= 2):
> +            KeyList = list(OverrideList.keys())
> +            for Index in range(len(KeyList)):
> +                NowKey = KeyList[Index]
> +                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
> +                for Index1 in range(len(KeyList) - Index - 1):
> +                    NextKey = KeyList[Index1 + Index + 1]
> +                    #
> +                    # Compare two Key, if one is included by another, choose the higher priority one
> +                    #
> +                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
> +                    if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
> +                        (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
> +                        (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
> +                        (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
> +                        (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
> +
> +                        if CalculatePriorityValue(NowKey) > CalculatePriorityValue(NextKey):
> +                            if Options.get((self.BuildRuleFamily, NextKey)) is not None:
> +                                Options.pop((self.BuildRuleFamily, NextKey))
> +                        else:
> +                            if Options.get((self.BuildRuleFamily, NowKey)) is not None:
> +                                Options.pop((self.BuildRuleFamily, NowKey))
> +
> +        for Key in Options:
> +            if ModuleStyle is not None and len (Key) > 2:
> +                # Check Module style is EDK or EDKII.
> +                # Only append build option for the matched style module.
> +                if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
> +                    continue
> +                elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
> +                    continue
> +            Family = Key[0]
> +            Target, Tag, Arch, Tool, Attr = Key[1].split("_")
> +            # if tool chain family doesn't match, skip it
> +            if Tool in ToolDef and Family != "":
> +                FamilyIsNull = False
> +                if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY, "") != "":
> +                    if Family != ToolDef[Tool][TAB_TOD_DEFINES_BUILDRULEFAMILY]:
> +                        continue
> +                elif Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
> +                    continue
> +                FamilyMatch = True
> +            # expand any wildcard
> +            if Target == TAB_STAR or Target == self.BuildTarget:
> +                if Tag == TAB_STAR or Tag == self.ToolChain:
> +                    if Arch == TAB_STAR or Arch == self.Arch:
> +                        if Tool not in BuildOptions:
> +                            BuildOptions[Tool] = {}
> +                        if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
> +                            BuildOptions[Tool][Attr] = Options[Key]
> +                        else:
> +                            # append options for the same tool except PATH
> +                            if Attr != 'PATH':
> +                                BuildOptions[Tool][Attr] += " " + Options[Key]
> +                            else:
> +                                BuildOptions[Tool][Attr] = Options[Key]
> +        # Build Option Family has been checked, which need't to be checked again for family.
> +        if FamilyMatch or FamilyIsNull:
> +            return BuildOptions
> +
> +        for Key in Options:
> +            if ModuleStyle is not None and len (Key) > 2:
> +                # Check Module style is EDK or EDKII.
> +                # Only append build option for the matched style module.
> +                if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
> +                    continue
> +                elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
> +                    continue
> +            Family = Key[0]
> +            Target, Tag, Arch, Tool, Attr = Key[1].split("_")
> +            # if tool chain family doesn't match, skip it
> +            if Tool not in ToolDef or Family == "":
> +                continue
> +            # option has been added before
> +            if Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
> +                continue
> +
> +            # expand any wildcard
> +            if Target == TAB_STAR or Target == self.BuildTarget:
> +                if Tag == TAB_STAR or Tag == self.ToolChain:
> +                    if Arch == TAB_STAR or Arch == self.Arch:
> +                        if Tool not in BuildOptions:
> +                            BuildOptions[Tool] = {}
> +                        if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
> +                            BuildOptions[Tool][Attr] = Options[Key]
> +                        else:
> +                            # append options for the same tool except PATH
> +                            if Attr != 'PATH':
> +                                BuildOptions[Tool][Attr] += " " + Options[Key]
> +                            else:
> +                                BuildOptions[Tool][Attr] = Options[Key]
> +        return BuildOptions
> +#
> +#This class is the pruned WorkSpaceAutoGen for ModuleAutoGen in multiple thread
> +#
> +class WorkSpaceInfo(AutoGenInfo):
> +    def __init__(self,Workspace, MetaFile, Target, ToolChain, Arch):
> +        self._SrcTimeStamp = 0
> +        self.Db = BuildDB
> +        self.BuildDatabase = self.Db.BuildObject
> +        self.Target = Target
> +        self.ToolChain = ToolChain
> +        self.WorkspaceDir = Workspace
> +        self.ActivePlatform = MetaFile
> +        self.ArchList = Arch
> +
> +
> +class PlatformInfo(AutoGenInfo):
> +    def __init__(self, Workspace, MetaFile, Target, ToolChain, Arch, DataPipe):
> +        self.Wa = Workspace
> +        self.WorkspaceDir = self.Wa.WorkspaceDir
> +        self.MetaFile = MetaFile
> +        self.Arch = Arch
> +        self.Target = Target
> +        self.BuildTarget = Target
> +        self.ToolChain = ToolChain
> +        self.Platform = self.Wa.BuildDatabase[self.MetaFile, self.Arch, self.Target, self.ToolChain]
> +
> +        self.SourceDir = MetaFile.SubDir
> +        self.DataPipe = DataPipe
> +    @cached_property
> +    def _AsBuildModuleList(self):
> +        retVal = self.DataPipe.Get("AsBuildModuleList")
> +        if retVal is None:
> +            retVal = {}
> +        return retVal
> +
> +    ## Test if a module is supported by the platform
> +    #
> +    # An error will be raised directly if the module or its arch is not supported
> +    #  by the platform or current configuration
> +    #
> +    def ValidModule(self, Module):
> +        return Module in self.Platform.Modules or Module in self.Platform.LibraryInstances \
> +            or Module in self._AsBuildModuleList
> +
> +    @cached_property
> +    def ToolChainFamily(self):
> +        retVal = self.DataPipe.Get("ToolChainFamily")
> +        if retVal is None:
> +            retVal = {}
> +        return retVal
> +
> +    @cached_property
> +    def BuildRuleFamily(self):
> +        retVal = self.DataPipe.Get("BuildRuleFamily")
> +        if retVal is None:
> +            retVal = {}
> +        return retVal
> +
> +    @cached_property
> +    def _MbList(self):
> +        return [self.Wa.BuildDatabase[m, self.Arch, self.BuildTarget, self.ToolChain] for m in self.Platform.Modules]
> +
> +    @cached_property
> +    def PackageList(self):
> +        RetVal = set()
> +        for dec_file, Arch in self.DataPipe.Get("PackageList"):
> +            RetVal.add(self.Wa.BuildDatabase[dec_file, Arch, self.BuildTarget, self.ToolChain])
> +        return list(RetVal)
> +
> +    ## Return the directory to store all intermediate and final files built
> +    @cached_property
> +    def BuildDir(self):
> +        if os.path.isabs(self.OutputDir):
> +            RetVal = os.path.join(
> +                                os.path.abspath(self.OutputDir),
> +                                self.Target + "_" + self.ToolChain,
> +                                )
> +        else:
> +            RetVal = os.path.join(
> +                                self.WorkspaceDir,
> +                                self.OutputDir,
> +                                self.Target + "_" + self.ToolChain,
> +                                )
> +        return RetVal
> +
> +    ## Return the build output directory platform specifies
> +    @cached_property
> +    def OutputDir(self):
> +        return self.Platform.OutputDirectory
> +
> +    ## Return platform name
> +    @cached_property
> +    def Name(self):
return self.Platform.PlatformName > + > + ## Return meta-file GUID > + @cached_property > + def Guid(self): > + return self.Platform.Guid > + > + ## Return platform version > + @cached_property > + def Version(self): > + return self.Platform.Version > + > + ## Return paths of tools > + @cached_property > + def ToolDefinition(self): > + retVal =3D self.DataPipe.Get("TOOLDEF") > + if retVal is None: > + retVal =3D {} > + return retVal > + > + ## Return build command string > + # > + # @retval string Build command string > + # > + @cached_property > + def BuildCommand(self): > + retVal =3D self.DataPipe.Get("BuildCommand") > + if retVal is None: > + retVal =3D [] > + return retVal > + > + @cached_property > + def PcdTokenNumber(self): > + retVal =3D self.DataPipe.Get("PCD_TNUM") > + if retVal is None: > + retVal =3D {} > + return retVal > + > + ## Override PCD setting (type, value, ...) > + # > + # @param ToPcd The PCD to be overridden > + # @param FromPcd The PCD overriding from > + # > + def _OverridePcd(self, ToPcd, FromPcd, Module=3D"", Msg=3D"", Libra= ry=3D""): > + # > + # in case there's PCDs coming from FDF file, which have no type= given. 
> + # at this point, ToPcd.Type has the type found from dependent > + # package > + # > + TokenCName =3D ToPcd.TokenCName > + for PcdItem in self.MixedPcd: > + if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in self.Mi= xedPcd[PcdItem]: > + TokenCName =3D PcdItem[0] > + break > + if FromPcd is not None: > + if ToPcd.Pending and FromPcd.Type: > + ToPcd.Type =3D FromPcd.Type > + elif ToPcd.Type and FromPcd.Type\ > + and ToPcd.Type !=3D FromPcd.Type and ToPcd.Type in From= Pcd.Type: > + if ToPcd.Type.strip() =3D=3D TAB_PCDS_DYNAMIC_EX: > + ToPcd.Type =3D FromPcd.Type > + elif ToPcd.Type and FromPcd.Type \ > + and ToPcd.Type !=3D FromPcd.Type: > + if Library: > + Module =3D str(Module) + " 's library file (" + str= (Library) + ")" > + EdkLogger.error("build", OPTION_CONFLICT, "Mismatched P= CD type", > + ExtraData=3D"%s.%s is used as [%s] in m= odule %s, but as [%s] in %s."\ > + % (ToPcd.TokenSpaceGuidCName,= TokenCName, > + ToPcd.Type, Module, FromPc= d.Type, Msg), > + File=3Dself.MetaFile) > + > + if FromPcd.MaxDatumSize: > + ToPcd.MaxDatumSize =3D FromPcd.MaxDatumSize > + ToPcd.MaxSizeUserSet =3D FromPcd.MaxDatumSize > + if FromPcd.DefaultValue: > + ToPcd.DefaultValue =3D FromPcd.DefaultValue > + if FromPcd.TokenValue: > + ToPcd.TokenValue =3D FromPcd.TokenValue > + if FromPcd.DatumType: > + ToPcd.DatumType =3D FromPcd.DatumType > + if FromPcd.SkuInfoList: > + ToPcd.SkuInfoList =3D FromPcd.SkuInfoList > + if FromPcd.UserDefinedDefaultStoresFlag: > + ToPcd.UserDefinedDefaultStoresFlag =3D FromPcd.UserDefi= nedDefaultStoresFlag > + # Add Flexible PCD format parse > + if ToPcd.DefaultValue: > + try: > + ToPcd.DefaultValue =3D ValueExpressionEx(ToPcd.Defa= ultValue, ToPcd.DatumType, self._GuidDict)(True) > + except BadExpression as Value: > + EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.= %s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.De= faultValue, Value), > + File=3Dself.MetaFile) > + > + # check the validation of datum > + IsValid, 
Cause =3D CheckPcdDatum(ToPcd.DatumType, ToPcd.Def= aultValue) > + if not IsValid: > + EdkLogger.error('build', FORMAT_INVALID, Cause, File=3D= self.MetaFile, > + ExtraData=3D"%s.%s" % (ToPcd.TokenSpace= GuidCName, TokenCName)) > + ToPcd.validateranges =3D FromPcd.validateranges > + ToPcd.validlists =3D FromPcd.validlists > + ToPcd.expressions =3D FromPcd.expressions > + ToPcd.CustomAttribute =3D FromPcd.CustomAttribute > + > + if FromPcd is not None and ToPcd.DatumType =3D=3D TAB_VOID and = not ToPcd.MaxDatumSize: > + EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specifi= ed for PCD %s.%s" \ > + % (ToPcd.TokenSpaceGuidCName, TokenCName)) > + Value =3D ToPcd.DefaultValue > + if not Value: > + ToPcd.MaxDatumSize =3D '1' > + elif Value[0] =3D=3D 'L': > + ToPcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > + elif Value[0] =3D=3D '{': > + ToPcd.MaxDatumSize =3D str(len(Value.split(','))) > + else: > + ToPcd.MaxDatumSize =3D str(len(Value) - 1) > + > + # apply default SKU for dynamic PCDS if specified one is not av= ailable > + if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYN= AMIC_EX_TYPE_SET) \ > + and not ToPcd.SkuInfoList: > + if self.Platform.SkuName in self.Platform.SkuIds: > + SkuName =3D self.Platform.SkuName > + else: > + SkuName =3D TAB_DEFAULT > + ToPcd.SkuInfoList =3D { > + SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[Sk= uName][0], '', '', '', '', '', ToPcd.DefaultValue) > + } > + > + def ApplyPcdSetting(self, Module, Pcds, Library=3D""): > + # for each PCD in module > + for Name, Guid in Pcds: > + PcdInModule =3D Pcds[Name, Guid] > + # find out the PCD setting in platform > + if (Name, Guid) in self.Pcds: > + PcdInPlatform =3D self.Pcds[Name, Guid] > + else: > + PcdInPlatform =3D None > + # then override the settings if any > + self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg= =3D"DSC PCD sections", Library=3DLibrary) > + # resolve the VariableGuid value > + for SkuId in PcdInModule.SkuInfoList: > + Sku =3D 
PcdInModule.SkuInfoList[SkuId] > + if Sku.VariableGuid =3D=3D '': continue > + Sku.VariableGuidValue =3D GuidValue(Sku.VariableGuid, s= elf.PackageList, self.MetaFile.Path) > + if Sku.VariableGuidValue is None: > + PackageList =3D "\n\t".join(str(P) for P in self.Pa= ckageList) > + EdkLogger.error( > + 'build', > + RESOURCE_NOT_AVAILABLE, > + "Value of GUID [%s] is not found in" % = Sku.VariableGuid, > + ExtraData=3DPackageList + "\n\t(used wi= th %s.%s from module %s)" \ > + % (Guid, Name, = str(Module)), > + File=3Dself.MetaFile > + ) > + > + # override PCD settings with module specific setting > + if Module in self.Platform.Modules: > + PlatformModule =3D self.Platform.Modules[str(Module)] > + for Key in PlatformModule.Pcds: > + if self.BuildOptionPcd: > + for pcd in self.BuildOptionPcd: > + (TokenSpaceGuidCName, TokenCName, FieldName, pc= dvalue, _) =3D pcd > + if (TokenCName, TokenSpaceGuidCName) =3D=3D Key= and FieldName =3D=3D"": > + PlatformModule.Pcds[Key].DefaultValue =3D p= cdvalue > + PlatformModule.Pcds[Key].PcdValueFromComm = =3D pcdvalue > + break > + Flag =3D False > + if Key in Pcds: > + ToPcd =3D Pcds[Key] > + Flag =3D True > + elif Key in self.MixedPcd: > + for PcdItem in self.MixedPcd[Key]: > + if PcdItem in Pcds: > + ToPcd =3D Pcds[PcdItem] > + Flag =3D True > + break > + if Flag: > + self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], = Module, Msg=3D"DSC Components Module scoped PCD section", Library=3DLibrary= ) > + # use PCD value to calculate the MaxDatumSize when it is not sp= ecified > + for Name, Guid in Pcds: > + Pcd =3D Pcds[Name, Guid] > + if Pcd.DatumType =3D=3D TAB_VOID and not Pcd.MaxDatumSize: > + Pcd.MaxSizeUserSet =3D None > + Value =3D Pcd.DefaultValue > + if not Value: > + Pcd.MaxDatumSize =3D '1' > + elif Value[0] =3D=3D 'L': > + Pcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > + elif Value[0] =3D=3D '{': > + Pcd.MaxDatumSize =3D str(len(Value.split(','))) > + else: > + Pcd.MaxDatumSize =3D str(len(Value) - 1) > + return 
list(Pcds.values()) > + > + @cached_property > + def Pcds(self): > + PlatformPcdData =3D self.DataPipe.Get("PLA_PCD") > +# for pcd in PlatformPcdData: > +# for skuid in pcd.SkuInfoList: > +# pcd.SkuInfoList[skuid] =3D self.CreateSkuInfoFromDict= (pcd.SkuInfoList[skuid]) > + return {(pcddata.TokenCName,pcddata.TokenSpaceGuidCName):pcddat= a for pcddata in PlatformPcdData} > + > + def CreateSkuInfoFromDict(self,SkuInfoDict): > + return SkuInfoClass( > + SkuInfoDict.get("SkuIdName"), > + SkuInfoDict.get("SkuId"), > + SkuInfoDict.get("VariableName"), > + SkuInfoDict.get("VariableGuid"), > + SkuInfoDict.get("VariableOffset"), > + SkuInfoDict.get("HiiDefaultValue"), > + SkuInfoDict.get("VpdOffset"), > + SkuInfoDict.get("DefaultValue"), > + SkuInfoDict.get("VariableGuidValue"), > + SkuInfoDict.get("VariableAttribute",""), > + SkuInfoDict.get("DefaultStore",None) > + ) > + @cached_property > + def MixedPcd(self): > + return self.DataPipe.Get("MixedPcd") > + @cached_property > + def _GuidDict(self): > + RetVal =3D self.DataPipe.Get("GuidDict") > + if RetVal is None: > + RetVal =3D {} > + return RetVal > + @cached_property > + def BuildOptionPcd(self): > + return self.DataPipe.Get("BuildOptPcd") > + def ApplyBuildOption(self,module): > + PlatformOptions =3D self.DataPipe.Get("PLA_BO") > + ModuleBuildOptions =3D self.DataPipe.Get("MOL_BO") > + ModuleOptionFromDsc =3D ModuleBuildOptions.get((module.MetaFile= .File,module.MetaFile.Root)) > + if ModuleOptionFromDsc: > + ModuleTypeOptions, PlatformModuleOptions =3D ModuleOptionFr= omDsc["ModuleTypeOptions"],ModuleOptionFromDsc["PlatformModuleOptions"] > + else: > + ModuleTypeOptions, PlatformModuleOptions =3D {}, {} > + ToolDefinition =3D self.DataPipe.Get("TOOLDEF") > + ModuleOptions =3D self._ExpandBuildOption(module.BuildOptions) > + BuildRuleOrder =3D None > + for Options in [ToolDefinition, ModuleOptions, PlatformOptions,= ModuleTypeOptions, PlatformModuleOptions]: > + for Tool in Options: > + for Attr in Options[Tool]: > + 
if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > + BuildRuleOrder =3D Options[Tool][Attr] > + > + AllTools =3D set(list(ModuleOptions.keys()) + list(PlatformOpti= ons.keys()) + > + list(PlatformModuleOptions.keys()) + list(Module= TypeOptions.keys()) + > + list(ToolDefinition.keys())) > + BuildOptions =3D defaultdict(lambda: defaultdict(str)) > + for Tool in AllTools: > + for Options in [ToolDefinition, ModuleOptions, PlatformOpti= ons, ModuleTypeOptions, PlatformModuleOptions]: > + if Tool not in Options: > + continue > + for Attr in Options[Tool]: > + # > + # Do not generate it in Makefile > + # > + if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > + continue > + Value =3D Options[Tool][Attr] > + # check if override is indicated > + if Value.startswith('=3D'): > + BuildOptions[Tool][Attr] =3D mws.handleWsMacro(= Value[1:]) > + else: > + if Attr !=3D 'PATH': > + BuildOptions[Tool][Attr] +=3D " " + mws.han= dleWsMacro(Value) > + else: > + BuildOptions[Tool][Attr] =3D mws.handleWsMa= cro(Value) > + > + return BuildOptions, BuildRuleOrder > + > + def ApplyLibraryInstance(self,module): > + alldeps =3D self.DataPipe.Get("DEPS") > + if alldeps is None: > + alldeps =3D {} > + mod_libs =3D alldeps.get((module.MetaFile.File,module.MetaFile.= Root,module.Arch,module.MetaFile.Path),[]) > + retVal =3D [] > + for (file_path,root,arch,abs_path) in mod_libs: > + libMetaFile =3D PathClass(file_path,root) > + libMetaFile.OriginalPath =3D PathClass(file_path,root) > + libMetaFile.Path =3D abs_path > + retVal.append(self.Wa.BuildDatabase[libMetaFile, arch, self= .Target,self.ToolChain]) > + return retVal > + > + ## Parse build_rule.txt in Conf Directory. 
> + # > + # @retval BuildRule object > + # > + @cached_property > + def BuildRule(self): > + WInfo =3D self.DataPipe.Get("P_Info") > + RetVal =3D WInfo.get("BuildRuleFile") > + if RetVal._FileVersion =3D=3D "": > + RetVal._FileVersion =3D AutoGenReqBuildRuleVerNum > + return RetVal > diff --git a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py b/BaseTo= ols/Source/Python/AutoGen/PlatformAutoGen.py > new file mode 100644 > index 000000000000..6c947eca2b57 > --- /dev/null > +++ b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py > @@ -0,0 +1,1505 @@ > +## @file > +# Create makefile for MS nmake and GNU make > +# > +# Copyright (c) 2019, Intel Corporation. All rights reserved.
> +# SPDX-License-Identifier: BSD-2-Clause-Patent
> +#
> +
> +## Import Modules
> +#
> +from __future__ import print_function
> +from __future__ import absolute_import
> +import os.path as path
> +import copy
> +from collections import defaultdict
> +
> +from .BuildEngine import BuildRule, gDefaultBuildRuleFile, AutoGenReqBuildRuleVerNum
> +from .GenVar import VariableMgr, var_info
> +from . import GenMake
> +from AutoGen.DataPipe import MemoryDataPipe
> +from AutoGen.ModuleAutoGen import ModuleAutoGen
> +from AutoGen.AutoGen import AutoGen
> +from AutoGen.AutoGen import CalculatePriorityValue
> +from Workspace.WorkspaceCommon import GetModuleLibInstances
> +from CommonDataClass.CommonClass import SkuInfoClass
> +from Common.caching import cached_class_function
> +from Common.Expression import ValueExpressionEx
> +from Common.StringUtils import StringToArray, NormPath
> +from Common.BuildToolError import *
> +from Common.DataType import *
> +from Common.Misc import *
> +import Common.VpdInfoFile as VpdInfoFile
> +
> +## Split a command line option string into a list
> +#
> +# subprocess.Popen needs the args to be a sequence. Otherwise there is a
> +# problem launching the command on non-Windows platforms.
> +#
> +def _SplitOption(OptionString):
> +    OptionList = []
> +    LastChar = " "
> +    OptionStart = 0
> +    QuotationMark = ""
> +    for Index in range(0, len(OptionString)):
> +        CurrentChar = OptionString[Index]
> +        if CurrentChar in ['"', "'"]:
> +            if QuotationMark == CurrentChar:
> +                QuotationMark = ""
> +            elif QuotationMark == "":
> +                QuotationMark = CurrentChar
> +            continue
> +        elif QuotationMark:
> +            continue
> +
> +        if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r", "\n"]:
> +            if Index > OptionStart:
> +                OptionList.append(OptionString[OptionStart:Index - 1])
> +            OptionStart = Index
> +        LastChar = CurrentChar
> +    OptionList.append(OptionString[OptionStart:])
> +    return OptionList
> +
> +## AutoGen class for platform
> +#
> +#  PlatformAutoGen class will process the original information in the platform
> +#  file in order to generate the makefile for the platform.
> +#
> +class PlatformAutoGen(AutoGen):
> +    # call super().__init__ then call the worker function with different parameter count
> +    def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> +        if not hasattr(self, "_Init"):
> +            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch)
> +            self._Init = True
> +    #
> +    # Used to store all PCDs for both PEI and DXE phase, in order to generate
> +    # a correct PCD database
> +    #
> +    _DynaPcdList_ = []
> +    _NonDynaPcdList_ = []
> +    _PlatformPcds = {}
> +
> +
> +
> +    ## Initialize PlatformAutoGen
> +    #
> +    #
> +    #   @param      Workspace       WorkspaceAutoGen object
> +    #   @param      PlatformFile    Platform file (DSC file)
> +    #   @param      Target          Build target (DEBUG, RELEASE)
> +    #   @param      Toolchain       Name of tool chain
> +    #   @param      Arch            arch of the platform supports
> +    #
> +    def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain, Arch):
> +        EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s]" % (PlatformFile, Arch))
> +        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (PlatformFile, Arch, Toolchain, Target)
> +
> +        self.MetaFile = PlatformFile
> +        self.Workspace = Workspace
> +        self.WorkspaceDir = Workspace.WorkspaceDir
> +        self.ToolChain = Toolchain
> +        self.BuildTarget = Target
> +        self.Arch = Arch
> +        self.SourceDir = PlatformFile.SubDir
> +        self.FdTargetList = self.Workspace.FdTargetList
> +        self.FvTargetList = self.Workspace.FvTargetList
> +        # get the original module/package/platform objects
> +        self.BuildDatabase = Workspace.BuildDatabase
> +        self.DscBuildDataObj = Workspace.Platform
> +
> +        # flag indicating if the makefile/C-code file has been created or not
> +        self.IsMakeFileCreated = False
> +
> +        self._DynamicPcdList = None    # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
> +        self._NonDynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
> +
> +        self._AsBuildInfList = []
> +        self._AsBuildModuleList = []
> +
> +        self.VariableInfo = None
> +
> +        if GlobalData.gFdfParser is not None:
> +            self._AsBuildInfList = GlobalData.gFdfParser.Profile.InfList
> +            for Inf in self._AsBuildInfList:
> +                InfClass = PathClass(NormPath(Inf), GlobalData.gWorkspace, self.Arch)
> +                M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
> +                if not M.IsBinaryModule:
> +                    continue
> +                self._AsBuildModuleList.append(InfClass)
> +        # get library/modules for build
> +        self.LibraryBuildDirectoryList = []
> +        self.ModuleBuildDirectoryList = []
> +
> +        self.DataPipe = MemoryDataPipe(self.BuildDir)
> +        self.DataPipe.FillData(self)
> +
> +        return True
> +    ## hash() operator of PlatformAutoGen
> +    #
> +    #  The platform file path and arch string will be used to represent the
> +    #  hash value of this object
> +    #
> +    #   @retval   int Hash value of the platform file path and arch
> +    #
> +    @cached_class_function
> +    def __hash__(self):
> +        return hash((self.MetaFile, self.Arch))
> +    @cached_class_function
> +    def __repr__(self):
> +        return "%s [%s]" % (self.MetaFile, self.Arch)
> +
> +    ## Create autogen code for platform and modules
> +    #
> +    #  Since there's no autogen code for platform, this method will do nothing
> +    #  if CreateModuleCodeFile is set to False.
> +    #
> +    #   @param      CreateModuleCodeFile    Flag indicating if creating module's
> +    #                                       autogen code file or not
> +    #
> +    @cached_class_function
> +    def CreateCodeFile(self, CreateModuleCodeFile=False):
> +        # only a module has code to be created, so do nothing if CreateModuleCodeFile is False
> +        if not CreateModuleCodeFile:
> +            return
> +
> +        for Ma in self.ModuleAutoGenList:
> +            Ma.CreateCodeFile(True)
> +
> +    ## Generate Fds Command
> +    @cached_property
> +    def GenFdsCommand(self):
> +        return self.Workspace.GenFdsCommand
> +
> +    ## Create makefile for the platform and modules in it
> +    #
> +    #   @param      CreateModuleMakeFile    Flag indicating if the makefile for
> +    #                                       modules will be created as well
> +    #
> +    def CreateMakeFile(self, CreateModuleMakeFile=False, FfsCommand = {}):
> +        if CreateModuleMakeFile:
> +            for Ma in self._MaList:
> +                key = (Ma.MetaFile.File, self.Arch)
> +                if key in FfsCommand:
> +                    Ma.CreateMakeFile(True, FfsCommand[key])
> +                else:
> +                    Ma.CreateMakeFile(True)
> +
> +        # no need to create the makefile for the platform more than once
> +        if self.IsMakeFileCreated:
> +            return
> +
> +        # create library/module build dirs for the platform
> +        Makefile = GenMake.PlatformMakefile(self)
> +        self.LibraryBuildDirectoryList = Makefile.GetLibraryBuildDirectoryList()
> +        self.ModuleBuildDirectoryList = Makefile.GetModuleBuildDirectoryList()
> +
> +        self.IsMakeFileCreated = True
> +
> +    @property
> +    def AllPcdList(self):
> +        return self.DynamicPcdList + self.NonDynamicPcdList
> +    ## Deal with Shared FixedAtBuild Pcds
> +    #
> +    def CollectFixedAtBuildPcds(self):
> +        for LibAuto in self.LibraryAutoGenList:
> +            FixedAtBuildPcds = {}
> +            ShareFixedAtBuildPcdsSameValue = {}
> +            for Module in LibAuto.ReferenceModules:
> +                for Pcd in set(Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds):
> +                    DefaultValue = Pcd.DefaultValue
> +                    # Cover the case: the DSC component overrides the Pcd value and the Pcd is only used in one Lib
> +                    if Pcd in Module.LibraryPcdList:
> +                        Index = Module.LibraryPcdList.index(Pcd)
> +                        DefaultValue = Module.LibraryPcdList[Index].DefaultValue
> +                    key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> +                    if key not in FixedAtBuildPcds:
> +                        ShareFixedAtBuildPcdsSameValue[key] = True
> +                        FixedAtBuildPcds[key] = DefaultValue
> +                    else:
> +                        if FixedAtBuildPcds[key] != DefaultValue:
> +                            ShareFixedAtBuildPcdsSameValue[key] = False
> +            for Pcd in LibAuto.FixedAtBuildPcds:
> +                key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> +                if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
> +                    continue
> +                else:
> +                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
> +                    if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
> +                        continue
> +                if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
> +                    LibAuto.ConstPcd[key] = FixedAtBuildPcds[key]
> +
> +    def CollectVariables(self, DynamicPcdSet):
> +        VpdRegionSize = 0
> +        VpdRegionBase = 0
> +        if self.Workspace.FdfFile:
> +            FdDict = self.Workspace.FdfProfile.FdDict[GlobalData.gFdfParser.CurrentFdName]
> +            for FdRegion in FdDict.RegionList:
> +                for item in FdRegion.RegionDataList:
> +                    if self.Platform.VpdToolGuid.strip() and self.Platform.VpdToolGuid in item:
> +                        VpdRegionSize = FdRegion.Size
> +                        VpdRegionBase = FdRegion.Offset
> +                        break
> +
> +        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj.SkuIds)
> +        VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
> +        VariableInfo.SetVpdRegionOffset(VpdRegionBase)
> +        Index = 0
> +        for Pcd in DynamicPcdSet:
> +            pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
> +            for SkuName in Pcd.SkuInfoList:
> +                Sku = Pcd.SkuInfoList[SkuName]
> +                SkuId = Sku.SkuId
> +                if SkuId is None or SkuId == '':
> +                    continue
> +                if len(Sku.VariableName) > 0:
> +                    if Sku.VariableAttribute and 'NV' not in Sku.VariableAttribute:
> +                        continue
> +                    VariableGuidStructure = Sku.VariableGuidValue
> +                    VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
> +                    for StorageName in Sku.DefaultStoreDict:
> +                        VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else StringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAttribute['DscPosition'], Pcd.CustomAttribute.get('IsStru', False)))
> +            Index += 1
> +        return VariableInfo
> +
> +    def UpdateNVStoreMaxSize(self, OrgVpdFile):
> +        if self.VariableInfo:
> +            VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
> +            PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
> +
> +            if PcdNvStoreDfBuffer:
> +                if os.path.exists(VpdMapFilePath):
> +                    OrgVpdFile.Read(VpdMapFilePath)
> +                    PcdItems = OrgVpdFile.GetOffset(PcdNvStoreDfBuffer[0])
> +                    NvStoreOffset = list(PcdItems.values())[0].strip() if PcdItems else '0'
> +                else:
> +                    EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
> +
> +                NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
> +                default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
> +                maxsize = self.VariableInfo.VpdRegionSize - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
> +                var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
> +
> +                if var_data and default_skuobj:
> +                    default_skuobj.DefaultValue = var_data
> +                    PcdNvStoreDfBuffer[0].DefaultValue = var_data
> +                    PcdNvStoreDfBuffer[0].SkuInfoList.clear()
> +                    PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = default_skuobj
> +                    PcdNvStoreDfBuffer[0].MaxDatumSize = str(len(default_skuobj.DefaultValue.split(",")))
> +
> +        return OrgVpdFile
> +
> +    ## Collect dynamic PCDs
> +    #
> +    #  Gather the dynamic PCD list from each module and their settings from the platform.
> +    #  This interface should be invoked explicitly when the platform action is created.
> + # > + def CollectPlatformDynamicPcds(self): > + self.CategoryPcds() > + self.SortDynamicPcd() > + > + def CategoryPcds(self): > + # Category Pcds into DynamicPcds and NonDynamicPcds > + # for gathering error information > + NoDatumTypePcdList =3D set() > + FdfModuleList =3D [] > + for InfName in self._AsBuildInfList: > + InfName =3D mws.join(self.WorkspaceDir, InfName) > + FdfModuleList.append(os.path.normpath(InfName)) > + for M in self._MbList: > +# F is the Module for which M is the module autogen > + ModPcdList =3D self.ApplyPcdSetting(M, M.ModulePcdList) > + LibPcdList =3D [] > + for lib in M.LibraryPcdList: > + LibPcdList.extend(self.ApplyPcdSetting(M, M.LibraryPcdL= ist[lib], lib)) > + for PcdFromModule in ModPcdList + LibPcdList: > + > + # make sure that the "VOID*" kind of datum has MaxDatum= Size set > + if PcdFromModule.DatumType =3D=3D TAB_VOID and not PcdF= romModule.MaxDatumSize: > + NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModul= e.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile)) > + > + # Check the PCD from Binary INF or Source INF > + if M.IsBinaryModule =3D=3D True: > + PcdFromModule.IsFromBinaryInf =3D True > + > + # Check the PCD from DSC or not > + PcdFromModule.IsFromDsc =3D (PcdFromModule.TokenCName, = PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds > + > + if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdFro= mModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > + if M.MetaFile.Path not in FdfModuleList: > + # If one of the Source built modules listed in = the DSC is not listed > + # in FDF modules, and the INF lists a PCD can o= nly use the PcdsDynamic > + # access method (it is only listed in the DEC f= ile that declares the > + # PCD as PcdsDynamic), then build tool will rep= ort warning message > + # notify the PI that they are attempting to bui= ld a module that must > + # be included in a flash image in order to be f= unctional. 
These Dynamic > + # PCD will not be added into the Database unles= s it is used by other > + # modules that are included in the FDF file. > + if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET a= nd \ > + PcdFromModule.IsFromBinaryInf =3D=3D False: > + # Print warning message to let the develope= r make a determine. > + continue > + # If one of the Source built modules listed in = the DSC is not listed in > + # FDF modules, and the INF lists a PCD can only= use the PcdsDynamicEx > + # access method (it is only listed in the DEC f= ile that declares the > + # PCD as PcdsDynamicEx), then DO NOT break the = build; DO NOT add the > + # PCD to the Platform's PCD Database. > + if PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SE= T: > + continue > + # > + # If a dynamic PCD used by a PEM module/PEI module = & DXE module, > + # it should be stored in Pcd PEI database, If a dyn= amic only > + # used by DXE module, it should be stored in DXE PC= D database. > + # The default Phase is DXE > + # > + if M.ModuleType in SUP_MODULE_SET_PEI: > + PcdFromModule.Phase =3D "PEI" > + if PcdFromModule not in self._DynaPcdList_: > + self._DynaPcdList_.append(PcdFromModule) > + elif PcdFromModule.Phase =3D=3D 'PEI': > + # overwrite any the same PCD existing, if Phase= is PEI > + Index =3D self._DynaPcdList_.index(PcdFromModul= e) > + self._DynaPcdList_[Index] =3D PcdFromModule > + elif PcdFromModule not in self._NonDynaPcdList_: > + self._NonDynaPcdList_.append(PcdFromModule) > + elif PcdFromModule in self._NonDynaPcdList_ and PcdFrom= Module.IsFromBinaryInf =3D=3D True: > + Index =3D self._NonDynaPcdList_.index(PcdFromModule= ) > + if self._NonDynaPcdList_[Index].IsFromBinaryInf =3D= = =3D False: > + #The PCD from Binary INF will override the same= one from source INF > + self._NonDynaPcdList_.remove (self._NonDynaPcdL= ist_[Index]) > + PcdFromModule.Pending =3D False > + self._NonDynaPcdList_.append (PcdFromModule) > + DscModuleSet =3D {os.path.normpath(ModuleInf.Path) for ModuleIn= f in 
self.Platform.Modules} > + # add the PCD from modules that listed in FDF but not in DSC to= Database > + for InfName in FdfModuleList: > + if InfName not in DscModuleSet: > + InfClass =3D PathClass(InfName) > + M =3D self.BuildDatabase[InfClass, self.Arch, self.Buil= dTarget, self.ToolChain] > + # If a module INF in FDF but not in current arch's DSC = module list, it must be module (either binary or source) > + # for different Arch. PCDs in source module for differe= nt Arch is already added before, so skip the source module here. > + # For binary module, if in current arch, we need to lis= t the PCDs into database. > + if not M.IsBinaryModule: > + continue > + # Override the module PCD setting by platform setting > + ModulePcdList =3D self.ApplyPcdSetting(M, M.Pcds) > + for PcdFromModule in ModulePcdList: > + PcdFromModule.IsFromBinaryInf =3D True > + PcdFromModule.IsFromDsc =3D False > + # Only allow the DynamicEx and Patchable PCD in AsB= uild INF > + if PcdFromModule.Type not in PCD_DYNAMIC_EX_TYPE_SE= T and PcdFromModule.Type not in TAB_PCDS_PATCHABLE_IN_MODULE: > + EdkLogger.error("build", AUTOGEN_ERROR, "PCD se= tting error", > + File=3Dself.MetaFile, > + ExtraData=3D"\n\tExisted %s PCD= %s in:\n\t\t%s\n" > + % (PcdFromModule.Type, PcdFromM= odule.TokenCName, InfName)) > + # make sure that the "VOID*" kind of datum has MaxD= atumSize set > + if PcdFromModule.DatumType =3D=3D TAB_VOID and not = PcdFromModule.MaxDatumSize: > + NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromM= odule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName)) > + if M.ModuleType in SUP_MODULE_SET_PEI: > + PcdFromModule.Phase =3D "PEI" > + if PcdFromModule not in self._DynaPcdList_ and PcdF= romModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > + self._DynaPcdList_.append(PcdFromModule) > + elif PcdFromModule not in self._NonDynaPcdList_ and= PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE: > + self._NonDynaPcdList_.append(PcdFromModule) > + if PcdFromModule in self._DynaPcdList_ and 
PcdFromM= odule.Phase =3D=3D 'PEI' and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > + # Overwrite the phase of any the same PCD exist= ing, if Phase is PEI. > + # It is to solve the case that a dynamic PCD us= ed by a PEM module/PEI > + # module & DXE module at a same time. > + # Overwrite the type of the PCDs in source INF = by the type of AsBuild > + # INF file as DynamicEx. > + Index =3D self._DynaPcdList_.index(PcdFromModul= e) > + self._DynaPcdList_[Index].Phase =3D PcdFromModu= le.Phase > + self._DynaPcdList_[Index].Type =3D PcdFromModul= e.Type > + for PcdFromModule in self._NonDynaPcdList_: > + # If a PCD is not listed in the DSC file, but binary INF fi= les used by > + # this platform all (that use this PCD) list the PCD in a [= PatchPcds] > + # section, AND all source INF files used by this platform t= he build > + # that use the PCD list the PCD in either a [Pcds] or [Patc= hPcds] > + # section, then the tools must NOT add the PCD to the Platf= orm's PCD > + # Database; the build must assign the access method for thi= s PCD as > + # PcdsPatchableInModule. 
> +            if PcdFromModule not in self._DynaPcdList_:
> +                continue
> +            Index = self._DynaPcdList_.index(PcdFromModule)
> +            if PcdFromModule.IsFromDsc == False and \
> +                PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE and \
> +                PcdFromModule.IsFromBinaryInf == True and \
> +                self._DynaPcdList_[Index].IsFromBinaryInf == False:
> +                Index = self._DynaPcdList_.index(PcdFromModule)
> +                self._DynaPcdList_.remove (self._DynaPcdList_[Index])
> +
> +        # print out error information and break the build, if error found
> +        if len(NoDatumTypePcdList) > 0:
> +            NoDatumTypePcdListString = "\n\t\t".join(NoDatumTypePcdList)
> +            EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting error",
> +                            File=self.MetaFile,
> +                            ExtraData="\n\tPCD(s) without MaxDatumSize:\n\t\t%s\n"
> +                            % NoDatumTypePcdListString)
> +        self._NonDynamicPcdList = self._NonDynaPcdList_
> +        self._DynamicPcdList = self._DynaPcdList_
> +
> +    def SortDynamicPcd(self):
> +        #
> +        # Sort the dynamic PCD list so that:
> +        #   1) If a PCD's datum type is VOID* and its value is a unicode string starting
> +        #      with L, the PCD item is put at the head of the dynamic list
> +        #   2) If a PCD is HII type, the PCD item is put after the unicode-type PCDs
> +        #
> +        # The reason for sorting is to make sure the unicode strings are in double-byte alignment in the string table.
> + # > + UnicodePcdArray =3D set() > + HiiPcdArray =3D set() > + OtherPcdArray =3D set() > + VpdPcdDict =3D {} > + VpdFile =3D VpdInfoFile.VpdInfoFile() > + NeedProcessVpdMapFile =3D False > + > + for pcd in self.Platform.Pcds: > + if pcd not in self._PlatformPcds: > + self._PlatformPcds[pcd] =3D self.Platform.Pcds[pcd] > + > + for item in self._PlatformPcds: > + if self._PlatformPcds[item].DatumType and self._PlatformPcd= s[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TA= B_VOID, "BOOLEAN"]: > + self._PlatformPcds[item].DatumType =3D TAB_VOID > + > + if (self.Workspace.ArchList[-1] =3D=3D self.Arch): > + for Pcd in self._DynamicPcdList: > + # just pick the a value to determine whether is unicode= string type > + Sku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > + > + if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UIN= T32, TAB_UINT64, TAB_VOID, "BOOLEAN"]: > + Pcd.DatumType =3D TAB_VOID > + > + # if found PCD which datum value is unicode string = the insert to left size of UnicodeIndex > + # if found HII type PCD then insert to right of Uni= codeIndex > + if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_= EX_VPD]: > + VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName= )] =3D Pcd > + > + #Collect DynamicHii PCD values and assign it to DynamicExVp= d PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer > + PcdNvStoreDfBuffer =3D VpdPcdDict.get(("PcdNvStoreDefaultVa= lueBuffer", "gEfiMdeModulePkgTokenSpaceGuid")) > + if PcdNvStoreDfBuffer: > + self.VariableInfo =3D self.CollectVariables(self._Dynam= icPcdList) > + vardump =3D self.VariableInfo.dump() > + if vardump: > + # > + #According to PCD_DATABASE_INIT in edk2\MdeModulePk= g\Include\Guid\PcdDataBaseSignatureGuid.h, > + #the max size for string PCD should not exceed USHR= T_MAX 65535(0xffff). 
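The sort order described in the SortDynamicPcd comment above (unicode-string PCDs first, HII PCDs next, everything else last, so string-table entries stay double-byte aligned) can be sketched as a three-bucket pass. This is an illustrative stand-in, not the real build-tool classes; the dict fields are hypothetical.

```python
# Illustrative three-bucket sort mirroring SortDynamicPcd's ordering rule:
# VOID* PCDs whose default starts with L (unicode strings) first,
# HII (variable-backed) PCDs next, everything else last.
def sort_dynamic_pcds(pcds):
    unicode_pcds, hii_pcds, other_pcds = [], [], []
    for pcd in pcds:
        default = pcd.get("default", "")
        if pcd.get("datum_type") == "VOID*" and default.startswith("L"):
            unicode_pcds.append(pcd)       # L"..." unicode string
        elif pcd.get("variable_name"):
            hii_pcds.append(pcd)           # HII (variable-backed) PCD
        else:
            other_pcds.append(pcd)
    return unicode_pcds + hii_pcds + other_pcds

pcds = [
    {"name": "PcdA", "datum_type": "UINT32", "default": "0"},
    {"name": "PcdB", "datum_type": "VOID*", "default": 'L"abc"'},
    {"name": "PcdC", "datum_type": "VOID*", "default": "{0x1}",
     "variable_name": "Var"},
]
print([p["name"] for p in sort_dynamic_pcds(pcds)])  # ['PcdB', 'PcdC', 'PcdA']
```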
> + #typedef UINT16 SIZE_INFO; > + #//SIZE_INFO SizeTable[]; > + if len(vardump.split(",")) > 0xffff: > + EdkLogger.error("build", RESOURCE_OVERFLOW, 'Th= e current length of PCD %s value is %d, it exceeds to the max size of Strin= g PCD.' %(".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName,PcdNvStoreDfBuff= er.TokenCName]) ,len(vardump.split(",")))) > + PcdNvStoreDfBuffer.DefaultValue =3D vardump > + for skuname in PcdNvStoreDfBuffer.SkuInfoList: > + PcdNvStoreDfBuffer.SkuInfoList[skuname].Default= Value =3D vardump > + PcdNvStoreDfBuffer.MaxDatumSize =3D str(len(var= dump.split(","))) > + else: > + #If the end user define [DefaultStores] and [XXX.Menufa= cturing] in DSC, but forget to configure PcdNvStoreDefaultValueBuffer to Pc= dsDynamicVpd > + if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDefi= nedDefaultStoresFlag]: > + EdkLogger.warn("build", "PcdNvStoreDefaultValueBuff= er should be defined as PcdsDynamicExVpd in dsc file since the DefaultStore= s is enabled for this platform.\n%s" %self.Platform.MetaFile.Path) > + PlatformPcds =3D sorted(self._PlatformPcds.keys()) > + # > + # Add VPD type PCD into VpdFile and determine whether the V= PD PCD need to be fixed up. 
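The USHRT_MAX guard above works because the variable dump is a comma-separated byte list, so its element count is the PCD's byte size, and the SIZE_INFO table entry is a UINT16. A minimal stand-in of that check (function name hypothetical):

```python
# The dump is a comma-separated byte list ("0x56, 0x41, ..."); its element
# count is the PCD's byte size, which a UINT16 SIZE_INFO entry cannot exceed.
USHRT_MAX = 0xFFFF

def check_nv_store_size(vardump):
    size = len(vardump.split(","))
    if size > USHRT_MAX:
        raise ValueError(
            "NV store default value buffer is %d bytes, exceeding the "
            "0xFFFF limit imposed by the UINT16 SIZE_INFO field" % size)
    return size

print(check_nv_store_size("0x56, 0x41, 0x52"))  # 3
```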
> + # > + VpdSkuMap =3D {} > + for PcdKey in PlatformPcds: > + Pcd =3D self._PlatformPcds[PcdKey] > + if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_= EX_VPD] and \ > + PcdKey in VpdPcdDict: > + Pcd =3D VpdPcdDict[PcdKey] > + SkuValueMap =3D {} > + DefaultSku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > + if DefaultSku: > + PcdValue =3D DefaultSku.DefaultValue > + if PcdValue not in SkuValueMap: > + SkuValueMap[PcdValue] =3D [] > + VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.Vp= dOffset) > + SkuValueMap[PcdValue].append(DefaultSku) > + > + for (SkuName, Sku) in Pcd.SkuInfoList.items(): > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > + PcdValue =3D Sku.DefaultValue > + if PcdValue =3D=3D "": > + PcdValue =3D Pcd.DefaultValue > + if Sku.VpdOffset !=3D TAB_STAR: > + if PcdValue.startswith("{"): > + Alignment =3D 8 > + elif PcdValue.startswith("L"): > + Alignment =3D 2 > + else: > + Alignment =3D 1 > + try: > + VpdOffset =3D int(Sku.VpdOffset) > + except: > + try: > + VpdOffset =3D int(Sku.VpdOffset, 16= ) > + except: > + EdkLogger.error("build", FORMAT_INV= ALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.TokenS= paceGuidCName, Pcd.TokenCName)) > + if VpdOffset % Alignment !=3D 0: > + if PcdValue.startswith("{"): > + EdkLogger.warn("build", "The offset= value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pcd.= TokenCName), File=3Dself.MetaFile) > + else: > + EdkLogger.error("build", FORMAT_INV= ALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.Tok= enSpaceGuidCName, Pcd.TokenCName, Alignment)) > + if PcdValue not in SkuValueMap: > + SkuValueMap[PcdValue] =3D [] > + VpdFile.Add(Pcd, SkuName, Sku.VpdOffset) > + SkuValueMap[PcdValue].append(Sku) > + # if the offset of a VPD is *, then it need to = be fixed up by third party tool. 
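The offset-validation logic above applies a simple alignment rule (8 bytes for `{...}` byte arrays, 2 for `L"..."` unicode strings, 1 otherwise) and parses the offset as decimal before falling back to hex. A self-contained sketch of those two helpers (names hypothetical):

```python
# Alignment rule used when validating VPD offsets: byte arrays ("{...}")
# must be 8-byte aligned, unicode strings (L"...") 2-byte, others 1-byte.
def vpd_alignment(pcd_value):
    if pcd_value.startswith("{"):
        return 8
    if pcd_value.startswith("L"):
        return 2
    return 1

def parse_vpd_offset(text):
    # Decimal first, then hex, mirroring the nested try/except in the patch;
    # still raises ValueError if neither form parses.
    try:
        return int(text)
    except ValueError:
        return int(text, 16)

offset = parse_vpd_offset("0x10")
assert offset % vpd_alignment('L"abc"') == 0
print(offset)  # 16
```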
> + if not NeedProcessVpdMapFile and Sku.VpdOffset = = =3D=3D TAB_STAR: > + NeedProcessVpdMapFile =3D True > + if self.Platform.VpdToolGuid is None or sel= f.Platform.VpdToolGuid =3D=3D '': > + EdkLogger.error("Build", FILE_NOT_FOUND= , \ > + "Fail to find third-par= ty BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tool= s_def.txt and VPD_TOOL_GUID need to be provided in DSC file.") > + > + VpdSkuMap[PcdKey] =3D SkuValueMap > + # > + # Fix the PCDs define in VPD PCD section that never referen= ced by module. > + # An example is PCD for signature usage. > + # > + for DscPcd in PlatformPcds: > + DscPcdEntry =3D self._PlatformPcds[DscPcd] > + if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_= DYNAMIC_EX_VPD]: > + if not (self.Platform.VpdToolGuid is None or self.P= latform.VpdToolGuid =3D=3D ''): > + FoundFlag =3D False > + for VpdPcd in VpdFile._VpdArray: > + # This PCD has been referenced by module > + if (VpdPcd.TokenSpaceGuidCName =3D=3D DscPc= dEntry.TokenSpaceGuidCName) and \ > + (VpdPcd.TokenCName =3D=3D DscPcdEntry.To= kenCName): > + FoundFlag =3D True > + > + # Not found, it should be signature > + if not FoundFlag : > + # just pick the a value to determine whethe= r is unicode string type > + SkuValueMap =3D {} > + SkuObjList =3D list(DscPcdEntry.SkuInfoList= .items()) > + DefaultSku =3D DscPcdEntry.SkuInfoList.get(= TAB_DEFAULT) > + if DefaultSku: > + defaultindex =3D SkuObjList.index((TAB_= DEFAULT, DefaultSku)) > + SkuObjList[0], SkuObjList[defaultindex]= =3D SkuObjList[defaultindex], SkuObjList[0] > + for (SkuName, Sku) in SkuObjList: > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > + > + # Need to iterate DEC pcd information t= o get the value & datumtype > + for eachDec in self.PackageList: > + for DecPcd in eachDec.Pcds: > + DecPcdEntry =3D eachDec.Pcds[De= cPcd] > + if (DecPcdEntry.TokenSpaceGuidC= Name =3D=3D DscPcdEntry.TokenSpaceGuidCName) and \ > + (DecPcdEntry.TokenCName =3D= =3D DscPcdEntry.TokenCName): > + # Print 
warning message to = let the developer make a determine. > + EdkLogger.warn("build", "Un= referenced vpd pcd used!", > + File=3Dself= .MetaFile, \ > + ExtraData = =3D "PCD: %s.%s used in the DSC file %s is unreferenced." \ > + %(DscPcdEnt= ry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path= )) > + > + DscPcdEntry.DatumType = =3D DecPcdEntry.DatumType > + DscPcdEntry.DefaultValue = =3D DecPcdEntry.DefaultValue > + DscPcdEntry.TokenValue =3D = DecPcdEntry.TokenValue > + DscPcdEntry.TokenSpaceGuidV= alue =3D eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName] > + # Only fix the value while = no value provided in DSC file. > + if not Sku.DefaultValue: > + DscPcdEntry.SkuInfoList= [list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue =3D DecPcdEntry.Defa= ultValue > + > + if DscPcdEntry not in self._DynamicPcdL= ist: > + self._DynamicPcdList.append(DscPcdE= ntry) > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > + PcdValue =3D Sku.DefaultValue > + if PcdValue =3D=3D "": > + PcdValue =3D DscPcdEntry.DefaultVa= lue > + if Sku.VpdOffset !=3D TAB_STAR: > + if PcdValue.startswith("{"): > + Alignment =3D 8 > + elif PcdValue.startswith("L"): > + Alignment =3D 2 > + else: > + Alignment =3D 1 > + try: > + VpdOffset =3D int(Sku.VpdOffset= ) > + except: > + try: > + VpdOffset =3D int(Sku.VpdOf= fset, 16) > + except: > + EdkLogger.error("build", FO= RMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Ds= cPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName)) > + if VpdOffset % Alignment !=3D 0: > + if PcdValue.startswith("{"): > + EdkLogger.warn("build", "Th= e offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSpac= eGuidCName, DscPcdEntry.TokenCName), File=3Dself.MetaFile) > + else: > + EdkLogger.error("build", FO= RMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' 
% = (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment)) > + if PcdValue not in SkuValueMap: > + SkuValueMap[PcdValue] =3D [] > + VpdFile.Add(DscPcdEntry, SkuName, S= ku.VpdOffset) > + SkuValueMap[PcdValue].append(Sku) > + if not NeedProcessVpdMapFile and Sku.Vp= dOffset =3D=3D TAB_STAR: > + NeedProcessVpdMapFile =3D True > + if DscPcdEntry.DatumType =3D=3D TAB_VOID an= d PcdValue.startswith("L"): > + UnicodePcdArray.add(DscPcdEntry) > + elif len(Sku.VariableName) > 0: > + HiiPcdArray.add(DscPcdEntry) > + else: > + OtherPcdArray.add(DscPcdEntry) > + > + # if the offset of a VPD is *, then it = need to be fixed up by third party tool. > + VpdSkuMap[DscPcd] =3D SkuValueMap > + if (self.Platform.FlashDefinition is None or self.Platform.= FlashDefinition =3D=3D '') and \ > + VpdFile.GetCount() !=3D 0: > + EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, > + "Fail to get FLASH_DEFINITION definitio= n in DSC file %s which is required when DSC contains VPD PCD." % str(self.P= latform.MetaFile)) > + > + if VpdFile.GetCount() !=3D 0: > + > + self.FixVpdOffset(VpdFile) > + > + self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile)) > + PcdNvStoreDfBuffer =3D [item for item in self._DynamicP= cdList if item.TokenCName =3D=3D "PcdNvStoreDefaultValueBuffer" and item.To= kenSpaceGuidCName =3D=3D "gEfiMdeModulePkgTokenSpaceGuid"] > + if PcdNvStoreDfBuffer: > + PcdName,PcdGuid =3D PcdNvStoreDfBuffer[0].TokenCNam= e, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName > + if (PcdName,PcdGuid) in VpdSkuMap: > + DefaultSku =3D PcdNvStoreDfBuffer[0].SkuInfoLis= t.get(TAB_DEFAULT) > + VpdSkuMap[(PcdName,PcdGuid)] =3D {DefaultSku.De= faultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values()= ]} > + > + # Process VPD map file generated by third party BPDG to= ol > + if NeedProcessVpdMapFile: > + VpdMapFilePath =3D os.path.join(self.BuildDir, TAB_= FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid) > + if os.path.exists(VpdMapFilePath): > + 
VpdFile.Read(VpdMapFilePath) > + > + # Fixup TAB_STAR offset > + for pcd in VpdSkuMap: > + vpdinfo =3D VpdFile.GetVpdInfo(pcd) > + if vpdinfo is None: > + # just pick the a value to determine whethe= r is unicode string type > + continue > + for pcdvalue in VpdSkuMap[pcd]: > + for sku in VpdSkuMap[pcd][pcdvalue]: > + for item in vpdinfo: > + if item[2] =3D=3D pcdvalue: > + sku.VpdOffset =3D item[1] > + else: > + EdkLogger.error("build", FILE_READ_FAILURE, "Ca= n not find VPD map file %s to fix up VPD offset." % VpdMapFilePath) > + > + # Delete the DynamicPcdList At the last time enter into thi= s function > + for Pcd in self._DynamicPcdList: > + # just pick the a value to determine whether is unicode= string type > + Sku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > + > + if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UIN= T32, TAB_UINT64, TAB_VOID, "BOOLEAN"]: > + Pcd.DatumType =3D TAB_VOID > + > + PcdValue =3D Sku.DefaultValue > + if Pcd.DatumType =3D=3D TAB_VOID and PcdValue.startswit= h("L"): > + # if found PCD which datum value is unicode string = the insert to left size of UnicodeIndex > + UnicodePcdArray.add(Pcd) > + elif len(Sku.VariableName) > 0: > + # if found HII type PCD then insert to right of Uni= codeIndex > + HiiPcdArray.add(Pcd) > + else: > + OtherPcdArray.add(Pcd) > + del self._DynamicPcdList[:] > + self._DynamicPcdList.extend(list(UnicodePcdArray)) > + self._DynamicPcdList.extend(list(HiiPcdArray)) > + self._DynamicPcdList.extend(list(OtherPcdArray)) > + allskuset =3D [(SkuName, Sku.SkuId) for pcd in self._DynamicPcd= List for (SkuName, Sku) in pcd.SkuInfoList.items()] > + for pcd in self._DynamicPcdList: > + if len(pcd.SkuInfoList) =3D=3D 1: > + for (SkuName, SkuId) in allskuset: > + if isinstance(SkuId, str) and eval(SkuId) =3D=3D 0 = or SkuId =3D=3D 0: > + continue > + pcd.SkuInfoList[SkuName] =3D copy.deepcopy(pcd.SkuI= nfoList[TAB_DEFAULT]) > + pcd.SkuInfoList[SkuName].SkuId =3D SkuId > + 
pcd.SkuInfoList[SkuName].SkuIdName = SkuName
> +
> +    def FixVpdOffset(self, VpdFile ):
> +        FvPath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
> +        if not os.path.exists(FvPath):
> +            try:
> +                os.makedirs(FvPath)
> +            except:
> +                EdkLogger.error("build", FILE_WRITE_FAILURE, "Fail to create FV folder under %s" % self.BuildDir)
> +
> +        VpdFilePath = os.path.join(FvPath, "%s.txt" % self.Platform.VpdToolGuid)
> +
> +        if VpdFile.Write(VpdFilePath):
> +            # retrieve BPDG tool's path from tool_def.txt according to VPD_TOOL_GUID defined in DSC file.
> +            BPDGToolName = None
> +            for ToolDef in self.ToolDefinition.values():
> +                if TAB_GUID in ToolDef and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
> +                    if "PATH" not in ToolDef:
> +                        EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
> +                    BPDGToolName = ToolDef["PATH"]
> +                    break
> +            # Call third party GUID BPDG tool.
> +            if BPDGToolName is not None:
> +                VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFilePath)
> +            else:
> +                EdkLogger.error("Build", FILE_NOT_FOUND, "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
> +
> +    ## Return the platform build data object
> +    @cached_property
> +    def Platform(self):
> +        return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
> +
> +    ## Return platform name
> +    @cached_property
> +    def Name(self):
> +        return self.Platform.PlatformName
> +
> +    ## Return the meta file GUID
> +    @cached_property
> +    def Guid(self):
> +        return self.Platform.Guid
> +
> +    ## Return the platform version
> +    @cached_property
> +    def Version(self):
> +        return self.Platform.Version
> +
> +    ## Return the FDF file name
> +    @cached_property
> +    def FdfFile(self):
> +        if self.Workspace.FdfFile:
> +            RetVal = mws.join(self.WorkspaceDir, self.Workspace.FdfFile)
> +        else:
> +            RetVal = ''
> +        return RetVal
> +
> +    ## Return the build output directory platform specifies
> +    @cached_property
> +    def OutputDir(self):
> +        return self.Platform.OutputDirectory
> +
> +    ## Return the directory to store all intermediate and final files built
> +    @cached_property
> +    def BuildDir(self):
> +        if os.path.isabs(self.OutputDir):
> +            GlobalData.gBuildDirectory = RetVal = path.join(
> +                path.abspath(self.OutputDir),
> +                self.BuildTarget + "_" + self.ToolChain,
> +            )
> +        else:
> +            GlobalData.gBuildDirectory = RetVal = path.join(
> +                self.WorkspaceDir,
> +                self.OutputDir,
> +                self.BuildTarget + "_" + self.ToolChain,
> +            )
> +        return RetVal
> +
> +    ## Return directory of platform makefile
> +    #
> +    #   @retval     string  Makefile directory
> +    #
> +    @cached_property
> +    def MakeFileDir(self):
> +        return path.join(self.BuildDir, self.Arch)
> +
> +    ## Return build command string
> +    #
> +    #   @retval     string  Build command string
> +    #
> +    @cached_property
> +    def BuildCommand(self):
> +        RetVal = []
> +        if "MAKE" in self.ToolDefinition and "PATH" in self.ToolDefinition["MAKE"]:
> +            RetVal += _SplitOption(self.ToolDefinition["MAKE"]["PATH"])
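FixVpdOffset above locates the external BPDG tool by matching the platform's VPD_TOOL_GUID against the GUID attribute of tools_def entries, erroring if the matching entry lacks a PATH. A stand-alone sketch of that lookup, with an illustrative dict layout and a made-up GUID:

```python
# Find the external BPDG tool whose GUID attribute matches the platform's
# VPD_TOOL_GUID; a matching entry without PATH is an error, no match is None.
def find_bpdg_tool(tool_definition, vpd_tool_guid):
    for tool_def in tool_definition.values():
        if tool_def.get("GUID") == vpd_tool_guid:
            if "PATH" not in tool_def:
                raise LookupError("PATH attribute missing for BPDG tool %s"
                                  % vpd_tool_guid)
            return tool_def["PATH"]
    return None

tools = {
    "VPDTOOL": {"GUID": "11111111-2222-3333-4444-555555555555",
                "PATH": "BPDG"},
}
print(find_bpdg_tool(tools, "11111111-2222-3333-4444-555555555555"))  # BPDG
```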
> + if "FLAGS" in self.ToolDefinition["MAKE"]: > + NewOption =3D self.ToolDefinition["MAKE"]["FLAGS"].stri= p() > + if NewOption !=3D '': > + RetVal +=3D _SplitOption(NewOption) > + if "MAKE" in self.EdkIIBuildOption: > + if "FLAGS" in self.EdkIIBuildOption["MAKE"]: > + Flags =3D self.EdkIIBuildOption["MAKE"]["FLAGS"] > + if Flags.startswith('=3D'): > + RetVal =3D [RetVal[0]] + [Flags[1:]] > + else: > + RetVal.append(Flags) > + return RetVal > + > + ## Get tool chain definition > + # > + # Get each tool definition for given tool chain from tools_def.txt= and platform > + # > + @cached_property > + def ToolDefinition(self): > + ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDictionary > + if TAB_TOD_DEFINES_COMMAND_TYPE not in self.Workspace.ToolDef.T= oolsDefTxtDatabase: > + EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No tools = found in configuration", > + ExtraData=3D"[%s]" % self.MetaFile) > + RetVal =3D OrderedDict() > + DllPathList =3D set() > + for Def in ToolDefinition: > + Target, Tag, Arch, Tool, Attr =3D Def.split("_") > + if Target !=3D self.BuildTarget or Tag !=3D self.ToolChain = or Arch !=3D self.Arch: > + continue > + > + Value =3D ToolDefinition[Def] > + # don't record the DLL > + if Attr =3D=3D "DLL": > + DllPathList.add(Value) > + continue > + > + if Tool not in RetVal: > + RetVal[Tool] =3D OrderedDict() > + RetVal[Tool][Attr] =3D Value > + > + ToolsDef =3D '' > + if GlobalData.gOptions.SilentMode and "MAKE" in RetVal: > + if "FLAGS" not in RetVal["MAKE"]: > + RetVal["MAKE"]["FLAGS"] =3D "" > + RetVal["MAKE"]["FLAGS"] +=3D " -s" > + MakeFlags =3D '' > + for Tool in RetVal: > + for Attr in RetVal[Tool]: > + Value =3D RetVal[Tool][Attr] > + if Tool in self._BuildOptionWithToolDef(RetVal) and Att= r in self._BuildOptionWithToolDef(RetVal)[Tool]: > + # check if override is indicated > + if self._BuildOptionWithToolDef(RetVal)[Tool][Attr]= .startswith('=3D'): > + Value =3D self._BuildOptionWithToolDef(RetVal)[= Tool][Attr][1:] > + else: > + 
if Attr !=3D 'PATH': > + Value +=3D " " + self._BuildOptionWithToolD= ef(RetVal)[Tool][Attr] > + else: > + Value =3D self._BuildOptionWithToolDef(RetV= al)[Tool][Attr] > + > + if Attr =3D=3D "PATH": > + # Don't put MAKE definition in the file > + if Tool !=3D "MAKE": > + ToolsDef +=3D "%s =3D %s\n" % (Tool, Value) > + elif Attr !=3D "DLL": > + # Don't put MAKE definition in the file > + if Tool =3D=3D "MAKE": > + if Attr =3D=3D "FLAGS": > + MakeFlags =3D Value > + else: > + ToolsDef +=3D "%s_%s =3D %s\n" % (Tool, Attr, V= alue) > + ToolsDef +=3D "\n" > + > + tool_def_file =3D os.path.join(self.MakeFileDir, "TOOLS_DEF." += self.Arch) > + SaveFileOnChange(tool_def_file, ToolsDef, False) > + for DllPath in DllPathList: > + os.environ["PATH"] =3D DllPath + os.pathsep + os.environ["P= ATH"] > + os.environ["MAKE_FLAGS"] =3D MakeFlags > + > + return RetVal > + > + ## Return the paths of tools > + @cached_property > + def ToolDefinitionFile(self): > + tool_def_file =3D os.path.join(self.MakeFileDir, "TOOLS_DEF." += self.Arch) > + if not os.path.exists(tool_def_file): > + self.ToolDefinition > + return tool_def_file > + > + ## Retrieve the toolchain family of given toolchain tag. Default to= 'MSFT'. > + @cached_property > + def ToolChainFamily(self): > + ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDatabase > + if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \ > + or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FAMI= LY] \ > + or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain= ]: > + EdkLogger.verbose("No tool chain family found in configurat= ion for %s. Default to MSFT." 
\ > + % self.ToolChain) > + RetVal =3D TAB_COMPILER_MSFT > + else: > + RetVal =3D ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.Tool= Chain] > + return RetVal > + > + @cached_property > + def BuildRuleFamily(self): > + ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDatabase > + if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \ > + or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BUIL= DRULEFAMILY] \ > + or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.= ToolChain]: > + EdkLogger.verbose("No tool chain family found in configurat= ion for %s. Default to MSFT." \ > + % self.ToolChain) > + return TAB_COMPILER_MSFT > + > + return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.Too= lChain] > + > + ## Return the build options specific for all modules in this platfo= rm > + @cached_property > + def BuildOption(self): > + return self._ExpandBuildOption(self.Platform.BuildOptions) > + > + def _BuildOptionWithToolDef(self, ToolDef): > + return self._ExpandBuildOption(self.Platform.BuildOptions, Tool= Def=3DToolDef) > + > + ## Return the build options specific for EDK modules in this platfo= rm > + @cached_property > + def EdkBuildOption(self): > + return self._ExpandBuildOption(self.Platform.BuildOptions, EDK_= NAME) > + > + ## Return the build options specific for EDKII modules in this plat= form > + @cached_property > + def EdkIIBuildOption(self): > + return self._ExpandBuildOption(self.Platform.BuildOptions, EDKI= I_NAME) > + > + ## Parse build_rule.txt in Conf Directory. 
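ToolChainFamily and BuildRuleFamily above share one fallback pattern: when the tools_def database has no entry (or an empty entry) for the tool chain, default to MSFT. A compact sketch of that lookup, with an illustrative dict shape:

```python
# Fallback pattern shared by ToolChainFamily / BuildRuleFamily: use the
# tools_def entry when present and non-empty, otherwise default to MSFT.
def tool_chain_family(tool_def_db, tool_chain, key="FAMILY"):
    families = tool_def_db.get(key, {})
    return families.get(tool_chain) or "MSFT"

db = {"FAMILY": {"GCC5": "GCC", "VS2019": ""}}
print(tool_chain_family(db, "GCC5"))    # GCC
print(tool_chain_family(db, "VS2019"))  # MSFT (empty entry falls back)
print(tool_chain_family(db, "XCODE5"))  # MSFT (missing entry falls back)
```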
> + # > + # @retval BuildRule object > + # > + @cached_property > + def BuildRule(self): > + BuildRuleFile =3D None > + if TAB_TAT_DEFINES_BUILD_RULE_CONF in self.Workspace.TargetTxt.= TargetTxtDictionary: > + BuildRuleFile =3D self.Workspace.TargetTxt.TargetTxtDiction= ary[TAB_TAT_DEFINES_BUILD_RULE_CONF] > + if not BuildRuleFile: > + BuildRuleFile =3D gDefaultBuildRuleFile > + RetVal =3D BuildRule(BuildRuleFile) > + if RetVal._FileVersion =3D=3D "": > + RetVal._FileVersion =3D AutoGenReqBuildRuleVerNum > + else: > + if RetVal._FileVersion < AutoGenReqBuildRuleVerNum : > + # If Build Rule's version is less than the version numb= er required by the tools, halting the build. > + EdkLogger.error("build", AUTOGEN_ERROR, > + ExtraData=3D"The version number [%s] of= build_rule.txt is less than the version number required by the AutoGen.(th= e minimum required version number is [%s])"\ > + % (RetVal._FileVersion, AutoGenReqBuil= dRuleVerNum)) > + return RetVal > + > + ## Summarize the packages used by modules in this platform > + @cached_property > + def PackageList(self): > + RetVal =3D set() > + for Mb in self._MbList: > + RetVal.update(Mb.Packages) > + for lb in Mb.LibInstances: > + RetVal.update(lb.Packages) > + #Collect package set information from INF of FDF > + for ModuleFile in self._AsBuildModuleList: > + if ModuleFile in self.Platform.Modules: > + continue > + ModuleData =3D self.BuildDatabase[ModuleFile, self.Arch, se= lf.BuildTarget, self.ToolChain] > + RetVal.update(ModuleData.Packages) > + return list(RetVal) > + > + @cached_property > + def NonDynamicPcdDict(self): > + return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd i= n self.NonDynamicPcdList} > + > + ## Get list of non-dynamic PCDs > + @property > + def NonDynamicPcdList(self): > + if not self._NonDynamicPcdList: > + self.CollectPlatformDynamicPcds() > + return self._NonDynamicPcdList > + > + ## Get list of dynamic PCDs > + @property > + def DynamicPcdList(self): > + if not 
self._DynamicPcdList: > + self.CollectPlatformDynamicPcds() > + return self._DynamicPcdList > + > + ## Generate Token Number for all PCD > + @cached_property > + def PcdTokenNumber(self): > + RetVal =3D OrderedDict() > + TokenNumber =3D 1 > + # > + # Make the Dynamic and DynamicEx PCD use within different Token= Number area. > + # Such as: > + # > + # Dynamic PCD: > + # TokenNumber 0 ~ 10 > + # DynamicEx PCD: > + # TokeNumber 11 ~ 20 > + # > + for Pcd in self.DynamicPcdList: > + if Pcd.Phase =3D=3D "PEI" and Pcd.Type in PCD_DYNAMIC_TYPE_= SET: > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %= (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D Tok= enNumber > + TokenNumber +=3D 1 > + > + for Pcd in self.DynamicPcdList: > + if Pcd.Phase =3D=3D "PEI" and Pcd.Type in PCD_DYNAMIC_EX_TY= PE_SET: > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %= (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D Tok= enNumber > + TokenNumber +=3D 1 > + > + for Pcd in self.DynamicPcdList: > + if Pcd.Phase =3D=3D "DXE" and Pcd.Type in PCD_DYNAMIC_TYPE_= SET: > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %= (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D Tok= enNumber > + TokenNumber +=3D 1 > + > + for Pcd in self.DynamicPcdList: > + if Pcd.Phase =3D=3D "DXE" and Pcd.Type in PCD_DYNAMIC_EX_TY= PE_SET: > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %= (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D Tok= enNumber > + TokenNumber +=3D 1 > + > + for Pcd in self.NonDynamicPcdList: > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D TokenNu= mber > + TokenNumber +=3D 1 > + return RetVal > + > + @cached_property > + def _MbList(self): > + return 
[self.BuildDatabase[m, self.Arch, self.BuildTarget, self= .ToolChain] for m in self.Platform.Modules] > + > + @cached_property > + def _MaList(self): > + for ModuleFile in self.Platform.Modules: > + Ma =3D ModuleAutoGen( > + self.Workspace, > + ModuleFile, > + self.BuildTarget, > + self.ToolChain, > + self.Arch, > + self.MetaFile, > + self.DataPipe > + ) > + self.Platform.Modules[ModuleFile].M =3D Ma > + return [x.M for x in self.Platform.Modules.values()] > + > + ## Summarize ModuleAutoGen objects of all modules to be built for t= his platform > + @cached_property > + def ModuleAutoGenList(self): > + RetVal =3D [] > + for Ma in self._MaList: > + if Ma not in RetVal: > + RetVal.append(Ma) > + return RetVal > + > + ## Summarize ModuleAutoGen objects of all libraries to be built for= this platform > + @cached_property > + def LibraryAutoGenList(self): > + RetVal =3D [] > + for Ma in self._MaList: > + for La in Ma.LibraryAutoGenList: > + if La not in RetVal: > + RetVal.append(La) > + if Ma not in La.ReferenceModules: > + La.ReferenceModules.append(Ma) > + return RetVal > + > + ## Test if a module is supported by the platform > + # > + # An error will be raised directly if the module or its arch is no= t supported > + # by the platform or current configuration > + # > + def ValidModule(self, Module): > + return Module in self.Platform.Modules or Module in self.Platfo= rm.LibraryInstances \ > + or Module in self._AsBuildModuleList > + @cached_property > + def GetAllModuleInfo(self,WithoutPcd=3DTrue): > + ModuleLibs =3D set() > + for m in self.Platform.Modules: > + module_obj =3D self.BuildDatabase[m,self.Arch,self.BuildTar= get,self.ToolChain] > + if not bool(module_obj.LibraryClass): > + Libs =3D GetModuleLibInstances(module_obj, self.Platfor= m, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain) > + else: > + Libs =3D [] > + ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,l.= Arch,True) for l in Libs])) > + if WithoutPcd and 
module_obj.PcdIsDriver: > + continue > + ModuleLibs.add((m.File,m.Root,module_obj.Arch,False)) > + > + return ModuleLibs > + > + ## Resolve the library classes in a module to library instances > + # > + # This method will not only resolve library classes but also sort t= he library > + # instances according to the dependency-ship. > + # > + # @param Module The module from which the library classes w= ill be resolved > + # > + # @retval library_list List of library instances sorted > + # > + def ApplyLibraryInstance(self, Module): > + # Cover the case that the binary INF file is list in the FDF fi= le but not DSC file, return empty list directly > + if str(Module) not in self.Platform.Modules: > + return [] > + > + return GetModuleLibInstances(Module, > + self.Platform, > + self.BuildDatabase, > + self.Arch, > + self.BuildTarget, > + self.ToolChain, > + self.MetaFile, > + EdkLogger) > + > + ## Override PCD setting (type, value, ...) > + # > + # @param ToPcd The PCD to be overridden > + # @param FromPcd The PCD overriding from > + # > + def _OverridePcd(self, ToPcd, FromPcd, Module=3D"", Msg=3D"", Libra= ry=3D""): > + # > + # in case there's PCDs coming from FDF file, which have no type= given. 
> + # at this point, ToPcd.Type has the type found from dependent > + # package > + # > + TokenCName =3D ToPcd.TokenCName > + for PcdItem in GlobalData.MixedPcd: > + if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in GlobalD= ata.MixedPcd[PcdItem]: > + TokenCName =3D PcdItem[0] > + break > + if FromPcd is not None: > + if ToPcd.Pending and FromPcd.Type: > + ToPcd.Type =3D FromPcd.Type > + elif ToPcd.Type and FromPcd.Type\ > + and ToPcd.Type !=3D FromPcd.Type and ToPcd.Type in From= Pcd.Type: > + if ToPcd.Type.strip() =3D=3D TAB_PCDS_DYNAMIC_EX: > + ToPcd.Type =3D FromPcd.Type > + elif ToPcd.Type and FromPcd.Type \ > + and ToPcd.Type !=3D FromPcd.Type: > + if Library: > + Module =3D str(Module) + " 's library file (" + str= (Library) + ")" > + EdkLogger.error("build", OPTION_CONFLICT, "Mismatched P= CD type", > + ExtraData=3D"%s.%s is used as [%s] in m= odule %s, but as [%s] in %s."\ > + % (ToPcd.TokenSpaceGuidCName,= TokenCName, > + ToPcd.Type, Module, FromPc= d.Type, Msg), > + File=3Dself.MetaFile) > + > + if FromPcd.MaxDatumSize: > + ToPcd.MaxDatumSize =3D FromPcd.MaxDatumSize > + ToPcd.MaxSizeUserSet =3D FromPcd.MaxDatumSize > + if FromPcd.DefaultValue: > + ToPcd.DefaultValue =3D FromPcd.DefaultValue > + if FromPcd.TokenValue: > + ToPcd.TokenValue =3D FromPcd.TokenValue > + if FromPcd.DatumType: > + ToPcd.DatumType =3D FromPcd.DatumType > + if FromPcd.SkuInfoList: > + ToPcd.SkuInfoList =3D FromPcd.SkuInfoList > + if FromPcd.UserDefinedDefaultStoresFlag: > + ToPcd.UserDefinedDefaultStoresFlag =3D FromPcd.UserDefi= nedDefaultStoresFlag > + # Add Flexible PCD format parse > + if ToPcd.DefaultValue: > + try: > + ToPcd.DefaultValue =3D ValueExpressionEx(ToPcd.Defa= ultValue, ToPcd.DatumType, self.Platform._GuidDict)(True) > + except BadExpression as Value: > + EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.= %s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.De= faultValue, Value), > + File=3Dself.MetaFile) > + > + # check the validation of 
datum
> +            IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
> +            if not IsValid:
> +                EdkLogger.error('build', FORMAT_INVALID, Cause, File=self.MetaFile,
> +                                ExtraData="%s.%s" % (ToPcd.TokenSpaceGuidCName, TokenCName))
> +            ToPcd.validateranges = FromPcd.validateranges
> +            ToPcd.validlists = FromPcd.validlists
> +            ToPcd.expressions = FromPcd.expressions
> +            ToPcd.CustomAttribute = FromPcd.CustomAttribute
> +
> +        if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
> +            EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
> +                            % (ToPcd.TokenSpaceGuidCName, TokenCName))
> +            Value = ToPcd.DefaultValue
> +            if not Value:
> +                ToPcd.MaxDatumSize = '1'
> +            elif Value[0] == 'L':
> +                ToPcd.MaxDatumSize = str((len(Value) - 2) * 2)
> +            elif Value[0] == '{':
> +                ToPcd.MaxDatumSize = str(len(Value.split(',')))
> +            else:
> +                ToPcd.MaxDatumSize = str(len(Value) - 1)
> +
> +        # apply default SKU for dynamic PCDS if specified one is not available
> +        if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
> +            and not ToPcd.SkuInfoList:
> +            if self.Platform.SkuName in self.Platform.SkuIds:
> +                SkuName = self.Platform.SkuName
> +            else:
> +                SkuName = TAB_DEFAULT
> +            ToPcd.SkuInfoList = {
> +                SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
> +            }
> +
> +    ## Apply PCD setting defined platform to a module
> +    #
> +    #   @param  Module  The module from which the PCD setting will be overridden
> +    #
> +    #   @retval PCD_list    The list PCDs with settings from platform
> +    #
> +    def ApplyPcdSetting(self, Module, Pcds, Library=""):
> +        # for each PCD in module
> +        for Name, Guid in Pcds:
> +            PcdInModule = Pcds[Name, Guid]
> +            # find out the PCD setting in platform
> +            if (Name, Guid) in self.Platform.Pcds:
> +                PcdInPlatform = self.Platform.Pcds[Name, Guid]
> +            else:
> +                PcdInPlatform = None
> +            # then override the settings if any
> +            self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
> +            # resolve the VariableGuid value
> +            for SkuId in PcdInModule.SkuInfoList:
> +                Sku = PcdInModule.SkuInfoList[SkuId]
> +                if Sku.VariableGuid == '': continue
> +                Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
> +                if Sku.VariableGuidValue is None:
> +                    PackageList = "\n\t".join(str(P) for P in self.PackageList)
> +                    EdkLogger.error(
> +                                'build',
> +                                RESOURCE_NOT_AVAILABLE,
> +                                "Value of GUID [%s] is not found in" % Sku.VariableGuid,
> +                                ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
> +                                                        % (Guid, Name, str(Module)),
> +                                File=self.MetaFile
> +                                )
> +
> +        # override PCD settings with module specific setting
> +        if Module in self.Platform.Modules:
> +            PlatformModule = self.Platform.Modules[str(Module)]
> +            for Key in PlatformModule.Pcds:
> +                if GlobalData.BuildOptionPcd:
> +                    for pcd in GlobalData.BuildOptionPcd:
> +                        (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
> +                        if (TokenCName, TokenSpaceGuidCName) == Key and FieldName == "":
> +                            PlatformModule.Pcds[Key].DefaultValue = pcdvalue
> +                            PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
> +                            break
> +                Flag = False
> +                if Key in Pcds:
> +                    ToPcd = Pcds[Key]
> +                    Flag = True
> +                elif Key in GlobalData.MixedPcd:
> +                    for PcdItem in GlobalData.MixedPcd[Key]:
> +                        if PcdItem in Pcds:
> +                            ToPcd = Pcds[PcdItem]
> +                            Flag = True
> +                            break
> +                if Flag:
> +                    self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
> +        # use PCD value to calculate the MaxDatumSize when it is not specified
> +        for Name, Guid in Pcds:
> +            Pcd = Pcds[Name, Guid]
> +            if Pcd.DatumType == TAB_VOID and not Pcd.MaxDatumSize:
> +                Pcd.MaxSizeUserSet = None
> +                Value = Pcd.DefaultValue
> +                if not Value:
> +                    Pcd.MaxDatumSize = '1'
> +                elif Value[0] == 'L':
> +                    Pcd.MaxDatumSize = str((len(Value) - 2) * 2)
> +                elif Value[0] == '{':
> +                    Pcd.MaxDatumSize = str(len(Value.split(',')))
> +                else:
> +                    Pcd.MaxDatumSize = str(len(Value) - 1)
> +        return list(Pcds.values())
> +
> +    ## Append build options in platform to a module
> +    #
> +    #   @param  Module  The module to which the build options will be appended
> +    #
> +    #   @retval options     The options appended with build options in platform
> +    #
> +    def ApplyBuildOption(self, Module):
> +        # Get the different options for the different style module
> +        PlatformOptions = self.EdkIIBuildOption
> +        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
> +        ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
> +        ModuleOptions = self._ExpandBuildOption(Module.BuildOptions)
> +        if Module in self.Platform.Modules:
> +            PlatformModule = self.Platform.Modules[str(Module)]
> +            PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
> +        else:
> +            PlatformModuleOptions = {}
> +
> +        BuildRuleOrder = None
> +        for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
> +            for Tool in Options:
> +                for Attr in Options[Tool]:
> +                    if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
> +                        BuildRuleOrder = Options[Tool][Attr]
> +
> +        AllTools = set(list(ModuleOptions.keys()) + list(PlatformOptions.keys()) +
> +                       list(PlatformModuleOptions.keys()) + list(ModuleTypeOptions.keys()) +
> +                       list(self.ToolDefinition.keys()))
> +        BuildOptions = defaultdict(lambda: defaultdict(str))
> +        for Tool in AllTools:
> +            for Options in [self.ToolDefinition, ModuleOptions, PlatformOptions, ModuleTypeOptions, PlatformModuleOptions]:
> +                if Tool not in Options:
> +                    continue
> +                for Attr in Options[Tool]:
> +                    #
> +                    # Do not generate it in Makefile
> +                    #
> +                    if Attr == TAB_TOD_DEFINES_BUILDRULEORDER:
> +                        continue
> +                    Value = Options[Tool][Attr]
> +                    # check if override is indicated
> +                    if Value.startswith('='):
> +                        BuildOptions[Tool][Attr] = mws.handleWsMacro(Value[1:])
> +                    else:
> +                        if Attr != 'PATH':
> +                            BuildOptions[Tool][Attr] += " " + mws.handleWsMacro(Value)
> +                        else:
> +                            BuildOptions[Tool][Attr] = mws.handleWsMacro(Value)
> +
> +        return BuildOptions, BuildRuleOrder
> +
> +
> +    def GetGlobalBuildOptions(self,Module):
> +        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
> +        ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
> +
> +        if Module in self.Platform.Modules:
> +            PlatformModule = self.Platform.Modules[str(Module)]
> +            PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
> +        else:
> +            PlatformModuleOptions = {}
> +
> +        return ModuleTypeOptions,PlatformModuleOptions
> +    def ModuleGuid(self,Module):
> +        if os.path.basename(Module.MetaFile.File) != os.path.basename(Module.MetaFile.Path):
> +            #
> +            # Length of GUID is 36
> +            #
> +            return os.path.basename(Module.MetaFile.Path)[:36]
> +        return Module.Guid
> +    @cached_property
> +    def UniqueBaseName(self):
> +        retVal ={}
> +        ModuleNameDict = {}
> +        UniqueName = {}
> +        for Module in self._MbList:
> +            unique_base_name = '%s_%s' % (Module.BaseName,self.ModuleGuid(Module))
> +            if unique_base_name not in ModuleNameDict:
> +                ModuleNameDict[unique_base_name] = []
> +            ModuleNameDict[unique_base_name].append(Module.MetaFile)
> +            if Module.BaseName not in UniqueName:
> +                UniqueName[Module.BaseName] = set()
> +            UniqueName[Module.BaseName].add((self.ModuleGuid(Module),Module.MetaFile))
> +        for module_paths in ModuleNameDict.values():
> +            if len(module_paths) > 1 and len(set(module_paths))>1:
> +                samemodules = list(set(module_paths))
> +                EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
> +                                    '  %s\n  %s' % (samemodules[0], samemodules[1]))
> +        for name in UniqueName:
> +            Guid_Path = UniqueName[name]
> +            if len(Guid_Path) > 1:
> +                retVal[name] = '%s_%s' % (name,Guid_Path.pop()[0])
> +        return retVal
> +    ## Expand * in build option key
> +    #
> +    #   @param  Options     Options to be expanded
> +    #   @param  ToolDef     Use specified ToolDef instead of full version.
> +    #                       This is needed during initialization to prevent
> +    #                       infinite recursion betweeh BuildOptions,
> +    #                       ToolDefinition, and this function.
> +    #
> +    #   @retval options     Options expanded
> +    #
> +    def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
> +        if not ToolDef:
> +            ToolDef = self.ToolDefinition
> +        BuildOptions = {}
> +        FamilyMatch = False
> +        FamilyIsNull = True
> +
> +        OverrideList = {}
> +        #
> +        # Construct a list contain the build options which need override.
> +        #
> +        for Key in Options:
> +            #
> +            # Key[0] -- tool family
> +            # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
> +            #
> +            if (Key[0] == self.BuildRuleFamily and
> +                (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
> +                Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
> +                if (Target == self.BuildTarget or Target == TAB_STAR) and\
> +                    (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
> +                    (Arch == self.Arch or Arch == TAB_STAR) and\
> +                    Options[Key].startswith("="):
> +
> +                    if OverrideList.get(Key[1]) is not None:
> +                        OverrideList.pop(Key[1])
> +                    OverrideList[Key[1]] = Options[Key]
> +
> +        #
> +        # Use the highest priority value.
> +        #
> +        if (len(OverrideList) >= 2):
> +            KeyList = list(OverrideList.keys())
> +            for Index in range(len(KeyList)):
> +                NowKey = KeyList[Index]
> +                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
> +                for Index1 in range(len(KeyList) - Index - 1):
> +                    NextKey = KeyList[Index1 + Index + 1]
> +                    #
> +                    # Compare two Key, if one is included by another, choose the higher priority one
> +                    #
> +                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
> +                    if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
> +                        (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
> +                        (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
> +                        (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
> +                        (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
> +
> +                        if CalculatePriorityValue(NowKey) > CalculatePriorityValue(NextKey):
> +                            if Options.get((self.BuildRuleFamily, NextKey)) is not None:
> +                                Options.pop((self.BuildRuleFamily, NextKey))
> +                        else:
> +                            if Options.get((self.BuildRuleFamily, NowKey)) is not None:
> +                                Options.pop((self.BuildRuleFamily, NowKey))
> +
> +        for Key in Options:
> +            if ModuleStyle is not None and len (Key) > 2:
> +                # Check Module style is EDK or EDKII.
> +                # Only append build option for the matched style module.
> +                if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
> +                    continue
> +                elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
> +                    continue
> +            Family = Key[0]
> +            Target, Tag, Arch, Tool, Attr = Key[1].split("_")
> +            # if tool chain family doesn't match, skip it
> +            if Tool in ToolDef and Family != "":
> +                FamilyIsNull = False
> +                if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY, "") != "":
> +                    if Family != ToolDef[Tool][TAB_TOD_DEFINES_BUILDRULEFAMILY]:
> +                        continue
> +                elif Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
> +                    continue
> +                FamilyMatch = True
> +            # expand any wildcard
> +            if Target == TAB_STAR or Target == self.BuildTarget:
> +                if Tag == TAB_STAR or Tag == self.ToolChain:
> +                    if Arch == TAB_STAR or Arch == self.Arch:
> +                        if Tool not in BuildOptions:
> +                            BuildOptions[Tool] = {}
> +                        if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
> +                            BuildOptions[Tool][Attr] = Options[Key]
> +                        else:
> +                            # append options for the same tool except PATH
> +                            if Attr != 'PATH':
> +                                BuildOptions[Tool][Attr] += " " + Options[Key]
> +                            else:
> +                                BuildOptions[Tool][Attr] = Options[Key]
> +        # Build Option Family has been checked, which need't to be checked again for family.
> +        if FamilyMatch or FamilyIsNull:
> +            return BuildOptions
> +
> +        for Key in Options:
> +            if ModuleStyle is not None and len (Key) > 2:
> +                # Check Module style is EDK or EDKII.
> +                # Only append build option for the matched style module.
> +                if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
> +                    continue
> +                elif ModuleStyle == EDKII_NAME and Key[2] != EDKII_NAME:
> +                    continue
> +            Family = Key[0]
> +            Target, Tag, Arch, Tool, Attr = Key[1].split("_")
> +            # if tool chain family doesn't match, skip it
> +            if Tool not in ToolDef or Family == "":
> +                continue
> +            # option has been added before
> +            if Family != ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]:
> +                continue
> +
> +            # expand any wildcard
> +            if Target == TAB_STAR or Target == self.BuildTarget:
> +                if Tag == TAB_STAR or Tag == self.ToolChain:
> +                    if Arch == TAB_STAR or Arch == self.Arch:
> +                        if Tool not in BuildOptions:
> +                            BuildOptions[Tool] = {}
> +                        if Attr != "FLAGS" or Attr not in BuildOptions[Tool] or Options[Key].startswith('='):
> +                            BuildOptions[Tool][Attr] = Options[Key]
> +                        else:
> +                            # append options for the same tool except PATH
> +                            if Attr != 'PATH':
> +                                BuildOptions[Tool][Attr] += " " + Options[Key]
> +                            else:
> +                                BuildOptions[Tool][Attr] = Options[Key]
> +        return BuildOptions
> diff --git a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
> new file mode 100644
> index 000000000000..22a7d996fd3b
> --- /dev/null
> +++ b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
> @@ -0,0 +1,904 @@
> +## @file
> +# Create makefile for MS nmake and GNU make
> +#
> +# Copyright (c) 2019, Intel Corporation. All rights reserved.
> +# SPDX-License-Identifier: BSD-2-Clause-Patent
> +#
> +
> +## Import Modules
> +#
> +from __future__ import print_function
> +from __future__ import absolute_import
> +import os.path as path
> +import hashlib
> +from collections import defaultdict
> +from GenFds.FdfParser import FdfParser
> +from Workspace.WorkspaceCommon import GetModuleLibInstances
> +from AutoGen import GenMake
> +from AutoGen.AutoGen import AutoGen
> +from AutoGen.PlatformAutoGen import PlatformAutoGen
> +from AutoGen.BuildEngine import gDefaultBuildRuleFile
> +from Common.ToolDefClassObject import gDefaultToolsDefFile
> +from Common.StringUtils import NormPath
> +from Common.BuildToolError import *
> +from Common.DataType import *
> +from Common.Misc import *
> +
> +## Regular expression for splitting Dependency Expression string into tokens
> +gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
> +
> +## Regular expression for match: PCD(xxxx.yyy)
> +gPCDAsGuidPattern = re.compile(r"^PCD\(.+\..+\)$")
> +
> +## Workspace AutoGen class
> +#
> +#  This class is used mainly to control the whole platform build for different
> +#  architecture. This class will generate top level makefile.
> +#
> +class WorkspaceAutoGen(AutoGen):
> +    # call super().__init__ then call the worker function with different parameter count
> +    def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
> +        if not hasattr(self, "_Init"):
> +            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
> +            self._Init = True
> +
> +    ## Initialize WorkspaceAutoGen
> +    #
> +    #   @param  WorkspaceDir            Root directory of workspace
> +    #   @param  ActivePlatform          Meta-file of active platform
> +    #   @param  Target                  Build target
> +    #   @param  Toolchain               Tool chain name
> +    #   @param  ArchList                List of architecture of current build
> +    #   @param  MetaFileDb              Database containing meta-files
> +    #   @param  BuildConfig             Configuration of build
> +    #   @param  ToolDefinition          Tool chain definitions
> +    #   @param  FlashDefinitionFile     File of flash definition
> +    #   @param  Fds                     FD list to be generated
> +    #   @param  Fvs                     FV list to be generated
> +    #   @param  Caps                    Capsule list to be generated
> +    #   @param  SkuId                   SKU id from command line
> +    #
> +    def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolchain, ArchList, MetaFileDb,
> +              BuildConfig, ToolDefinition, FlashDefinitionFile='', Fds=None, Fvs=None, Caps=None, SkuId='', UniFlag=None,
> +              Progress=None, BuildModule=None):
> +        self.BuildDatabase = MetaFileDb
> +        self.MetaFile      = ActivePlatform
> +        self.WorkspaceDir  = WorkspaceDir
> +        self.Platform      = self.BuildDatabase[self.MetaFile, TAB_ARCH_COMMON, Target, Toolchain]
> +        GlobalData.gActivePlatform = self.Platform
> +        self.BuildTarget   = Target
> +        self.ToolChain     = Toolchain
> +        self.ArchList      = ArchList
> +        self.SkuId         = SkuId
> +        self.UniFlag       = UniFlag
> +
> +        self.TargetTxt     = BuildConfig
> +        self.ToolDef       = ToolDefinition
> +        self.FdfFile       = FlashDefinitionFile
> +        self.FdTargetList  = Fds if Fds else []
> +        self.FvTargetList  = Fvs if Fvs else []
> +        self.CapTargetList = Caps if Caps else []
> +        self.AutoGenObjectList = []
> +        self._GuidDict = {}
> +
> +        # there's many relative directory operations, so ...
> +        os.chdir(self.WorkspaceDir)
> +
> +        self.MergeArch()
> +        self.ValidateBuildTarget()
> +
> +        EdkLogger.info("")
> +        if self.ArchList:
> +            EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
> +        EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
> +        EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
> +
> +        EdkLogger.info('\n%-24s = %s' % ("Active Platform", self.Platform))
> +        if BuildModule:
> +            EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
> +
> +        if self.FdfFile:
> +            EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
> +
> +        EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
> +
> +        if Progress:
> +            Progress.Start("\nProcessing meta-data")
> +        #
> +        # Mark now build in AutoGen Phase
> +        #
> +        GlobalData.gAutoGenPhase = True
> +        self.ProcessModuleFromPdf()
> +        self.ProcessPcdType()
> +        self.ProcessMixedPcd()
> +        self.VerifyPcdsFromFDF()
> +        self.CollectAllPcds()
> +        self.GeneratePkgLevelHash()
> +        #
> +        # Check PCDs token value conflict in each DEC file.
> +        #
> +        self._CheckAllPcdsTokenValueConflict()
> +        #
> +        # Check PCD type and definition between DSC and DEC
> +        #
> +        self._CheckPcdDefineAndType()
> +
> +        self.CreateBuildOptionsFile()
> +        self.CreatePcdTokenNumberFile()
> +        self.CreateModuleHashInfo()
> +        GlobalData.gAutoGenPhase = False
> +
> +    #
> +    # Merge Arch
> +    #
> +    def MergeArch(self):
> +        if not self.ArchList:
> +            ArchList = set(self.Platform.SupArchList)
> +        else:
> +            ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
> +        if not ArchList:
> +            EdkLogger.error("build", PARAMETER_INVALID,
> +                            ExtraData = "Invalid ARCH specified. [Valid ARCH: %s]" % (" ".join(self.Platform.SupArchList)))
> +        elif self.ArchList and len(ArchList) != len(self.ArchList):
> +            SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
> +            EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
> +                              % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
> +        self.ArchList = tuple(ArchList)
> +
> +    # Validate build target
> +    def ValidateBuildTarget(self):
> +        if self.BuildTarget not in self.Platform.BuildTargets:
> +            EdkLogger.error("build", PARAMETER_INVALID,
> +                            ExtraData="Build target [%s] is not supported by the platform. [Valid target: %s]"
> +                                      % (self.BuildTarget, " ".join(self.Platform.BuildTargets)))
> +    @cached_property
> +    def FdfProfile(self):
> +        if not self.FdfFile:
> +            self.FdfFile = self.Platform.FlashDefinition
> +
> +        FdfProfile = None
> +        if self.FdfFile:
> +            Fdf = FdfParser(self.FdfFile.Path)
> +            Fdf.ParseFile()
> +            GlobalData.gFdfParser = Fdf
> +            if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile.FdDict:
> +                FdDict = Fdf.Profile.FdDict[Fdf.CurrentFdName]
> +                for FdRegion in FdDict.RegionList:
> +                    if str(FdRegion.RegionType) is 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
> +                        if int(FdRegion.Offset) % 8 != 0:
> +                            EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
> +            FdfProfile = Fdf.Profile
> +        else:
> +            if self.FdTargetList:
> +                EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
> +                self.FdTargetList = []
> +            if self.FvTargetList:
> +                EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
> +                self.FvTargetList = []
> +            if self.CapTargetList:
> +                EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
> +                self.CapTargetList = []
> +
> +        return FdfProfile
> +
> +    def ProcessModuleFromPdf(self):
> +
> +        if self.FdfProfile:
> +            for fvname in self.FvTargetList:
> +                if fvname.upper() not in self.FdfProfile.FvDict:
> +                    EdkLogger.error("build", OPTION_VALUE_INVALID,
> +                                    "No such an FV in FDF file: %s" % fvname)
> +
> +            # In DSC file may use FILE_GUID to override the module, then in the Platform.Modules use FILE_GUIDmodule.inf as key,
> +            # but the path (self.MetaFile.Path) is the real path
> +            for key in self.FdfProfile.InfDict:
> +                if key == 'ArchTBD':
> +                    MetaFile_cache = defaultdict(set)
> +                    for Arch in self.ArchList:
> +                        Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> +                        for Pkey in Current_Platform_cache.Modules:
> +                            MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
> +                    for Inf in self.FdfProfile.InfDict[key]:
> +                        ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> +                        for Arch in self.ArchList:
> +                            if ModuleFile in MetaFile_cache[Arch]:
> +                                break
> +                        else:
> +                            ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> +                            if not ModuleData.IsBinaryModule:
> +                                EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
> +
> +                else:
> +                    for Arch in self.ArchList:
> +                        if Arch == key:
> +                            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> +                            MetaFileList = set()
> +                            for Pkey in Platform.Modules:
> +                                MetaFileList.add(Platform.Modules[Pkey].MetaFile)
> +                            for Inf in self.FdfProfile.InfDict[key]:
> +                                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> +                                if ModuleFile in MetaFileList:
> +                                    continue
> +                                ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> +                                if not ModuleData.IsBinaryModule:
> +                                    EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
> +
> +
> +
> +    # parse FDF file to get PCDs in it, if any
> +    def VerifyPcdsFromFDF(self):
> +
> +        if self.FdfProfile:
> +            PcdSet = self.FdfProfile.PcdDict
> +            self.VerifyPcdDeclearation(PcdSet)
> +
> +    def ProcessPcdType(self):
> +        for Arch in self.ArchList:
> +            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> +            Platform.Pcds
> +            # generate the SourcePcdDict and BinaryPcdDict
> +            Libs = []
> +            for BuildData in list(self.BuildDatabase._CACHE_.values()):
> +                if BuildData.Arch != Arch:
> +                    continue
> +                if BuildData.MetaFile.Ext == '.inf' and str(BuildData) in Platform.Modules :
> +                    Libs.extend(GetModuleLibInstances(BuildData, Platform,
> +                                     self.BuildDatabase,
> +                                     Arch,
> +                                     self.BuildTarget,
> +                                     self.ToolChain
> +                                     ))
> +            for BuildData in list(self.BuildDatabase._CACHE_.values()):
> +                if BuildData.Arch != Arch:
> +                    continue
> +                if BuildData.MetaFile.Ext == '.inf':
> +                    for key in BuildData.Pcds:
> +                        if BuildData.Pcds[key].Pending:
> +                            if key in Platform.Pcds:
> +                                PcdInPlatform = Platform.Pcds[key]
> +                                if PcdInPlatform.Type:
> +                                    BuildData.Pcds[key].Type = PcdInPlatform.Type
> +                                    BuildData.Pcds[key].Pending = False
> +
> +                            if BuildData.MetaFile in Platform.Modules:
> +                                PlatformModule = Platform.Modules[str(BuildData.MetaFile)]
> +                                if key in PlatformModule.Pcds:
> +                                    PcdInPlatform = PlatformModule.Pcds[key]
> +                                    if PcdInPlatform.Type:
> +                                        BuildData.Pcds[key].Type = PcdInPlatform.Type
> +                                        BuildData.Pcds[key].Pending = False
> +                        else:
> +                            #Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
> +                            if BuildData.Pcds[key].Pending:
> +                                if bool(BuildData.LibraryClass):
> +                                    if BuildData in set(Libs):
> +                                        ReferenceModules = BuildData.ReferenceModules
> +                                        for ReferenceModule in ReferenceModules:
> +                                            if ReferenceModule.MetaFile in Platform.Modules:
> +                                                RefPlatformModule = Platform.Modules[str(ReferenceModule.MetaFile)]
> +                                                if key in RefPlatformModule.Pcds:
> +                                                    PcdInReferenceModule = RefPlatformModule.Pcds[key]
> +                                                    if PcdInReferenceModule.Type:
> +                                                        BuildData.Pcds[key].Type = PcdInReferenceModule.Type
> +                                                        BuildData.Pcds[key].Pending = False
> +                                                        break
> +
> +    def ProcessMixedPcd(self):
> +        for Arch in self.ArchList:
> +            SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
> +            BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
> +            SourcePcdDict_Keys = SourcePcdDict.keys()
> +            BinaryPcdDict_Keys = BinaryPcdDict.keys()
> +
> +            # generate the SourcePcdDict and BinaryPcdDict
> +
> +            for BuildData in list(self.BuildDatabase._CACHE_.values()):
> +                if BuildData.Arch != Arch:
> +                    continue
> +                if BuildData.MetaFile.Ext == '.inf':
> +                    for key in BuildData.Pcds:
> +                        if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
> +                            if BuildData.IsBinaryModule:
> +                                BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> +                            else:
> +                                SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> +
> +                        elif TAB_PCDS_PATCHABLE_IN_MODULE in BuildData.Pcds[key].Type:
> +                            if BuildData.MetaFile.Ext == '.inf':
> +                                if BuildData.IsBinaryModule:
> +                                    BinaryPcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> +                                else:
> +                                    SourcePcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> +
> +                        elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
> +                            SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> +                        elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
> +                            SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
> +
> +            #
> +            # A PCD can only use one type for all source modules
> +            #
> +            for i in SourcePcdDict_Keys:
> +                for j in SourcePcdDict_Keys:
> +                    if i != j:
> +                        Intersections = SourcePcdDict[i].intersection(SourcePcdDict[j])
> +                        if len(Intersections) > 0:
> +                            EdkLogger.error(
> +                            'build',
> +                            FORMAT_INVALID,
> +                            "Building modules from source INFs, following PCD use %s and %s access method. It must be corrected to use only one access method." % (i, j),
> +                            ExtraData='\n\t'.join(str(P[1]+'.'+P[0]) for P in Intersections)
> +                            )
> +
> +            #
> +            # intersection the BinaryPCD for Mixed PCD
> +            #
> +            for i in BinaryPcdDict_Keys:
> +                for j in BinaryPcdDict_Keys:
> +                    if i != j:
> +                        Intersections = BinaryPcdDict[i].intersection(BinaryPcdDict[j])
> +                        for item in Intersections:
> +                            NewPcd1 = (item[0] + '_' + i, item[1])
> +                            NewPcd2 = (item[0] + '_' + j, item[1])
> +                            if item not in GlobalData.MixedPcd:
> +                                GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
> +                            else:
> +                                if NewPcd1 not in GlobalData.MixedPcd[item]:
> +                                    GlobalData.MixedPcd[item].append(NewPcd1)
> +                                if NewPcd2 not in GlobalData.MixedPcd[item]:
> +                                    GlobalData.MixedPcd[item].append(NewPcd2)
> +
> +            #
> +            # intersection the SourcePCD and BinaryPCD for Mixed PCD
> +            #
> +            for i in SourcePcdDict_Keys:
> +                for j in BinaryPcdDict_Keys:
> +                    if i != j:
> +                        Intersections = SourcePcdDict[i].intersection(BinaryPcdDict[j])
> +                        for item in Intersections:
> +                            NewPcd1 = (item[0] + '_' + i, item[1])
> +                            NewPcd2 = (item[0] + '_' + j, item[1])
> +                            if item not in GlobalData.MixedPcd:
> +                                GlobalData.MixedPcd[item] = [NewPcd1, NewPcd2]
> +                            else:
> +                                if NewPcd1 not in GlobalData.MixedPcd[item]:
> +                                    GlobalData.MixedPcd[item].append(NewPcd1)
> +                                if NewPcd2 not in GlobalData.MixedPcd[item]:
> +                                    GlobalData.MixedPcd[item].append(NewPcd2)
> +
> +            BuildData = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> +            for key in BuildData.Pcds:
> +                for SinglePcd in GlobalData.MixedPcd:
> +                    if (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
> +                        for item in GlobalData.MixedPcd[SinglePcd]:
> +                            Pcd_Type = item[0].split('_')[-1]
> +                            if (Pcd_Type == BuildData.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
> +                               (Pcd_Type == TAB_PCDS_DYNAMIC and BuildData.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET):
> +                                Value = BuildData.Pcds[key]
> +                                Value.TokenCName = BuildData.Pcds[key].TokenCName + '_' + Pcd_Type
> +                                if len(key) == 2:
> +                                    newkey = (Value.TokenCName, key[1])
> +                                elif len(key) == 3:
> +                                    newkey = (Value.TokenCName, key[1], key[2])
> +                                del BuildData.Pcds[key]
> +                                BuildData.Pcds[newkey] = Value
> +                                break
> +                        break
> +
> +        if self.FdfProfile:
> +            PcdSet = self.FdfProfile.PcdDict
> +            # handle the mixed pcd in FDF file
> +            for key in PcdSet:
> +                if key in GlobalData.MixedPcd:
> +                    Value = PcdSet[key]
> +                    del PcdSet[key]
> +                    for item in GlobalData.MixedPcd[key]:
> +                        PcdSet[item] = Value
> +
> +    #Collect package set information from INF of FDF
> +    @cached_property
> +    def PkgSet(self):
> +        if not self.FdfFile:
> +            self.FdfFile = self.Platform.FlashDefinition
> +
> +        if self.FdfFile:
> +            ModuleList = self.FdfProfile.InfList
> +        else:
> +            ModuleList = []
> +        Pkgs = {}
> +        for Arch in self.ArchList:
> +            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> +            PkgSet = set()
> +            for mb in [self.BuildDatabase[m, Arch, self.BuildTarget, self.ToolChain] for m in Platform.Modules]:
> +                PkgSet.update(mb.Packages)
> +            for Inf in ModuleList:
> +                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
> +                if ModuleFile in Platform.Modules:
> +                    continue
> +                ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
> +                PkgSet.update(ModuleData.Packages)
> +            Pkgs[Arch] = list(PkgSet)
> +        return Pkgs
> +
> +    def VerifyPcdDeclearation(self,PcdSet):
> +        for Arch in self.ArchList:
> +            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
> +            Pkgs = self.PkgSet[Arch]
> +            DecPcds = set()
> +            DecPcdsKey = set()
> +            for Pkg in Pkgs:
> +                for Pcd in Pkg.Pcds:
> +                    DecPcds.add((Pcd[0], Pcd[1]))
> +                    DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2]))
> +
> +            Platform.SkuName = self.SkuId
> +            for Name, Guid,Fileds in PcdSet:
> +                if (Name, Guid) not in DecPcds:
> +                    EdkLogger.error(
> +                        'build',
> +                        PARSER_ERROR,
> +                        "PCD (%s.%s) used in FDF is not declared in DEC files." % (Guid, Name),
> +                        File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
> +                        Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
> +                    )
> +                else:
> +                    # Check whether Dynamic or DynamicEx PCD used in FDF file. If used, build break and give a error message.
> +                    if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPcdsKey \
> +                        or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE) in DecPcdsKey \
> +                        or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in DecPcdsKey:
> +                        continue
> +                    elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey:
> +                        EdkLogger.error(
> +                                'build',
> +                                PARSER_ERROR,
> +                                "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
> +                                File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
> +                                Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
> +                        )
> +    def CollectAllPcds(self):
> +
> +        for Arch in self.ArchList:
> +            Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> +            #
> +            # Explicitly collect platform's dynamic PCDs
> +            #
> +            Pa.CollectPlatformDynamicPcds()
> +            Pa.CollectFixedAtBuildPcds()
> +            self.AutoGenObjectList.append(Pa)
> +        # We need to calculate the PcdTokenNumber after all Arch Pcds are collected.
> +        for Arch in self.ArchList:
> +            #Pcd TokenNumber
> +            Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
> +            self.UpdateModuleDataPipe(Arch, {"PCD_TNUM":Pa.PcdTokenNumber})
> +
> +    def UpdateModuleDataPipe(self,arch, attr_dict):
> +        for (Target, Toolchain, Arch, MetaFile) in AutoGen.Cache():
> +            if Arch != arch:
> +                continue
> +            try:
> +                AutoGen.Cache()[(Target, Toolchain, Arch, MetaFile)].DataPipe.DataContainer = attr_dict
> +            except Exception:
> +                pass
> +    #
> +    # Generate Package level hash value
> +    #
> +    def GeneratePkgLevelHash(self):
> +        for Arch in self.ArchList:
> +            GlobalData.gPackageHash = {}
> +            if GlobalData.gUseHashCache:
> +                for Pkg in self.PkgSet[Arch]:
> +                    self._GenPkgLevelHash(Pkg)
> +
> +
> +    def CreateBuildOptionsFile(self):
> +        #
> +        # Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
> +        #
> +        content = 'gCommandLineDefines: '
> +        content += str(GlobalData.gCommandLineDefines)
> +        content += TAB_LINE_BREAK
> +        content += 'BuildOptionPcd: '
> +        content += str(GlobalData.BuildOptionPcd)
> +        content += TAB_LINE_BREAK
> +        content += 'Active Platform: '
> +        content += str(self.Platform)
> +        content += TAB_LINE_BREAK
> +        if self.FdfFile:
> +            content += 'Flash Image Definition: '
> +            content += str(self.FdfFile)
> +            content += TAB_LINE_BREAK
> +        SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
> +
> +    def CreatePcdTokenNumberFile(self):
> +        #
> +        # Create PcdToken Number file for Dynamic/DynamicEx Pcd.
> +        #
> +        PcdTokenNumber = 'PcdTokenNumber: '
> +        Pa = self.AutoGenObjectList[0]
> +        if Pa.PcdTokenNumber:
> +            if Pa.DynamicPcdList:
> +                for Pcd in Pa.DynamicPcdList:
> +                    PcdTokenNumber += TAB_LINE_BREAK
> +                    PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
> +                    PcdTokenNumber += ' : '
> +                    PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
> +        SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
> +
> +    def CreateModuleHashInfo(self):
> +        #
> +        # Get set of workspace metafiles
> +        #
> +        AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
> +
> +        #
> +        # Retrieve latest modified time of all metafiles
> +        #
> +        SrcTimeStamp = 0
> +        for f in AllWorkSpaceMetaFiles:
> +            if os.stat(f)[8] > SrcTimeStamp:
> +                SrcTimeStamp = os.stat(f)[8]
> +        self._SrcTimeStamp = SrcTimeStamp
> +
> +        if GlobalData.gUseHashCache:
> +            m = hashlib.md5()
> +            for files in AllWorkSpaceMetaFiles:
> +                if files.endswith('.dec'):
> +                    continue
> +                f = open(files, 'rb')
> +                Content = f.read()
> +                f.close()
> +                m.update(Content)
> +            SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), False)
> +            GlobalData.gPlatformHash = m.hexdigest()
> +
> +        #
> +        # Write metafile list to build directory
> +        #
> +        AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
> +        if os.path.exists (AutoGenFilePath):
> +            os.remove(AutoGenFilePath)
> +        if not os.path.exists(self.BuildDir):
> +            os.makedirs(self.BuildDir)
> +        with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
> +            for f in AllWorkSpaceMetaFiles:
> +                print(f, file=file)
> +        return True
> +
> +    def _GenPkgLevelHash(self, Pkg):
> +        if Pkg.PackageName in GlobalData.gPackageHash:
> +            return
> +
> +        PkgDir = os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageName)
> +        CreateDirectory(PkgDir)
> +        HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
> +        m = hashlib.md5()
> +        # Get .dec file's hash value
> +        f = open(Pkg.MetaFile.Path, 'rb')
> +        Content = f.read()
> +        f.close()
> +        m.update(Content)
> +        # Get include files hash value
> +        if Pkg.Includes:
> +            for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
> +                for Root, Dirs, Files in os.walk(str(inc)):
> +                    for File in sorted(Files):
> +                        File_Path = os.path.join(Root, File)
> +                        f = open(File_Path, 'rb')
> +                        Content = f.read()
> +                        f.close()
> +                        m.update(Content)
> +        SaveFileOnChange(HashFile, m.hexdigest(), False)
> +        GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
> +
> +    def _GetMetaFiles(self, Target, Toolchain):
> +        AllWorkSpaceMetaFiles = set()
> +        #
> +        # add fdf
> +        #
> +        if self.FdfFile:
> +            AllWorkSpaceMetaFiles.add (self.FdfFile.Path)
> +            for f in GlobalData.gFdfParser.GetAllIncludedFile():
> +                AllWorkSpaceMetaFiles.add (f.FileName)
> +        #
> +        # add dsc
> +        #
> +        AllWorkSpaceMetaFiles.add(self.MetaFile.Path)
> +
> +        #
> +        # add build_rule.txt & tools_def.txt
> +        #
> +        AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultBuildRuleFile))
> +        AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultToolsDefFile))
> +
> +        # add BuildOption metafile
> +        #
> +        AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'BuildOptions'))
> +
> +        # add PcdToken Number file for Dynamic/DynamicEx Pcd
> +        #
> +        AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTokenNumber'))
> +
> +        for Pa in self.AutoGenObjectList:
> +            AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile)
> +
> +        for Arch in self.ArchList:
> +            #
> +            # add dec
> +            #
> +            for Package in PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch).PackageList:
> +                AllWorkSpaceMetaFiles.add(Package.MetaFile.Path)
> +
> +            #
> +            # add included dsc
> +            #
> +            for filePath in self.BuildDatabase[self.MetaFile, Arch, Target, Toolchain]._RawData.IncludedFiles:
> +                AllWorkSpaceMetaFiles.add(filePath.Path)
> +
> +        return AllWorkSpaceMetaFiles
> +
> +    def _CheckPcdDefineAndType(self):
> +        PcdTypeSet = {TAB_PCDS_FIXED_AT_BUILD,
> +                      TAB_PCDS_PATCHABLE_IN_MODULE,
> +                      TAB_PCDS_FEATURE_FLAG,
> +                      TAB_PCDS_DYNAMIC,
> +                      TAB_PCDS_DYNAMIC_EX}
> +
> +        # This dict store PCDs which are not used by any modules with specified arches
> +        UnusedPcd = OrderedDict()
> +        for Pa in self.AutoGenObjectList:
> +            # Key of DSC's Pcds dictionary is PcdCName, TokenSpaceGuid
> +            for Pcd in Pa.Platform.Pcds:
> +                PcdType = Pa.Platform.Pcds[Pcd].Type
> +
> +                # If no PCD type, this PCD comes from FDF
> +                if not PcdType:
> +                    continue
> +
> +                # Try to remove Hii and Vpd suffix
> +                if PcdType.startswith(TAB_PCDS_DYNAMIC_EX):
> +                    PcdType = TAB_PCDS_DYNAMIC_EX
> +                elif PcdType.startswith(TAB_PCDS_DYNAMIC):
> +                    PcdType = TAB_PCDS_DYNAMIC
> +
> +                for Package in Pa.PackageList:
> +                    # Key of DEC's Pcds dictionary is PcdCName, TokenSpaceGuid, PcdType
> +                    if (Pcd[0], Pcd[1], PcdType) in Package.Pcds:
> +                        break
> +                    for Type in PcdTypeSet:
> +                        if (Pcd[0], Pcd[1], Type) in Package.Pcds:
> +                            EdkLogger.error(
> +                                'build',
> +                                FORMAT_INVALID,
> +                                "Type [%s] of PCD [%s.%s] in DSC file doesn't match the type [%s] defined in DEC file." \
> +                                % (Pa.Platform.Pcds[Pcd].Type, Pcd[1], Pcd[0], Type),
> +                                ExtraData=None
> +                            )
> +                            return
> +                else:
> +                    UnusedPcd.setdefault(Pcd, []).append(Pa.Arch)
> +
> +        for Pcd in UnusedPcd:
> +            EdkLogger.warn(
> +                'build',
> +                "The PCD was not specified by any INF module in the platform for the given architecture.\n"
> +                "\tPCD: [%s.%s]\n\tPlatform: [%s]\n\tArch: %s"
> +                % (Pcd[1], Pcd[0], os.path.basename(str(self.MetaFile)), str(UnusedPcd[Pcd])),
> +                ExtraData=None
> +            )
> +
> +    def __repr__(self):
> +        return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList))
> +
> +    ## Return the directory to store FV files
> +    @cached_property
> +    def FvDir(self):
> +        return path.join(self.BuildDir, TAB_FV_DIRECTORY)
> +
> +    ## Return the directory to store all intermediate and final files built
> +    @cached_property
> +    def BuildDir(self):
> +        return self.AutoGenObjectList[0].BuildDir
> +
> +    ## Return the build output directory platform specifies
> +    @cached_property
> +    def OutputDir(self):
> +        return self.Platform.OutputDirectory
> +
> +    ## Return platform name
> +    @cached_property
> +    def Name(self):
> +        return self.Platform.PlatformName
> +
> +    ## Return meta-file GUID
> +    @cached_property
> +    def Guid(self):
> +        return self.Platform.Guid
> +
> +    ## Return platform version
> +    @cached_property
> +    def Version(self):
> +        return self.Platform.Version
> +
> +    ## Return paths of tools
> +    @cached_property
> +    def ToolDefinition(self):
> +        return self.AutoGenObjectList[0].ToolDefinition
> +
> +    ## Return directory of platform makefile
> +    #
> +    #   @retval     string  Makefile directory
> +    #
> +    @cached_property
> +    def MakeFileDir(self):
> +        return self.BuildDir
> +
> +    ## Return build command string
> +    #
> +    #   @retval     string  Build command string
> +    #
> +    @cached_property
> +    def BuildCommand(self):
> +        # BuildCommand should be all the same. So just get one from platform AutoGen
> +        return self.AutoGenObjectList[0].BuildCommand
> +
> +    ## Check the PCDs token value conflict in each DEC file.
> +    #
> +    # Will cause build break and raise error message while two PCDs conflict.
> +    #
> +    # @return  None
> +    #
> +    def _CheckAllPcdsTokenValueConflict(self):
> +        for Pa in self.AutoGenObjectList:
> +            for Package in Pa.PackageList:
> +                PcdList = list(Package.Pcds.values())
> +                PcdList.sort(key=lambda x: int(x.TokenValue, 0))
> +                Count = 0
> +                while (Count < len(PcdList) - 1) :
> +                    Item = PcdList[Count]
> +                    ItemNext = PcdList[Count + 1]
> +                    #
> +                    # Make sure in the same token space the TokenValue should be unique
> +                    #
> +                    if (int(Item.TokenValue, 0) == int(ItemNext.TokenValue, 0)):
> +                        SameTokenValuePcdList = []
> +                        SameTokenValuePcdList.append(Item)
> +                        SameTokenValuePcdList.append(ItemNext)
> +                        RemainPcdListLength = len(PcdList) - Count - 2
> +                        for ValueSameCount in range(RemainPcdListLength):
> +                            if int(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount].TokenValue, 0) == int(Item.TokenValue, 0):
> +                                SameTokenValuePcdList.append(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount])
> +                            else:
> +                                break;
> +                        #
> +                        # Sort same token value PCD list with TokenGuid and TokenCName
> +                        #
> +                        SameTokenValuePcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
> +                        SameTokenValuePcdListCount = 0
> +                        while (SameTokenValuePcdListCount < len(SameTokenValuePcdList) - 1):
> +                            Flag = False
> +                            TemListItem = SameTokenValuePcdList[SameTokenValuePcdListCount]
> +                            TemListItemNext = SameTokenValuePcdList[SameTokenValuePcdListCount + 1]
> +
> +                            if (TemListItem.TokenSpaceGuidCName == TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName != TemListItemNext.TokenCName):
> +                                for PcdItem in GlobalData.MixedPcd:
> +                                    if (TemListItem.TokenCName, TemListItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \
> +                                    (TemListItemNext.TokenCName, TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
> +                                        Flag = True
> +                                if not Flag:
> +                                    EdkLogger.error(
> +                                        'build',
> +                                        FORMAT_INVALID,
> +                                        "The TokenValue [%s] of PCD [%s.%s] is conflict with: [%s.%s] in %s"\
> +                                        % (TemListItem.TokenValue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemNext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package),
> +                                        ExtraData=None
> +                                    )
> +                            SameTokenValuePcdListCount += 1
> +                        Count += SameTokenValuePcdListCount
> +                    Count += 1
> +
> +                PcdList = list(Package.Pcds.values())
> +                PcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
> +                Count = 0
> +                while (Count < len(PcdList) - 1) :
> +                    Item = PcdList[Count]
> +                    ItemNext = PcdList[Count + 1]
> +                    #
> +                    # Check PCDs with same TokenSpaceGuidCName.TokenCName have same token value as well.
> +                    #
> +                    if (Item.TokenSpaceGuidCName == ItemNext.TokenSpaceGuidCName) and (Item.TokenCName == ItemNext.TokenCName) and (int(Item.TokenValue, 0) != int(ItemNext.TokenValue, 0)):
> +                        EdkLogger.error(
> +                            'build',
> +                            FORMAT_INVALID,
> +                            "The TokenValue [%s] of PCD [%s.%s] in %s defined in two places should be same as well."\
> +                            % (Item.TokenValue, Item.TokenSpaceGuidCName, Item.TokenCName, Package),
> +                            ExtraData=None
> +                        )
> +                    Count += 1
> +    ## Generate fds command
> +    @property
> +    def GenFdsCommand(self):
> +        return (GenMake.TopLevelMakefile(self)._TEMPLATE_.Replace(GenMake.TopLevelMakefile(self)._TemplateDict)).strip()
> +
> +    @property
> +    def GenFdsCommandDict(self):
> +        FdsCommandDict = {}
> +        LogLevel = EdkLogger.GetLevel()
> +        if LogLevel == EdkLogger.VERBOSE:
> +            FdsCommandDict["verbose"] = True
> +        elif LogLevel <= EdkLogger.DEBUG_9:
> +            FdsCommandDict["debug"] = LogLevel - 1
> +        elif LogLevel == EdkLogger.QUIET:
> +            FdsCommandDict["quiet"] = True
> +
> +        if GlobalData.gEnableGenfdsMultiThread:
> +            FdsCommandDict["GenfdsMultiThread"] = True
> +        if GlobalData.gIgnoreSource:
> +            FdsCommandDict["IgnoreSources"] = True
> +
> +        FdsCommandDict["OptionPcd"] = []
> +        for pcd in GlobalData.BuildOptionPcd:
> +            if pcd[2]:
> +                pcdname = '.'.join(pcd[0:3])
> +            else:
> +                pcdname = '.'.join(pcd[0:2])
> +            if pcd[3].startswith('{'):
> +                FdsCommandDict["OptionPcd"].append(pcdname + '=' + 'H' + '"' + pcd[3] + '"')
> +            else:
> +                FdsCommandDict["OptionPcd"].append(pcdname + '=' + pcd[3])
> +
> +        MacroList = []
> +        # macros passed to GenFds
> +        MacroDict = {}
> +        MacroDict.update(GlobalData.gGlobalDefines)
> +        MacroDict.update(GlobalData.gCommandLineDefines)
> +        for MacroName in MacroDict:
> +            if MacroDict[MacroName] != "":
> +                MacroList.append('"%s=%s"' % (MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
> +            else:
> +                MacroList.append('"%s"' % MacroName)
> +        FdsCommandDict["macro"] = MacroList
> +
> +        FdsCommandDict["fdf_file"] = [self.FdfFile]
> +        FdsCommandDict["build_target"] = self.BuildTarget
> +        FdsCommandDict["toolchain_tag"] = self.ToolChain
> +        FdsCommandDict["active_platform"] = str(self)
> +
> +        FdsCommandDict["conf_directory"] = GlobalData.gConfDirectory
> +        FdsCommandDict["build_architecture_list"] = ','.join(self.ArchList)
> +        FdsCommandDict["platform_build_directory"] = self.BuildDir
> +
> +        FdsCommandDict["fd"] = self.FdTargetList
> +        FdsCommandDict["fv"] = self.FvTargetList
> +        FdsCommandDict["cap"] = self.CapTargetList
> +        return FdsCommandDict
> +
> +    ## Create makefile for the platform and modules in it
> +    #
> +    #   @param      CreateDepsMakeFile      Flag indicating if the makefile for
> +    #                                       modules will be created as well
> +    #
> +    def CreateMakeFile(self, CreateDepsMakeFile=False):
> +        if not CreateDepsMakeFile:
> +            return
> +        for Pa in self.AutoGenObjectList:
> +            Pa.CreateMakeFile(True)
> +
> +    ## Create autogen code for platform and modules
> +    #
> +    #  Since there's no autogen code for platform, this method will do nothing
> +    #  if CreateModuleCodeFile is set to False.
> +    #
> +    #   @param      CreateDepsCodeFile      Flag indicating if creating module's
> +    #                                       autogen code file or not
> +    #
> +    def CreateCodeFile(self, CreateDepsCodeFile=False):
> +        if not CreateDepsCodeFile:
> +            return
> +        for Pa in self.AutoGenObjectList:
> +            Pa.CreateCodeFile(True)
> +
> +    ## Create AsBuilt INF file the platform
> +    #
> +    def CreateAsBuiltInf(self):
> +        return
> +
> diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
> index 1caa184eb923..26d149c27040 100644
> --- a/BaseTools/Source/Python/Common/Misc.py
> +++ b/BaseTools/Source/Python/Common/Misc.py
> @@ -647,11 +647,10 @@ def GuidValue(CName, PackageList, Inffile = None):
>              if not Inffile.startswith(P.MetaFile.Dir):
>                  GuidKeys = [x for x in P.Guids if x not in P._PrivateGuids]
>          if CName in GuidKeys:
>              return P.Guids[CName]
>      return None
> -    return None
>
>  ## A string template class
>  #
>  # This class implements a template for string replacement. A string template
>  # looks like following
> diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
> index 02735e165ca1..d35cd792704c 100644
> --- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
> +++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
> @@ -9,11 +9,10 @@
>  # Import Modules
>  #
>  import Common.LongFilePathOs as os
>  from Common.LongFilePathSupport import OpenLongFilePath as open
>  import sys
> -import re
>
>  from optparse import OptionParser
>  from optparse import make_option
>  from Common.BuildToolError import *
>  import Common.EdkLogger as EdkLogger
> diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
> index fa41e57c4f45..383aeaaa15c3 100644
> --- a/BaseTools/Source/Python/Workspace/DscBuildData.py
> +++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
> @@ -1371,15 +1371,15 @@ class DscBuildData(PlatformBuildClassObject):
>                  if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
> -                    self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
> -                    self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
> +                    self._Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
> +                    self._Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
>                  if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
> -                    self.Pcds[Name, Guid].SkuInfoList = {TAB_DEFAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '', '', NoFiledValues[( Guid, Name)][0])}
> +                    self._Pcds[Name, Guid].SkuInfoList = {TAB_DEFAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '', '', NoFiledValues[( Guid, Name)][0])}
>          return AllPcds
>
>      def OverrideByFdfOverAll(self,AllPcds):
>
>          if GlobalData.gFdfParser is None:
> @@ -1417,12 +1417,12 @@ class DscBuildData(PlatformBuildClassObject):
>              if PcdInDec:
>                  PcdInDec.PcdValueFromFdf = Value
>                  if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
>                                       self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
> -                    self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
> -                    self.Pcds[Name, Guid].DefaultValue = Value
> +                    self._Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
> +                    self._Pcds[Name, Guid].DefaultValue = Value
>          return AllPcds
>
>      def ParsePcdNameStruct(self,NamePart1,NamePart2):
>          TokenSpaceCName = PcdCName = DimensionAttr = Field = ""
>          if "." in NamePart1:
> diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
> index da35391d3aff..e63246b03b6e 100644
> --- a/BaseTools/Source/Python/Workspace/InfBuildData.py
> +++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
> @@ -152,10 +152,17 @@ class InfBuildData(ModuleBuildClassObject):
>          self._GuidsUsedByPcd = OrderedDict()
>          self._GuidComments = None
>          self._PcdComments = None
>          self._BuildOptions = None
>          self._DependencyFileList = None
> +        self.LibInstances = []
> +        self.ReferenceModules = set()
> +        self.Guids
> +        self.Pcds
> +    def SetReferenceModule(self,Module):
> +        self.ReferenceModules.add(Module)
> +        return self
>
>      ## XXX[key] = value
>      def __setitem__(self, key, value):
>          self.__dict__[self._PROPERTY_[key]] = value
>
> @@ -703,10 +710,29 @@ class InfBuildData(ModuleBuildClassObject):
>          RetVal.update(self._GetPcd(MODEL_PCD_DYNAMIC))
>          RetVal.update(self._GetPcd(MODEL_PCD_DYNAMIC_EX))
>          return RetVal
>
>      @cached_property
> +    def ModulePcdList(self):
> +        RetVal = self.Pcds
> +        return RetVal
> +    @cached_property
> +    def LibraryPcdList(self):
> +        if bool(self.LibraryClass):
> +            return []
> +        RetVal = {}
> +        Pcds = set()
> +        for Library in self.LibInstances:
> +            PcdsInLibrary = OrderedDict()
> +            for Key in Library.Pcds:
> +                if Key in self.Pcds or Key in Pcds:
> +                    continue
> +                Pcds.add(Key)
> +                PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
> +            RetVal[Library] = PcdsInLibrary
> +        return RetVal
> +    @cached_property
>      def PcdsName(self):
>          PcdsName = set()
>          for Type in (MODEL_PCD_FIXED_AT_BUILD,MODEL_PCD_PATCHABLE_IN_MODULE,MODEL_PCD_FEATURE_FLAG,MODEL_PCD_DYNAMIC,MODEL_PCD_DYNAMIC_EX):
>              RecordList = self._RawData[Type, self._Arch, self._Platform]
>              for TokenSpaceGuid, PcdCName, _, _, _, _, _ in RecordList:
> @@ -1028,5 +1054,8 @@ class InfBuildData(ModuleBuildClassObject):
>      @property
>      def IsBinaryModule(self):
>          if (self.Binaries and not self.Sources) or GlobalData.gIgnoreSource:
>              return True
>          return False
> +def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
> +    for Key in CopyFromDict:
> +        CopyToDict[Key].extend(CopyFromDict[Key])
> diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
> index 41ae684d3ee9..76583f46e500 100644
> --- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
> +++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
> @@ -86,10 +86,12 @@ def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalP
>  #
>  def GetLiabraryInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain):
>      return GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain)
>
>  def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain, FileName = '', EdkLogger = None):
> +    if Module.LibInstances:
> +        return Module.LibInstances
>      ModuleType = Module.ModuleType
>
>      # add forced library instances (specified under LibraryClasses sections)
>      #
>      # If a module has a MODULE_TYPE of USER_DEFINED,
> @@ -244,6 +246,8 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
>      #
>      # Build the list of constructor and destructor names
>      # The DAG Topo sort produces the destructor order, so the list of constructors must generated in the reverse order
>      #
>      SortedLibraryList.reverse()
> +    Module.LibInstances = SortedLibraryList
> +    SortedLibraryList = [lib.SetReferenceModule(Module) for lib in SortedLibraryList]
>      return SortedLibraryList
> diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
> index 28a975f54e51..ab7b4506c1c1 100644
> --- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
> +++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
> @@ -60,10 +60,12 @@ class WorkspaceDatabase(object):
>          MODEL_FILE_DEC : DecBuildData,
>          MODEL_FILE_DSC : DscBuildData,
>      }
>
>      _CACHE_ = {}    # (FilePath, Arch)  :
> +    def GetCache(self):
> +        return self._CACHE_
>
>      # constructor
>      def __init__(self, WorkspaceDb):
>          self.WorkspaceDb = WorkspaceDb
>
> @@ -201,10 +203,11 @@ class WorkspaceDatabase(object):
>          Platform = self.BuildObject[PathClass(Dscfile), TAB_COMMON]
>          if Platform is None:
>              EdkLogger.error('build', PARSER_ERROR, "Failed to parser DSC file: %s" % Dscfile)
>          return Platform
>
> +BuildDB = WorkspaceDatabase()
>  ##
>  #
>  # This acts like the main() function for the script, unless it is 'import'ed into another
>  # script.
>  #
> diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
> index b4189240e127..9c12c01d2a2a 100644
> --- a/BaseTools/Source/Python/build/BuildReport.py
> +++ b/BaseTools/Source/Python/build/BuildReport.py
> @@ -32,11 +32,11 @@ from Common.BuildToolError import CODE_ERROR
>  from Common.BuildToolError import COMMAND_FAILURE
>  from Common.BuildToolError import FORMAT_INVALID
>  from Common.LongFilePathSupport import OpenLongFilePath as open
>  from Common.MultipleWorkspace import MultipleWorkspace as mws
>  import Common.GlobalData as GlobalData
> -from AutoGen.AutoGen import ModuleAutoGen
> +from AutoGen.ModuleAutoGen import ModuleAutoGen
>  from Common.Misc import PathClass
>  from Common.StringUtils import NormPath
>  from Common.DataType import *
>  import collections
>  from Common.Expression import *
> @@ -2140,11 +2140,11 @@ class PlatformReport(object):
>              if GlobalData.gFdfParser is not None:
>                  if Pa.Arch in GlobalData.gFdfParser.Profile.InfDict:
>                      INFList = GlobalData.gFdfParser.Profile.InfDict[Pa.Arch]
>                      for InfName in INFList:
>                          InfClass = PathClass(NormPath(InfName), Wa.WorkspaceDir, Pa.Arch)
> -                        Ma = ModuleAutoGen(Wa, InfClass, Pa.BuildTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile)
> +                        Ma = ModuleAutoGen(Wa, InfClass, Pa.BuildTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile,Pa.DataPile)
>                          if Ma is None:
>                              continue
>                          if Ma not in ModuleAutoGenList:
>                              ModuleAutoGenList.append(Ma)
>                          for MGen in ModuleAutoGenList:
> diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
> index 07693b97359e..3d083f4eaade 100644
> --- a/BaseTools/Source/Python/build/build.py
> +++ b/BaseTools/Source/Python/build/build.py
> @@ -10,46 +10,49 @@
>
>  ##
>  # Import Modules
>  #
>  from __future__ import print_function
> -import Common.LongFilePathOs as os
> -import re
> +from __future__ import absolute_import
> +import os.path as path
>  import sys
> +import os
> +import re
>  import glob
>  import time
>  import platform
>  import traceback
> -import encodings.ascii
>  import multiprocessing
> -
> -from struct import *
> -from threading import *
> +from threading import Thread,Event,BoundedSemaphore
>  import threading
> +from subprocess import Popen,PIPE
> +from collections import OrderedDict, defaultdict
>  from optparse import OptionParser
> -from subprocess import *
> +from AutoGen.PlatformAutoGen import PlatformAutoGen
> +from AutoGen.ModuleAutoGen import ModuleAutoGen
> +from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen
> +from AutoGen import GenMake
>  from Common import Misc as Utils
>
> -from Common.LongFilePathSupport import OpenLongFilePath as open
>  from Common.TargetTxtClassObject import TargetTxt
>  from Common.ToolDefClassObject import ToolDef
> +from Common.Misc import PathClass,SaveFileOnChange,RemoveDirectory
> +from Common.StringUtils import NormPath
> +from Common.MultipleWorkspace import MultipleWorkspace as mws
> +from Common.BuildToolError import *
>  from Common.DataType import *
> +import Common.EdkLogger as EdkLogger
>  from Common.BuildVersion import gBUILD_VERSION
> -from AutoGen.AutoGen import *
> -from Common.BuildToolError import *
> -from Workspace.WorkspaceDatabase import WorkspaceDatabase
> -from Common.MultipleWorkspace import MultipleWorkspace as mws
> +from Workspace.WorkspaceDatabase import BuildDB
>
>  from BuildReport import BuildReport
> -from GenPatchPcdTable.GenPatchPcdTable import *
> -from PatchPcdValue.PatchPcdValue import *
> +from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdInfoFromMapFile
> +from PatchPcdValue.PatchPcdValue import PatchBinaryFile
>
> -import Common.EdkLogger
>  import Common.GlobalData as GlobalData
>  from GenFds.GenFds import GenFds, GenFdsApi
>
> -from collections import OrderedDict, defaultdict
>
>  # Version and Copyright
>  VersionNumber = "0.60" + ' ' + gBUILD_VERSION
>  __version__ = "%prog Version " + VersionNumber
>  __copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation All rights reserved."
> @@ -773,11 +776,11 @@ class Build():
>              ConfDirectoryPath = mws.join(self.WorkspaceDir, 'Conf')
>              GlobalData.gConfDirectory = ConfDirectoryPath
>          GlobalData.gDatabasePath = os.path.normpath(os.path.join(ConfDirectoryPath, GlobalData.gDatabasePath))
>          if not os.path.exists(os.path.join(GlobalData.gConfDirectory, '.cache')):
>              os.makedirs(os.path.join(GlobalData.gConfDirectory, '.cache'))
> -        self.Db = WorkspaceDatabase()
> +        self.Db = BuildDB
>          self.BuildDatabase = self.Db.BuildObject
>          self.Platform = None
>          self.ToolChainFamily = None
>          self.LoadFixAddress = 0
>          self.UniFlag = BuildOptions.Flag
> @@ -1698,17 +1701,21 @@ class Build():
>                  CmdListDict = {}
>                  if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
>                      CmdListDict = self._GenFfsCmd(Wa.ArchList)
>
>                  for Arch in Wa.ArchList:
> +                    PcdMaList = []
>                      GlobalData.gGlobalDefines['ARCH'] = Arch
>                      Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
>                      for Module in Pa.Platform.Modules:
>                          # Get ModuleAutoGen object to generate C code file and makefile
> -                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
> +                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>                          if Ma is None:
>                              continue
> +                        if Ma.PcdIsDriver:
> +                            Ma.PlatformInfo = Pa
> +                            PcdMaList.append(Ma)
>                          self.BuildModules.append(Ma)
>                      self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict)
>
>              # Create MAP file when Load Fix Address is enabled.
>              if self.Target in ["", "all", "fds"]:
> @@ -1800,11 +1807,11 @@ class Build():
>                  AutoGenStart = time.time()
>                  GlobalData.gGlobalDefines['ARCH'] = Arch
>                  Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
>                  for Module in Pa.Platform.Modules:
>                      if self.ModuleFile.Dir == Module.Dir and self.ModuleFile.Name == Module.Name:
> -                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
> +                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>                          if Ma is None:
>                              continue
>                          MaList.append(Ma)
>                          if Ma.CanSkipbyHash():
>                              self.HashSkipModules.append(Ma)
> @@ -1980,10 +1987,11 @@ class Build():
>                  # multi-thread exit flag
>                  ExitFlag = threading.Event()
>                  ExitFlag.clear()
>                  self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
>                  for Arch in Wa.ArchList:
> +                    PcdMaList = []
>                      AutoGenStart = time.time()
>                      GlobalData.gGlobalDefines['ARCH'] = Arch
>                      Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
>                      if Pa is None:
>                          continue
> @@ -1997,14 +2005,17 @@ class Build():
>                          if Inf in Pa.Platform.Modules:
>                              continue
>                          ModuleList.append(Inf)
>                      for Module in ModuleList:
>                          # Get ModuleAutoGen object to generate C code file and makefile
> -                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile)
> +                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>
>                          if Ma is None:
>                              continue
> +                        if Ma.PcdIsDriver:
> +                            Ma.PlatformInfo = Pa
> +                            PcdMaList.append(Ma)
>                          if Ma.CanSkipbyHash():
>                              self.HashSkipModules.append(Ma)
>                              if GlobalData.gBinCacheSource:
>                                  EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>                                  continue
> --
> 2.20.1.windows.1
>
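For anyone reading along: the rule that `_CheckAllPcdsTokenValueConflict` in the quoted patch enforces can be sketched in isolation. This is a simplified illustration only, not BaseTools code; `find_token_conflicts` and its tuple format are made up for the example. Within one token space GUID, (a) two different PCD names must not share a token value, and (b) one PCD name declared twice must not carry two different values.

```python
def find_token_conflicts(pcds):
    """pcds: list of (TokenSpaceGuidCName, TokenCName, TokenValue) tuples.

    Illustrative helper, not part of BaseTools. Returns a list of
    conflict records mirroring the two error cases in the patch's
    _CheckAllPcdsTokenValueConflict.
    """
    conflicts = []
    value_owner = {}   # (guid, token_value) -> first PCD name seen
    name_value = {}    # (guid, pcd_name)    -> first token value seen
    for guid, name, value in pcds:
        v = int(value, 0)  # TokenValue may be hex ("0x01") or decimal
        # (a) same token space, same value, different name -> conflict
        owner = value_owner.setdefault((guid, v), name)
        if owner != name:
            conflicts.append(("same value", guid, owner, name))
        # (b) same PCD declared with two different token values -> conflict
        first = name_value.setdefault((guid, name), v)
        if first != v:
            conflicts.append(("two values", guid, name))
    return conflicts
```

The real implementation works over sorted `Package.Pcds` lists and also excuses entries recorded in `GlobalData.MixedPcd`; the sketch above only shows the core uniqueness check.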