From: "Carsey, Jaben" <jaben.carsey@intel.com>
To: "Zhu, Yonghong" <yonghong.zhu@intel.com>,
"edk2-devel@lists.01.org" <edk2-devel@lists.01.org>
Cc: "Gao, Liming" <liming.gao@intel.com>
Subject: Re: [PATCH v1 2/2] BaseTools: Remove unneeded files
Date: Tue, 10 Apr 2018 14:26:26 +0000 [thread overview]
Message-ID: <CB6E33457884FA40993F35157061515CA3CBDCB3@FMSMSX103.amr.corp.intel.com> (raw)
In-Reply-To: <B9726D6DCCFB8B4CA276A9169B02216D51FDCA14@SHSMSX103.ccr.corp.intel.com>
Thanks.
> -----Original Message-----
> From: Zhu, Yonghong
> Sent: Monday, April 09, 2018 11:39 PM
> To: Carsey, Jaben <jaben.carsey@intel.com>; edk2-devel@lists.01.org
> Cc: Gao, Liming <liming.gao@intel.com>; Zhu, Yonghong
> <yonghong.zhu@intel.com>
> Subject: RE: [PATCH v1 2/2] BaseTools: Remove unneeded files
> Importance: High
>
> Hi Jaben,
>
> I helped update the Makefile and pushed this patch, because I had pushed
> the patch that removes the Dictionary.py file while "from Dictionary
> import *" still existed in InfClassObject.py, which caused a build failure.
> Since these 4 files are not used, removing them is fine.
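[Editorial note: the failure mode described above can be reproduced in miniature. Once a module's source file is deleted, any surviving `from <module> import *` in another file fails at import time. A minimal sketch, using illustrative module names rather than the actual BaseTools layout:]

```python
import importlib


def check_stale_import(module_name):
    """Return True if module_name can still be imported, False if it is gone."""
    try:
        importlib.import_module(module_name)
        return True
    except ImportError:
        # Covers ModuleNotFoundError too, which is what a deleted
        # module like Dictionary.py would raise at build time.
        return False


# A module that was deleted (like Dictionary.py) can no longer be imported,
# so any file that still does "from Dictionary import *" breaks the build.
print(check_stale_import("json"))        # prints True: this module exists
print(check_stale_import("Dictionary"))  # prints False: deleted/missing module
```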
>
> Best Regards,
> Zhu Yonghong
>
>
> -----Original Message-----
> From: Zhu, Yonghong
> Sent: Tuesday, April 10, 2018 8:35 AM
> To: Carsey, Jaben <jaben.carsey@intel.com>; edk2-devel@lists.01.org
> Cc: Gao, Liming <liming.gao@intel.com>; Zhu, Yonghong
> <yonghong.zhu@intel.com>
> Subject: RE: [PATCH v1 2/2] BaseTools: Remove unneeded files
>
> So please send out a V2 with issue 1) fixed. I will review and apply Feng's
> patch for 2).
>
> Best Regards,
> Zhu Yonghong
>
>
> -----Original Message-----
> From: Carsey, Jaben
> Sent: Tuesday, April 10, 2018 5:12 AM
> To: Zhu, Yonghong <yonghong.zhu@intel.com>; edk2-devel@lists.01.org
> Cc: Gao, Liming <liming.gao@intel.com>
> Subject: RE: [PATCH v1 2/2] BaseTools: Remove unneeded files
>
> 1) Yes.
> 2) I didn't do this since EdkIIWorkspaceBuild is already pending removal in a
> separate patch from Feng.
>
> Shall I send out a v2 with the workspace issue fixed, or with both? I didn't
> want to conflict with Feng's patch.
>
> -Jaben
>
> > -----Original Message-----
> > From: Zhu, Yonghong
> > Sent: Sunday, April 08, 2018 1:17 AM
> > To: Carsey, Jaben <jaben.carsey@intel.com>; edk2-devel@lists.01.org
> > Cc: Gao, Liming <liming.gao@intel.com>; Zhu, Yonghong
> > <yonghong.zhu@intel.com>
> > Subject: RE: [PATCH v1 2/2] BaseTools: Remove unneeded files
> > Importance: High
> >
> > Hi Jaben,
> >
> > 1. The BaseTools/Source/Python/Makefile also needs those .py files
> > removed; otherwise the frozen-binary build fails.
> > 2. EdkIIWorkspaceBuild.py uses some functions from those .py files, but
> > in fact EdkIIWorkspaceBuild.py can also be removed, so how about removing
> > that file in your V2 patch as well?
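[Editorial note: the safe order of operations implied above is to confirm nothing still imports a module before deleting it. A minimal sketch of such a pre-removal scan; the function name and paths are illustrative, not part of BaseTools:]

```python
import os
import re


def find_imports(root, module_name):
    """Walk a source tree and report files that still import module_name."""
    pattern = re.compile(
        r'^\s*(from\s+%s\s+import|import\s+%s\b)' % (module_name, module_name))
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith('.py'):
                continue
            path = os.path.join(dirpath, name)
            with open(path) as f:
                for lineno, line in enumerate(f, 1):
                    if pattern.match(line):
                        # Record file, line number, and the offending line.
                        hits.append((path, lineno, line.strip()))
    return hits


# A scan like find_imports('BaseTools/Source/Python', 'Dictionary') would
# have flagged the stale "from Dictionary import *" in InfClassObject.py
# before Dictionary.py was removed.
```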
> >
> > Best Regards,
> > Zhu Yonghong
> >
> >
> > -----Original Message-----
> > From: Carsey, Jaben
> > Sent: Wednesday, April 04, 2018 11:02 PM
> > To: edk2-devel@lists.01.org
> > Cc: Gao, Liming <liming.gao@intel.com>; Zhu, Yonghong
> > <yonghong.zhu@intel.com>
> > Subject: [PATCH v1 2/2] BaseTools: Remove unneeded files
> >
> > These files are not used by any tool:
> > BaseTools/Source/Python/Common/DecClassObject.py
> > BaseTools/Source/Python/Common/DscClassObject.py
> > BaseTools/Source/Python/Common/FdfClassObject.py
> > BaseTools/Source/Python/Common/InfClassObject.py
> >
> > Cc: Liming Gao <liming.gao@intel.com>
> > Cc: Yonghong Zhu <yonghong.zhu@intel.com>
> > Contributed-under: TianoCore Contribution Agreement 1.1
> > Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
> > ---
> > BaseTools/Source/Python/Common/DecClassObject.py | 553 --------
> >  BaseTools/Source/Python/Common/DscClassObject.py | 1434 --------------------
> > BaseTools/Source/Python/Common/FdfClassObject.py | 106 --
> > BaseTools/Source/Python/Common/InfClassObject.py | 1105 ---------------
> > 4 files changed, 3198 deletions(-)
> >
> > diff --git a/BaseTools/Source/Python/Common/DecClassObject.py
> > b/BaseTools/Source/Python/Common/DecClassObject.py
> > deleted file mode 100644
> > index ed998d3b677d..000000000000
> > --- a/BaseTools/Source/Python/Common/DecClassObject.py
> > +++ /dev/null
> > @@ -1,553 +0,0 @@
> > -## @file
> > -# This file is used to define each component of DEC file
> > -#
> > -# Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
> > -# This program and the accompanying materials
> > -# are licensed and made available under the terms and conditions of the
> > BSD License
> > -# which accompanies this distribution. The full text of the license may be
> > found at
> > -# http://opensource.org/licenses/bsd-license.php
> > -#
> > -# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS"
> > BASIS,
> > -# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER
> > EXPRESS OR IMPLIED.
> > -#
> > -
> > -##
> > -# Import Modules
> > -#
> > -import Common.LongFilePathOs as os
> > -from String import *
> > -from DataType import *
> > -from Identification import *
> > -from Dictionary import *
> > -from CommonDataClass.PackageClass import *
> > -from CommonDataClass.CommonClass import PcdClass
> > -from BuildToolError import *
> > -from Table.TableDec import TableDec
> > -import Database
> > -from Parsing import *
> > -import GlobalData
> > -from Common.LongFilePathSupport import OpenLongFilePath as open
> > -
> > -#
> > -# Global variable
> > -#
> > -Section = {TAB_UNKNOWN.upper() : MODEL_UNKNOWN,
> > - TAB_DEC_DEFINES.upper() : MODEL_META_DATA_HEADER,
> > - TAB_INCLUDES.upper() : MODEL_EFI_INCLUDE,
> > - TAB_LIBRARY_CLASSES.upper() : MODEL_EFI_LIBRARY_CLASS,
> > - TAB_COMPONENTS.upper() : MODEL_META_DATA_COMPONENT,
> > - TAB_GUIDS.upper() : MODEL_EFI_GUID,
> > - TAB_PROTOCOLS.upper() : MODEL_EFI_PROTOCOL,
> > - TAB_PPIS.upper() : MODEL_EFI_PPI,
> > - TAB_PCDS_FIXED_AT_BUILD_NULL.upper() :
> > MODEL_PCD_FIXED_AT_BUILD,
> > - TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper() :
> > MODEL_PCD_PATCHABLE_IN_MODULE,
> > - TAB_PCDS_FEATURE_FLAG_NULL.upper() :
> > MODEL_PCD_FEATURE_FLAG,
> > - TAB_PCDS_DYNAMIC_EX_NULL.upper() :
> MODEL_PCD_DYNAMIC_EX,
> > - TAB_PCDS_DYNAMIC_NULL.upper() : MODEL_PCD_DYNAMIC,
> > - TAB_USER_EXTENSIONS.upper() :
> > MODEL_META_DATA_USER_EXTENSION
> > - }
> > -
> > -
> > -## Dec
> > -#
> > -# This class defined the structure used in Dec object
> > -#
> > -# @param Filename: Input value for Filename of Dec file, default is
> > None
> > -# @param IsMergeAllArches: Input value for IsMergeAllArches
> > -# True is to merge all arches
> > -# Fales is not to merge all arches
> > -# default is False
> > -# @param IsToPackage: Input value for IsToPackage
> > -# True is to transfer to PackageObject automatically
> > -# False is not to transfer to PackageObject automatically
> > -# default is False
> > -# @param WorkspaceDir: Input value for current workspace directory,
> > default is None
> > -#
> > -# @var Identification: To store value for Identification, it is a structure as
> > Identification
> > -# @var Defines: To store value for Defines, it is a structure as
> > DecDefines
> > -# @var UserExtensions: To store value for UserExtensions
> > -# @var Package: To store value for Package, it is a structure as
> > PackageClass
> > -# @var WorkspaceDir: To store value for WorkspaceDir
> > -# @var Contents: To store value for Contents, it is a structure as
> > DecContents
> > -# @var KeyList: To store value for KeyList, a list for all Keys used in
> Dec
> > -#
> > -class Dec(object):
> > - def __init__(self, Filename=None, IsToDatabase=False,
> > IsToPackage=False, WorkspaceDir=None, Database=None,
> > SupArchList=DataType.ARCH_LIST):
> > - self.Identification = Identification()
> > - self.Package = PackageClass()
> > - self.UserExtensions = ''
> > - self.WorkspaceDir = WorkspaceDir
> > - self.SupArchList = SupArchList
> > - self.IsToDatabase = IsToDatabase
> > -
> > - self.Cur = Database.Cur
> > - self.TblFile = Database.TblFile
> > - self.TblDec = Database.TblDec
> > - self.FileID = -1
> > -
> > - self.KeyList = [
> > - TAB_INCLUDES, TAB_GUIDS, TAB_PROTOCOLS, TAB_PPIS,
> > TAB_LIBRARY_CLASSES, \
> > - TAB_PCDS_FIXED_AT_BUILD_NULL,
> > TAB_PCDS_PATCHABLE_IN_MODULE_NULL,
> > TAB_PCDS_FEATURE_FLAG_NULL, \
> > - TAB_PCDS_DYNAMIC_NULL, TAB_PCDS_DYNAMIC_EX_NULL,
> > TAB_DEC_DEFINES
> > - ]
> > - #
> > - # Upper all KEYs to ignore case sensitive when parsing
> > - #
> > - self.KeyList = map(lambda c: c.upper(), self.KeyList)
> > -
> > - #
> > - # Init RecordSet
> > - #
> > - self.RecordSet = {}
> > - for Key in self.KeyList:
> > - self.RecordSet[Section[Key]] = []
> > -
> > - #
> > - # Load Dec file if filename is not None
> > - #
> > - if Filename is not None:
> > - self.LoadDecFile(Filename)
> > -
> > - #
> > - # Transfer to Package Object if IsToPackage is True
> > - #
> > - if IsToPackage:
> > - self.DecToPackage()
> > -
> > - ## Load Dec file
> > - #
> > - # Load the file if it exists
> > - #
> > - # @param Filename: Input value for filename of Dec file
> > - #
> > - def LoadDecFile(self, Filename):
> > - #
> > - # Insert a record for file
> > - #
> > - Filename = NormPath(Filename)
> > - self.Identification.FileFullPath = Filename
> > - (self.Identification.FileRelativePath, self.Identification.FileName) =
> > os.path.split(Filename)
> > - self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_DEC)
> > -
> > - #
> > - # Init DecTable
> > - #
> > - #self.TblDec.Table = "Dec%s" % self.FileID
> > - #self.TblDec.Create()
> > -
> > - #
> > - # Init common datas
> > - #
> > - IfDefList, SectionItemList, CurrentSection, ArchList, ThirdList,
> > IncludeFiles = \
> > - [], [], TAB_UNKNOWN, [], [], []
> > - LineNo = 0
> > -
> > - #
> > - # Parse file content
> > - #
> > - IsFindBlockComment = False
> > - ReservedLine = ''
> > - for Line in open(Filename, 'r'):
> > - LineNo = LineNo + 1
> > - #
> > - # Remove comment block
> > - #
> > - if Line.find(TAB_COMMENT_EDK_START) > -1:
> > - ReservedLine = GetSplitList(Line, TAB_COMMENT_EDK_START,
> 1)[0]
> > - IsFindBlockComment = True
> > - if Line.find(TAB_COMMENT_EDK_END) > -1:
> > - Line = ReservedLine + GetSplitList(Line,
> TAB_COMMENT_EDK_END,
> > 1)[1]
> > - ReservedLine = ''
> > - IsFindBlockComment = False
> > - if IsFindBlockComment:
> > - continue
> > -
> > - #
> > - # Remove comments at tail and remove spaces again
> > - #
> > - Line = CleanString(Line)
> > - if Line == '':
> > - continue
> > -
> > - #
> > - # Find a new section tab
> > - # First insert previous section items
> > - # And then parse the content of the new section
> > - #
> > - if Line.startswith(TAB_SECTION_START) and
> > Line.endswith(TAB_SECTION_END):
> > - #
> > - # Insert items data of previous section
> > - #
> > - Model = Section[CurrentSection.upper()]
> > - InsertSectionItemsIntoDatabase(self.TblDec, self.FileID, Filename,
> > Model, CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList,
> > self.RecordSet)
> > -
> > - #
> > - # Parse the new section
> > - #
> > - SectionItemList = []
> > - ArchList = []
> > - ThirdList = []
> > -
> > - CurrentSection = ''
> > - LineList =
> GetSplitValueList(Line[len(TAB_SECTION_START):len(Line)
> > - len(TAB_SECTION_END)], TAB_COMMA_SPLIT)
> > - for Item in LineList:
> > - ItemList = GetSplitValueList(Item, TAB_SPLIT)
> > - if CurrentSection == '':
> > - CurrentSection = ItemList[0]
> > - else:
> > - if CurrentSection != ItemList[0]:
> > - EdkLogger.error("Parser", PARSER_ERROR, "Different
> section
> > names '%s' and '%s' are found in one section definition, this is not allowed."
> > % (CurrentSection, ItemList[0]), File=Filename, Line=LineNo,
> > RaiseError=EdkLogger.IsRaiseError)
> > - if CurrentSection.upper() not in self.KeyList:
> > - RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
> > - ItemList.append('')
> > - ItemList.append('')
> > - if len(ItemList) > 5:
> > - RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
> > - else:
> > - if ItemList[1] != '' and ItemList[1].upper() not in
> > ARCH_LIST_FULL:
> > - EdkLogger.error("Parser", PARSER_ERROR, "Invalid Arch
> > definition '%s' found" % ItemList[1], File=Filename, Line=LineNo,
> > RaiseError=EdkLogger.IsRaiseError)
> > - ArchList.append(ItemList[1].upper())
> > - ThirdList.append(ItemList[2])
> > -
> > - continue
> > -
> > - #
> > - # Not in any defined section
> > - #
> > - if CurrentSection == TAB_UNKNOWN:
> > - ErrorMsg = "%s is not in any defined section" % Line
> > - EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg,
> File=Filename,
> > Line=LineNo, RaiseError=EdkLogger.IsRaiseError)
> > -
> > - #
> > - # Add a section item
> > - #
> > - SectionItemList.append([Line, LineNo])
> > - # End of parse
> > - #End of For
> > -
> > - #
> > - # Insert items data of last section
> > - #
> > - Model = Section[CurrentSection.upper()]
> > - InsertSectionItemsIntoDatabase(self.TblDec, self.FileID, Filename,
> > Model, CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList,
> > self.RecordSet)
> > -
> > - #
> > - # Replace all DEFINE macros with its actual values
> > - #
> > - ParseDefineMacro2(self.TblDec, self.RecordSet,
> > GlobalData.gGlobalDefines)
> > -
> > - ## Transfer to Package Object
> > - #
> > - # Transfer all contents of a Dec file to a standard Package Object
> > - #
> > - def DecToPackage(self):
> > - #
> > - # Init global information for the file
> > - #
> > - ContainerFile = self.Identification.FileFullPath
> > -
> > - #
> > - # Generate Package Header
> > - #
> > - self.GenPackageHeader(ContainerFile)
> > -
> > - #
> > - # Generate Includes
> > - #
> > - self.GenIncludes(ContainerFile)
> > -
> > - #
> > - # Generate Guids
> > - #
> > - self.GenGuidProtocolPpis(DataType.TAB_GUIDS, ContainerFile)
> > -
> > - #
> > - # Generate Protocols
> > - #
> > - self.GenGuidProtocolPpis(DataType.TAB_PROTOCOLS, ContainerFile)
> > -
> > - #
> > - # Generate Ppis
> > - #
> > - self.GenGuidProtocolPpis(DataType.TAB_PPIS, ContainerFile)
> > -
> > - #
> > - # Generate LibraryClasses
> > - #
> > - self.GenLibraryClasses(ContainerFile)
> > -
> > - #
> > - # Generate Pcds
> > - #
> > - self.GenPcds(ContainerFile)
> > -
> > - ## Get Package Header
> > - #
> > - # Gen Package Header of Dec as <Key> = <Value>
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenPackageHeader(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate PackageHeader ...")
> > - #
> > - # Update all defines item in database
> > - #
> > - RecordSet = self.RecordSet[MODEL_META_DATA_HEADER]
> > - for Record in RecordSet:
> > - ValueList = GetSplitValueList(Record[0], TAB_EQUAL_SPLIT)
> > - if len(ValueList) != 2:
> > - RaiseParserError(Record[0], 'Defines', ContainerFile, '<Key> =
> > <Value>', Record[2])
> > - ID, Value1, Value2, Arch, LineNo = Record[3], ValueList[0],
> > ValueList[1], Record[1], Record[2]
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblDec.Table,
> > ConvertToSqlString2(Value1), ConvertToSqlString2(Value2), ID)
> > - self.TblDec.Exec(SqlCommand)
> > -
> > - #
> > - # Get detailed information
> > - #
> > - for Arch in self.SupArchList:
> > - PackageHeader = PackageHeaderClass()
> > -
> > - PackageHeader.Name = QueryDefinesItem(self.TblDec,
> > TAB_DEC_DEFINES_PACKAGE_NAME, Arch, self.FileID)[0]
> > - PackageHeader.Guid = QueryDefinesItem(self.TblDec,
> > TAB_DEC_DEFINES_PACKAGE_GUID, Arch, self.FileID)[0]
> > - PackageHeader.Version = QueryDefinesItem(self.TblDec,
> > TAB_DEC_DEFINES_PACKAGE_VERSION, Arch, self.FileID)[0]
> > - PackageHeader.FileName = self.Identification.FileName
> > - PackageHeader.FullPath = self.Identification.FileFullPath
> > - PackageHeader.DecSpecification = QueryDefinesItem(self.TblDec,
> > TAB_DEC_DEFINES_DEC_SPECIFICATION, Arch, self.FileID)[0]
> > -
> > - self.Package.Header[Arch] = PackageHeader
> > -
> > - ## GenIncludes
> > - #
> > - # Gen Includes of Dec
> > - #
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenIncludes(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_INCLUDES)
> > - Includes = {}
> > - #
> > - # Get all Includes
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_INCLUDE]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - MergeArches(Includes, Record[0], Arch)
> > -
> > - for Key in Includes.keys():
> > - Include = IncludeClass()
> > - Include.FilePath = NormPath(Key)
> > - Include.SupArchList = Includes[Key]
> > - self.Package.Includes.append(Include)
> > -
> > - ## GenPpis
> > - #
> > - # Gen Ppis of Dec
> > - # <CName>=<GuidValue>
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenGuidProtocolPpis(self, Type, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > - Lists = {}
> > - #
> > - # Get all Items
> > - #
> > - RecordSet = self.RecordSet[Section[Type.upper()]]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (Name, Value) = GetGuidsProtocolsPpisOfDec(Record[0], Type,
> > ContainerFile, Record[2])
> > - MergeArches(Lists, (Name, Value), Arch)
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblDec.Table,
> > ConvertToSqlString2(Name), ConvertToSqlString2(Value), Record[3])
> > - self.TblDec.Exec(SqlCommand)
> > -
> > - ListMember = None
> > - if Type == TAB_GUIDS:
> > - ListMember = self.Package.GuidDeclarations
> > - elif Type == TAB_PROTOCOLS:
> > - ListMember = self.Package.ProtocolDeclarations
> > - elif Type == TAB_PPIS:
> > - ListMember = self.Package.PpiDeclarations
> > -
> > - for Key in Lists.keys():
> > - ListClass = GuidProtocolPpiCommonClass()
> > - ListClass.CName = Key[0]
> > - ListClass.Guid = Key[1]
> > - ListClass.SupArchList = Lists[Key]
> > - ListMember.append(ListClass)
> > -
> > -
> > - ## GenLibraryClasses
> > - #
> > - # Gen LibraryClasses of Dec
> > - # <CName>=<GuidValue>
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenLibraryClasses(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_LIBRARY_CLASSES)
> > - LibraryClasses = {}
> > - #
> > - # Get all Guids
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_LIBRARY_CLASS]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - List = GetSplitValueList(Record[0], DataType.TAB_VALUE_SPLIT)
> > - if len(List) != 2:
> > - RaiseParserError(Record[0], 'LibraryClasses', ContainerFile,
> > '<LibraryClassName>|<LibraryClassInstanceFilename>', Record[2])
> > - else:
> > - CheckFileExist(self.Identification.FileRelativePath, List[1],
> > ContainerFile, 'LibraryClasses', Record[0])
> > - MergeArches(LibraryClasses, (List[0], List[1]), Arch)
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s',
> > Value3 = '%s'
> > - where ID = %s""" % (self.TblDec.Table,
> > ConvertToSqlString2(List[0]), ConvertToSqlString2(List[1]),
> > SUP_MODULE_LIST_STRING, Record[3])
> > - self.TblDec.Exec(SqlCommand)
> > -
> > -
> > - for Key in LibraryClasses.keys():
> > - LibraryClass = LibraryClassClass()
> > - LibraryClass.LibraryClass = Key[0]
> > - LibraryClass.RecommendedInstance = NormPath(Key[1])
> > - LibraryClass.SupModuleList = SUP_MODULE_LIST
> > - LibraryClass.SupArchList = LibraryClasses[Key]
> > - self.Package.LibraryClassDeclarations.append(LibraryClass)
> > -
> > - ## GenPcds
> > - #
> > - # Gen Pcds of Dec
> > - #
> <TokenSpcCName>.<TokenCName>|<Value>|<DatumType>|<Token>
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenPcds(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_PCDS)
> > - Pcds = {}
> > - PcdToken = {}
> > - #
> > - # Get all Guids
> > - #
> > - RecordSet1 = self.RecordSet[MODEL_PCD_FIXED_AT_BUILD]
> > - RecordSet2 = self.RecordSet[MODEL_PCD_PATCHABLE_IN_MODULE]
> > - RecordSet3 = self.RecordSet[MODEL_PCD_FEATURE_FLAG]
> > - RecordSet4 = self.RecordSet[MODEL_PCD_DYNAMIC_EX]
> > - RecordSet5 = self.RecordSet[MODEL_PCD_DYNAMIC]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet1:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, DatumType, Token,
> > Type) = GetPcdOfDec(Record[0], TAB_PCDS_FIXED_AT_BUILD,
> > ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value,
> > DatumType, Token, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet2:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, DatumType, Token,
> > Type) = GetPcdOfDec(Record[0], TAB_PCDS_PATCHABLE_IN_MODULE,
> > ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value,
> > DatumType, Token, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet3:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, DatumType, Token,
> > Type) = GetPcdOfDec(Record[0], TAB_PCDS_FEATURE_FLAG,
> ContainerFile,
> > Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value,
> > DatumType, Token, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet4:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, DatumType, Token,
> > Type) = GetPcdOfDec(Record[0], TAB_PCDS_DYNAMIC_EX, ContainerFile,
> > Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value,
> > DatumType, Token, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet5:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, DatumType, Token,
> > Type) = GetPcdOfDec(Record[0], TAB_PCDS_DYNAMIC, ContainerFile,
> > Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value,
> > DatumType, Token, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - #
> > - # Update to database
> > - #
> > - if self.IsToDatabase:
> > - for Key in PcdToken.keys():
> > - SqlCommand = """update %s set Value2 = '%s' where ID = %s""" %
> > (self.TblDec.Table, ".".join((PcdToken[Key][0], PcdToken[Key][1])), Key)
> > - self.TblDec.Exec(SqlCommand)
> > -
> > - for Key in Pcds.keys():
> > - Pcd = PcdClass()
> > - Pcd.CName = Key[1]
> > - Pcd.Token = Key[4]
> > - Pcd.TokenSpaceGuidCName = Key[0]
> > - Pcd.DatumType = Key[3]
> > - Pcd.DefaultValue = Key[2]
> > - Pcd.ItemType = Key[5]
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Package.PcdDeclarations.append(Pcd)
> > -
> > - ## Show detailed information of Package
> > - #
> > - # Print all members and their values of Package class
> > - #
> > - def ShowPackage(self):
> > - M = self.Package
> > - for Arch in M.Header.keys():
> > - print '\nArch =', Arch
> > - print 'Filename =', M.Header[Arch].FileName
> > - print 'FullPath =', M.Header[Arch].FullPath
> > - print 'BaseName =', M.Header[Arch].Name
> > - print 'Guid =', M.Header[Arch].Guid
> > - print 'Version =', M.Header[Arch].Version
> > - print 'DecSpecification =', M.Header[Arch].DecSpecification
> > - print '\nIncludes =', M.Includes
> > - for Item in M.Includes:
> > - print Item.FilePath, Item.SupArchList
> > - print '\nGuids =', M.GuidDeclarations
> > - for Item in M.GuidDeclarations:
> > - print Item.CName, Item.Guid, Item.SupArchList
> > - print '\nProtocols =', M.ProtocolDeclarations
> > - for Item in M.ProtocolDeclarations:
> > - print Item.CName, Item.Guid, Item.SupArchList
> > - print '\nPpis =', M.PpiDeclarations
> > - for Item in M.PpiDeclarations:
> > - print Item.CName, Item.Guid, Item.SupArchList
> > - print '\nLibraryClasses =', M.LibraryClassDeclarations
> > - for Item in M.LibraryClassDeclarations:
> > - print Item.LibraryClass, Item.RecommendedInstance,
> > Item.SupModuleList, Item.SupArchList
> > - print '\nPcds =', M.PcdDeclarations
> > - for Item in M.PcdDeclarations:
> > - print 'CName=', Item.CName, 'TokenSpaceGuidCName=',
> > Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue,
> > 'ItemType=', Item.ItemType, 'Token=', Item.Token, 'DatumType=',
> > Item.DatumType, Item.SupArchList
> > -
> > -##
> > -#
> > -# This acts like the main() function for the script, unless it is 'import'ed into
> > another
> > -# script.
> > -#
> > -if __name__ == '__main__':
> > - EdkLogger.Initialize()
> > - EdkLogger.SetLevel(EdkLogger.DEBUG_0)
> > -
> > - W = os.getenv('WORKSPACE')
> > - F = os.path.join(W, 'Nt32Pkg/Nt32Pkg.dec')
> > -
> > - Db = Database.Database('Dec.db')
> > - Db.InitDatabase()
> > -
> > - P = Dec(os.path.normpath(F), True, True, W, Db)
> > - P.ShowPackage()
> > -
> > - Db.Close()
> > diff --git a/BaseTools/Source/Python/Common/DscClassObject.py
> > b/BaseTools/Source/Python/Common/DscClassObject.py
> > deleted file mode 100644
> > index da3101ae0fe9..000000000000
> > --- a/BaseTools/Source/Python/Common/DscClassObject.py
> > +++ /dev/null
> > @@ -1,1434 +0,0 @@
> > -## @file
> > -# This file is used to define each component of DSC file
> > -#
> > -# Copyright (c) 2007 - 2016, Intel Corporation. All rights reserved.<BR>
> > -# This program and the accompanying materials
> > -# are licensed and made available under the terms and conditions of the
> > BSD License
> > -# which accompanies this distribution. The full text of the license may be
> > found at
> > -# http://opensource.org/licenses/bsd-license.php
> > -#
> > -# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS"
> > BASIS,
> > -# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER
> > EXPRESS OR IMPLIED.
> > -#
> > -
> > -##
> > -# Import Modules
> > -#
> > -import Common.LongFilePathOs as os
> > -import EdkLogger as EdkLogger
> > -import Database
> > -from String import *
> > -from Parsing import *
> > -from DataType import *
> > -from Identification import *
> > -from Dictionary import *
> > -from CommonDataClass.PlatformClass import *
> > -from CommonDataClass.CommonClass import SkuInfoClass
> > -from BuildToolError import *
> > -from Misc import sdict
> > -import GlobalData
> > -from Table.TableDsc import TableDsc
> > -from Common.LongFilePathSupport import OpenLongFilePath as open
> > -
> > -#
> > -# Global variable
> > -#
> > -Section = {TAB_UNKNOWN.upper() : MODEL_UNKNOWN,
> > - TAB_DSC_DEFINES.upper() : MODEL_META_DATA_HEADER,
> > - TAB_BUILD_OPTIONS.upper() :
> MODEL_META_DATA_BUILD_OPTION,
> > - TAB_SKUIDS.upper() : MODEL_EFI_SKU_ID,
> > - TAB_LIBRARIES.upper() : MODEL_EFI_LIBRARY_INSTANCE,
> > - TAB_LIBRARY_CLASSES.upper() : MODEL_EFI_LIBRARY_CLASS,
> > - TAB_PCDS_FIXED_AT_BUILD_NULL.upper() :
> > MODEL_PCD_FIXED_AT_BUILD,
> > - TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper() :
> > MODEL_PCD_PATCHABLE_IN_MODULE,
> > - TAB_PCDS_FEATURE_FLAG_NULL.upper() :
> > MODEL_PCD_FEATURE_FLAG,
> > - TAB_PCDS_DYNAMIC_EX_NULL.upper() :
> MODEL_PCD_DYNAMIC_EX,
> > - TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL.upper() :
> > MODEL_PCD_DYNAMIC_EX_DEFAULT,
> > - TAB_PCDS_DYNAMIC_EX_VPD_NULL.upper() :
> > MODEL_PCD_DYNAMIC_EX_VPD,
> > - TAB_PCDS_DYNAMIC_EX_HII_NULL.upper() :
> > MODEL_PCD_DYNAMIC_EX_HII,
> > - TAB_PCDS_DYNAMIC_NULL.upper() : MODEL_PCD_DYNAMIC,
> > - TAB_PCDS_DYNAMIC_DEFAULT_NULL.upper() :
> > MODEL_PCD_DYNAMIC_DEFAULT,
> > - TAB_PCDS_DYNAMIC_VPD_NULL.upper() :
> > MODEL_PCD_DYNAMIC_VPD,
> > - TAB_PCDS_DYNAMIC_HII_NULL.upper() :
> > MODEL_PCD_DYNAMIC_HII,
> > - TAB_COMPONENTS.upper() : MODEL_META_DATA_COMPONENT,
> > - TAB_USER_EXTENSIONS.upper() :
> > MODEL_META_DATA_USER_EXTENSION
> > - }
> > -
> > -## Dsc
> > -#
> > -# This class defined the structure used in Dsc object
> > -#
> > -# @param Ffilename: Input value for Ffilename of Inf file, default is
> > None
> > -# @param IsMergeAllArches: Input value for IsMergeAllArches
> > -# True is to merge all arches
> > -# Fales is not to merge all arches
> > -# default is False
> > -# @param IsToPlatform: Input value for IsToPlatform
> > -# True is to transfer to ModuleObject automatically
> > -# False is not to transfer to ModuleObject automatically
> > -# default is False
> > -# @param WorkspaceDir: Input value for current workspace directory,
> > default is None
> > -#
> > -# @var _NullClassIndex: To store value for _NullClassIndex, default is 0
> > -# @var Identification: To store value for Identification, it is a structure as
> > Identification
> > -# @var Defines: To store value for Defines, it is a structure as
> > DscDefines
> > -# @var Contents: To store value for Contents, it is a structure as
> > DscContents
> > -# @var UserExtensions: To store value for UserExtensions
> > -# @var Platform: To store value for Platform, it is a structure as
> > PlatformClass
> > -# @var WorkspaceDir: To store value for WorkspaceDir
> > -# @var KeyList: To store value for KeyList, a list for all Keys used in
> Dec
> > -#
> > -class Dsc(object):
> > - _NullClassIndex = 0
> > -
> > - def __init__(self, Filename=None, IsToDatabase=False,
> > IsToPlatform=False, WorkspaceDir=None, Database=None):
> > - self.Identification = Identification()
> > - self.Platform = PlatformClass()
> > - self.UserExtensions = ''
> > - self.WorkspaceDir = WorkspaceDir
> > - self.IsToDatabase = IsToDatabase
> > - if Database:
> > - self.Cur = Database.Cur
> > - self.TblFile = Database.TblFile
> > - self.TblDsc = Database.TblDsc
> > -
> > - self.KeyList = [
> > - TAB_SKUIDS, TAB_LIBRARIES, TAB_LIBRARY_CLASSES,
> > TAB_BUILD_OPTIONS, TAB_PCDS_FIXED_AT_BUILD_NULL, \
> > - TAB_PCDS_PATCHABLE_IN_MODULE_NULL,
> > TAB_PCDS_FEATURE_FLAG_NULL, \
> > - TAB_PCDS_DYNAMIC_DEFAULT_NULL,
> > TAB_PCDS_DYNAMIC_HII_NULL, TAB_PCDS_DYNAMIC_VPD_NULL, \
> > - TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL,
> > TAB_PCDS_DYNAMIC_EX_HII_NULL,
> TAB_PCDS_DYNAMIC_EX_VPD_NULL, \
> > - TAB_COMPONENTS, TAB_DSC_DEFINES
> > - ]
> > -
> > - self.PcdToken = {}
> > -
> > - #
> > - # Upper all KEYs to ignore case sensitive when parsing
> > - #
> > - self.KeyList = map(lambda c: c.upper(), self.KeyList)
> > -
> > - #
> > - # Init RecordSet
> > - #
> > -# self.RecordSet = {}
> > -# for Key in self.KeyList:
> > -# self.RecordSet[Section[Key]] = []
> > -
> > - #
> > - # Load Dsc file if filename is not None
> > - #
> > - if Filename is not None:
> > - self.LoadDscFile(Filename)
> > -
> > - #
> > - # Transfer to Platform Object if IsToPlatform is True
> > - #
> > - if IsToPlatform:
> > - self.DscToPlatform()
> > -
> > - ## Transfer to Platform Object
> > - #
> > - # Transfer all contents of an Inf file to a standard Module Object
> > - #
> > - def DscToPlatform(self):
> > - #
> > - # Init global information for the file
> > - #
> > - ContainerFile = self.Identification.FileFullPath
> > -
> > - #
> > - # Generate Platform Header
> > - #
> > - self.GenPlatformHeader(ContainerFile)
> > -
> > - #
> > - # Generate BuildOptions
> > - #
> > - self.GenBuildOptions(ContainerFile)
> > -
> > - #
> > - # Generate SkuInfos
> > - #
> > - self.GenSkuInfos(ContainerFile)
> > -
> > - #
> > - # Generate Libraries
> > - #
> > - self.GenLibraries(ContainerFile)
> > -
> > - #
> > - # Generate LibraryClasses
> > - #
> > - self.GenLibraryClasses(ContainerFile)
> > -
> > - #
> > - # Generate Pcds
> > - #
> > - self.GenPcds(DataType.TAB_PCDS_FIXED_AT_BUILD, ContainerFile)
> > - self.GenPcds(DataType.TAB_PCDS_PATCHABLE_IN_MODULE, ContainerFile)
> > - self.GenFeatureFlagPcds(DataType.TAB_PCDS_FEATURE_FLAG, ContainerFile)
> > - self.GenDynamicDefaultPcds(DataType.TAB_PCDS_DYNAMIC_DEFAULT, ContainerFile)
> > - self.GenDynamicDefaultPcds(DataType.TAB_PCDS_DYNAMIC_EX_DEFAULT, ContainerFile)
> > - self.GenDynamicHiiPcds(DataType.TAB_PCDS_DYNAMIC_HII, ContainerFile)
> > - self.GenDynamicHiiPcds(DataType.TAB_PCDS_DYNAMIC_EX_HII, ContainerFile)
> > - self.GenDynamicVpdPcds(DataType.TAB_PCDS_DYNAMIC_VPD, ContainerFile)
> > - self.GenDynamicVpdPcds(DataType.TAB_PCDS_DYNAMIC_EX_VPD, ContainerFile)
> > -
> > - #
> > - # Generate Components
> > - #
> > - self.GenComponents(ContainerFile)
> > -
> > - #
> > - # Update to database
> > - #
> > - if self.IsToDatabase:
> > - for Key in self.PcdToken.keys():
> > - SqlCommand = """update %s set Value2 = '%s' where ID = %s""" %
> > (self.TblDsc.Table, ".".join((self.PcdToken[Key][0],
> self.PcdToken[Key][1])),
> > Key)
> > - self.TblDsc.Exec(SqlCommand)
> > - #End of DscToPlatform
> > -
> > - ## Get Platform Header
> > - #
> > - # Gen Platform Header of Dsc as <Key> = <Value>
> > - #
> > - # @param ContainerFile: The Dsc file full path
> > - #
> > - def GenPlatformHeader(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate PlatformHeader ...")
> > - #
> > - # Update all defines item in database
> > - #
> > - SqlCommand = """select ID, Value1, Arch, StartLine from %s
> > - where Model = %s
> > - and BelongsToFile = %s
> > - and Enabled > -1""" % (self.TblDsc.Table,
> > MODEL_META_DATA_HEADER, self.FileID)
> > - RecordSet = self.TblDsc.Exec(SqlCommand)
> > - for Record in RecordSet:
> > - ValueList = GetSplitValueList(Record[1], TAB_EQUAL_SPLIT)
> > - if len(ValueList) != 2:
> > - RaiseParserError(Record[1], 'Defines', ContainerFile, '<Key> = <Value>', Record[3])
> > - ID, Value1, Value2, Arch = Record[0], ValueList[0], ValueList[1],
> > Record[2]
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblDsc.Table,
> > ConvertToSqlString2(Value1), ConvertToSqlString2(Value2), ID)
> > - self.TblDsc.Exec(SqlCommand)
> > -
> > - #
> > - # Get detailed information
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - PlatformHeader = PlatformHeaderClass()
> > -
> > - PlatformHeader.Name = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_PLATFORM_NAME, Arch, self.FileID)[0]
> > - PlatformHeader.Guid = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_PLATFORM_GUID, Arch, self.FileID)[0]
> > - PlatformHeader.Version = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_PLATFORM_VERSION, Arch, self.FileID)[0]
> > - PlatformHeader.FileName = self.Identification.FileName
> > - PlatformHeader.FullPath = self.Identification.FileFullPath
> > - PlatformHeader.DscSpecification = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_DSC_SPECIFICATION, Arch, self.FileID)[0]
> > -
> > - PlatformHeader.SkuIdName = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_SKUID_IDENTIFIER, Arch, self.FileID)
> > - PlatformHeader.SupArchList = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_SUPPORTED_ARCHITECTURES, Arch, self.FileID)
> > - PlatformHeader.BuildTargets = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_BUILD_TARGETS, Arch, self.FileID)
> > - PlatformHeader.OutputDirectory =
> > NormPath(QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_OUTPUT_DIRECTORY, Arch, self.FileID)[0])
> > - PlatformHeader.BuildNumber = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_BUILD_NUMBER, Arch, self.FileID)[0]
> > - PlatformHeader.MakefileName = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_MAKEFILE_NAME, Arch, self.FileID)[0]
> > -
> > - PlatformHeader.BsBaseAddress = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_BS_BASE_ADDRESS, Arch, self.FileID)[0]
> > - PlatformHeader.RtBaseAddress = QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_RT_BASE_ADDRESS, Arch, self.FileID)[0]
> > -
> > - self.Platform.Header[Arch] = PlatformHeader
> > - Fdf = PlatformFlashDefinitionFileClass()
> > - Fdf.FilePath = NormPath(QueryDefinesItem(self.TblDsc,
> > TAB_DSC_DEFINES_FLASH_DEFINITION, Arch, self.FileID)[0])
> > - self.Platform.FlashDefinitionFile = Fdf
> > - Prebuild = BuildScriptClass()
> > - Prebuild.FilePath = NormPath(QueryDefinesItem(self.TblDsc,
> > TAB_DSC_PREBUILD, Arch, self.FileID)[0])
> > - self.Platform.Prebuild = Prebuild
> > - Postbuild = BuildScriptClass()
> > - Postbuild.FilePath = NormPath(QueryDefinesItem(self.TblDsc,
> > TAB_DSC_POSTBUILD, Arch, self.FileID)[0])
> > - self.Platform.Postbuild = Postbuild
> > -
> > - ## GenBuildOptions
> > - #
> > - # Gen BuildOptions of Dsc
> > - # [<Family>:]<ToolFlag>=Flag
> > - #
> > - # @param ContainerFile: The Dsc file full path
> > - #
> > - def GenBuildOptions(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_BUILD_OPTIONS)
> > - BuildOptions = {}
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, MODEL_META_DATA_BUILD_OPTION,
> > self.FileID)
> > -
> > - #
> > - # Get all BuildOptions
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_BUILD_OPTION, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, TAB_BUILD_OPTIONS, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - (Family, ToolChain, Flag) = GetBuildOption(NewItem, Filename, -1)
> > - MergeArches(BuildOptions, (Family, ToolChain, Flag), Arch)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (Family, ToolChain, Flag) = GetBuildOption(Record[0],
> > ContainerFile, Record[2])
> > - MergeArches(BuildOptions, (Family, ToolChain, Flag), Arch)
> > - #
> > - # Update to Database
> > - #
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s',
> > Value3 = '%s'
> > - where ID = %s""" % (self.TblDsc.Table,
> > ConvertToSqlString2(Family), ConvertToSqlString2(ToolChain),
> > ConvertToSqlString2(Flag), Record[3])
> > - self.TblDsc.Exec(SqlCommand)
> > -
> > - for Key in BuildOptions.keys():
> > - BuildOption = BuildOptionClass(Key[0], Key[1], Key[2])
> > - BuildOption.SupArchList = BuildOptions[Key]
> > - self.Platform.BuildOptions.BuildOptionList.append(BuildOption)
> > -
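For reference, the `[<Family>:]<ToolFlag>=Flag` lines that the removed GenBuildOptions method hands to GetBuildOption can be split with a small standalone sketch. This parser is illustrative only; BaseTools' real GetBuildOption has its own tokenizer and handles more cases.

```python
def parse_build_option(line):
    """Split a '[<Family>:]<ToolFlag>=Flag' entry into (family, tool_flag, flag).

    Illustrative stand-in for BaseTools' GetBuildOption; the family
    prefix (e.g. 'MSFT' or 'GCC') is optional.
    """
    key, eq, flag = line.partition('=')
    if not eq:
        raise ValueError("missing '=' in build option: %r" % line)
    family, colon, tool = key.partition(':')
    if colon:
        return family.strip(), tool.strip(), flag.strip()
    return '', key.strip(), flag.strip()
```

GenBuildOptions then merged each resulting (Family, ToolChain, Flag) tuple per architecture via MergeArches.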
> > - ## GenSkuInfos
> > - #
> > - # Gen SkuInfos of Dsc
> > - # <Integer>|<UiName>
> > - #
> > - # @param ContainerFile: The Dsc file full path
> > - #
> > - def GenSkuInfos(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_SKUIDS)
> > - #
> > - # SkuIds
> > - # <Integer>|<UiName>
> > - #
> > - self.Platform.SkuInfos.SkuInfoList['DEFAULT'] = '0'
> > -
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, MODEL_EFI_SKU_ID, self.FileID)
> > -
> > - #
> > - # Get all SkuInfos
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, MODEL_EFI_SKU_ID, -1,
> > self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, TAB_SKUIDS, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - List = GetSplitValueList(NewItem)
> > - if len(List) != 2:
> > - RaiseParserError(NewItem, TAB_SKUIDS, Filename,
> > '<Integer>|<UiName>')
> > - else:
> > - self.Platform.SkuInfos.SkuInfoList[List[1]] = List[0]
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - List = GetSplitValueList(Record[0])
> > - if len(List) != 2:
> > - RaiseParserError(Record[0], TAB_SKUIDS, ContainerFile,
> > '<Integer>|<UiName>')
> > - else:
> > - self.Platform.SkuInfos.SkuInfoList[List[1]] = List[0]
> > - #
> > - # Update to Database
> > - #
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblDsc.Table,
> > ConvertToSqlString2(List[0]), ConvertToSqlString2(List[1]), Record[3])
> > - self.TblDsc.Exec(SqlCommand)
> > -
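The `<Integer>|<UiName>` entries that the removed GenSkuInfos method validates can be checked with a minimal sketch; the function name here is illustrative, not a BaseTools API.

```python
def parse_sku_line(line):
    """Split a '<Integer>|<UiName>' SkuIds entry into (ui_name, sku_id)."""
    parts = [p.strip() for p in line.split('|')]
    if len(parts) != 2:
        raise ValueError("expected '<Integer>|<UiName>', got %r" % line)
    sku_id, ui_name = parts
    if not sku_id.isdigit():
        raise ValueError("SKU id must be an integer: %r" % sku_id)
    # Keyed by name, mirroring SkuInfoList[List[1]] = List[0] above.
    return ui_name, sku_id
```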
> > - ## GenLibraries
> > - #
> > - # Gen Libraries of Dsc
> > - # <PathAndFilename>
> > - #
> > - # @param ContainerFile: The Dsc file full path
> > - #
> > - def GenLibraries(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_LIBRARIES)
> > - Libraries = {}
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, MODEL_EFI_LIBRARY_INSTANCE,
> > self.FileID)
> > -
> > - #
> > - # Get all Libraries
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc,
> > MODEL_EFI_LIBRARY_INSTANCE, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, TAB_LIBRARIES, '', IncludeFile[2])
> > - if os.path.exists(Filename):
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - MergeArches(Libraries, NewItem, Arch)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - MergeArches(Libraries, Record[0], Arch)
> > -
> > - for Key in Libraries.keys():
> > - Library = PlatformLibraryClass()
> > - Library.FilePath = NormPath(Key)
> > - Library.SupArchList = Libraries[Key]
> > - self.Platform.Libraries.LibraryList.append(Library)
> > -
> > - ## GenLibraryClasses
> > - #
> > - # Get LibraryClasses of Dsc
> > - # <LibraryClassKeyWord>|<LibraryInstance>
> > - #
> > - # @param ContainerFile: The Dsc file full path
> > - #
> > - def GenLibraryClasses(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_LIBRARY_CLASSES)
> > - LibraryClasses = {}
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, MODEL_EFI_LIBRARY_CLASS, self.FileID)
> > -
> > - #
> > - # Get all LibraryClasses
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, MODEL_EFI_LIBRARY_CLASS, -
> 1,
> > self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, TAB_LIBRARY_CLASSES, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - MergeArches(LibraryClasses, GetLibraryClass([NewItem,
> > IncludeFile[4]], Filename, self.WorkspaceDir, -1), Arch)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (LibClassName, LibClassIns, SupModelList) =
> > GetLibraryClass([Record[0], Record[4]], ContainerFile, self.WorkspaceDir,
> > Record[2])
> > - MergeArches(LibraryClasses, (LibClassName, LibClassIns,
> > SupModelList), Arch)
> > - #
> > - # Update to Database
> > - #
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s',
> > Value3 = '%s'
> > - where ID = %s""" % (self.TblDsc.Table,
> > ConvertToSqlString2(LibClassName), ConvertToSqlString2(LibClassIns),
> > ConvertToSqlString2(SupModelList), Record[3])
> > - self.TblDsc.Exec(SqlCommand)
> > -
> > - for Key in LibraryClasses.keys():
> > - Library = PlatformLibraryClass()
> > - Library.Name = Key[0]
> > - Library.FilePath = NormPath(Key[1])
> > - Library.SupModuleList = GetSplitValueList(Key[2])
> > - Library.SupArchList = LibraryClasses[Key]
> > - self.Platform.LibraryClasses.LibraryList.append(Library)
> > -
> > - ## Gen Pcds
> > - #
> > - # Gen Pcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<Type>|<MaximumDatumSize>]
> > - #
> > - # @param Type: The type of Pcd
> > - # @param ContainerFile: The file which describes the pcd, used for
> error
> > report
> > - #
> > - def GenPcds(self, Type='', ContainerFile=''):
> > - Pcds = {}
> > - if Type == DataType.TAB_PCDS_PATCHABLE_IN_MODULE:
> > - Model = MODEL_PCD_PATCHABLE_IN_MODULE
> > - elif Type == DataType.TAB_PCDS_FIXED_AT_BUILD:
> > - Model = MODEL_PCD_FIXED_AT_BUILD
> > - else:
> > - pass
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > -
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, Model, self.FileID)
> > -
> > - #
> > - # Get all Pcds
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, Model, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, Type, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - (TokenName, TokenGuidCName, Value, DatumType,
> > MaxDatumSize, Type) = GetPcd(NewItem, Type, Filename, -1)
> > - MergeArches(Pcds, (TokenName, TokenGuidCName, Value,
> > DatumType, MaxDatumSize, Type), Arch)
> > - self.PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (TokenName, TokenGuidCName, Value, DatumType,
> > MaxDatumSize, Type) = GetPcd(Record[0], Type, ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenName, TokenGuidCName, Value,
> > DatumType, MaxDatumSize, Type), Arch)
> > - self.PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > -
> > - for Key in Pcds:
> > - Pcd = PcdClass(Key[0], '', Key[1], Key[3], Key[4], Key[2], Key[5], [],
> {},
> > [])
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Platform.DynamicPcdBuildDefinitions.append(Pcd)
> > -
> > - ## Gen FeatureFlagPcds
> > - #
> > - # Gen FeatureFlagPcds of Dsc file as
> > <PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE
> > - #
> > - # @param Type: The type of Pcd
> > - # @param ContainerFile: The file which describes the pcd, used for
> error
> > report
> > - #
> > - def GenFeatureFlagPcds(self, Type='', ContainerFile=''):
> > - Pcds = {}
> > - if Type == DataType.TAB_PCDS_FEATURE_FLAG:
> > - Model = MODEL_PCD_FEATURE_FLAG
> > - else:
> > - pass
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > -
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, Model, self.FileID)
> > -
> > - #
> > - # Get all FeatureFlagPcds
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, Model, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, Type, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - (TokenName, TokenGuidCName, Value, Type) =
> > GetFeatureFlagPcd(NewItem, Type, Filename, -1)
> > - MergeArches(Pcds, (TokenName, TokenGuidCName, Value,
> > Type), Arch)
> > - self.PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (TokenName, TokenGuidCName, Value, Type) =
> > GetFeatureFlagPcd(Record[0], Type, ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenName, TokenGuidCName, Value,
> > Type), Arch)
> > - self.PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > -
> > - for Key in Pcds:
> > - Pcd = PcdClass(Key[0], '', Key[1], '', '', Key[2], Key[3], [], {}, [])
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Platform.DynamicPcdBuildDefinitions.append(Pcd)
> > -
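The `<PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE` lines handled by the removed GenFeatureFlagPcds method can be sketched standalone (illustrative names; the real GetFeatureFlagPcd does the equivalent work inside BaseTools):

```python
def parse_feature_flag_pcd(line):
    """Parse '<PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE'."""
    token, bar, value = line.partition('|')
    guid_cname, dot, token_cname = token.strip().partition('.')
    value = value.strip().upper()
    if not bar or not dot or value not in ('TRUE', 'FALSE'):
        raise ValueError("malformed feature flag PCD: %r" % line)
    return guid_cname, token_cname, value
```

The PCD name in the example below is made up for illustration.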
> > - ## Gen DynamicDefaultPcds
> > - #
> > - # Gen DynamicDefaultPcds of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<DatumType>[|<MaxDatumSize>]]
> > - #
> > - # @param Type: The type of Pcd
> > - # @param ContainerFile: The file which describes the pcd, used for
> error
> > report
> > - #
> > - def GenDynamicDefaultPcds(self, Type='', ContainerFile=''):
> > - Pcds = {}
> > - SkuInfoList = {}
> > - if Type == DataType.TAB_PCDS_DYNAMIC_DEFAULT:
> > - Model = MODEL_PCD_DYNAMIC_DEFAULT
> > - elif Type == DataType.TAB_PCDS_DYNAMIC_EX_DEFAULT:
> > - Model = MODEL_PCD_DYNAMIC_EX_DEFAULT
> > - else:
> > - pass
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > -
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, Model, self.FileID)
> > -
> > - #
> > - # Get all DynamicDefaultPcds
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, Model, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, Type, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - (K1, K2, K3, K4, K5, K6) = GetDynamicDefaultPcd(NewItem,
> > Type, Filename, -1)
> > - MergeArches(Pcds, (K1, K2, K3, K4, K5, K6, IncludeFile[4]),
> Arch)
> > - self.PcdToken[Record[3]] = (K2, K1)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (K1, K2, K3, K4, K5, K6) = GetDynamicDefaultPcd(Record[0],
> Type,
> > ContainerFile, Record[2])
> > - MergeArches(Pcds, (K1, K2, K3, K4, K5, K6, Record[4]), Arch)
> > - self.PcdToken[Record[3]] = (K2, K1)
> > -
> > - for Key in Pcds:
> > - (Status, SkuInfoList) = self.GenSkuInfoList(Key[6],
> > self.Platform.SkuInfos.SkuInfoList, '', '', '', '', '', Key[2])
> > - if Status == False:
> > - ErrorMsg = "The SKUID '%s' used in section '%s' is not defined in
> > section [SkuIds]" % (SkuInfoList, Type)
> > - EdkLogger.error("DSC File Parser", PARSER_ERROR, ErrorMsg,
> > ContainerFile, RaiseError=EdkLogger.IsRaiseError)
> > - Pcd = PcdClass(Key[0], '', Key[1], Key[3], Key[4], Key[2], Key[5], [],
> > SkuInfoList, [])
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Platform.DynamicPcdBuildDefinitions.append(Pcd)
> > -
> > - ## Gen DynamicHiiPcds
> > - #
> > - # Gen DynamicHiiPcds of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<String>|<VariableGuidCName>|<VariableOffset>[|<DefaultValue>[|<MaximumDatumSize>]]
> > - #
> > - # @param Type: The type of Pcd
> > - # @param ContainerFile: The file which describes the pcd, used for
> error
> > report
> > - #
> > - def GenDynamicHiiPcds(self, Type='', ContainerFile=''):
> > - Pcds = {}
> > - SkuInfoList = {}
> > - if Type == DataType.TAB_PCDS_DYNAMIC_HII:
> > - Model = MODEL_PCD_DYNAMIC_HII
> > - elif Type == DataType.TAB_PCDS_DYNAMIC_EX_HII:
> > - Model = MODEL_PCD_DYNAMIC_EX_HII
> > - else:
> > - pass
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > -
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, Model, self.FileID)
> > -
> > - #
> > - # Get all DynamicHiiPcds
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, Model, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, Type, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - (K1, K2, K3, K4, K5, K6, K7, K8) = GetDynamicHiiPcd(NewItem,
> > Type, Filename, -1)
> > - MergeArches(Pcds, (K1, K2, K3, K4, K5, K6, K7, K8,
> > IncludeFile[4]), Arch)
> > - self.PcdToken[Record[3]] = (K2, K1)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (K1, K2, K3, K4, K5, K6, K7, K8) = GetDynamicHiiPcd(Record[0],
> > Type, ContainerFile, Record[2])
> > - MergeArches(Pcds, (K1, K2, K3, K4, K5, K6, K7, K8, Record[4]),
> > Arch)
> > - self.PcdToken[Record[3]] = (K2, K1)
> > -
> > - for Key in Pcds:
> > - (Status, SkuInfoList) = self.GenSkuInfoList(Key[8],
> > self.Platform.SkuInfos.SkuInfoList, Key[2], Key[3], Key[4], Key[5], '', '')
> > - if Status == False:
> > - ErrorMsg = "The SKUID '%s' used in section '%s' is not defined in
> > section [SkuIds]" % (SkuInfoList, Type)
> > - EdkLogger.error("DSC File Parser", PARSER_ERROR, ErrorMsg,
> > ContainerFile, RaiseError=EdkLogger.IsRaiseError)
> > - Pcd = PcdClass(Key[0], '', Key[1], '', Key[6], Key[5], Key[7], [],
> > SkuInfoList, [])
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Platform.DynamicPcdBuildDefinitions.append(Pcd)
> > -
> > - ## Gen DynamicVpdPcds
> > - #
> > - # Gen DynamicVpdPcds of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<VpdOffset>[|<MaximumDatumSize>]
> > - #
> > - # @param Type: The type of Pcd
> > - # @param ContainerFile: The file which describes the pcd, used for
> error
> > report
> > - #
> > - def GenDynamicVpdPcds(self, Type='', ContainerFile=''):
> > - Pcds = {}
> > - SkuInfoList = {}
> > - if Type == DataType.TAB_PCDS_DYNAMIC_VPD:
> > - Model = MODEL_PCD_DYNAMIC_VPD
> > - elif Type == DataType.TAB_PCDS_DYNAMIC_EX_VPD:
> > - Model = MODEL_PCD_DYNAMIC_EX_VPD
> > - else:
> > - pass
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > -
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, Model, self.FileID)
> > -
> > - #
> > - # Get all DynamicVpdPcds
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc, Model, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, Type, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - (K1, K2, K3, K4, K5) = GetDynamicVpdPcd(NewItem, Type,
> > Filename, -1)
> > - MergeArches(Pcds, (K1, K2, K3, K4, K5, IncludeFile[4]), Arch)
> > - self.PcdToken[Record[3]] = (K2, K1)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - (K1, K2, K3, K4, K5) = GetDynamicVpdPcd(Record[0], Type,
> > ContainerFile, Record[2])
> > - MergeArches(Pcds, (K1, K2, K3, K4, K5, Record[4]), Arch)
> > - self.PcdToken[Record[3]] = (K2, K1)
> > -
> > - for Key in Pcds:
> > - (Status, SkuInfoList) = self.GenSkuInfoList(Key[5],
> > self.Platform.SkuInfos.SkuInfoList, '', '', '', '', Key[2], '')
> > - if Status == False:
> > - ErrorMsg = "The SKUID '%s' used in section '%s' is not defined in
> > section [SkuIds]" % (SkuInfoList, Type)
> > - EdkLogger.error("DSC File Parser", PARSER_ERROR, ErrorMsg,
> > ContainerFile, RaiseError=EdkLogger.IsRaiseError)
> > - Pcd = PcdClass(Key[0], '', Key[1], '', Key[3], '', Key[4], [], SkuInfoList,
> [])
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Platform.DynamicPcdBuildDefinitions.append(Pcd)
> > -
> > -
> > - ## Get Component
> > - #
> > - # Get Component section defined in Dsc file
> > - #
> > - # @param ContainerFile: The file which describes the Components,
> used
> > for error report
> > - #
> > - # @retval PlatformModuleClass() An instance of PlatformModuleClass
> > - #
> > - def GenComponents(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_COMPONENTS)
> > - Components = sdict()
> > - #
> > - # Get all include files
> > - #
> > - IncludeFiles = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_INCLUDE, MODEL_META_DATA_COMPONENT,
> > self.FileID)
> > -
> > - #
> > - # Get all Components
> > - #
> > - RecordSet = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_COMPONENT, -1, self.FileID)
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in DataType.ARCH_LIST:
> > - for IncludeFile in IncludeFiles:
> > - if IncludeFile[1] == Arch or IncludeFile[1] ==
> > TAB_ARCH_COMMON.upper():
> > - Filename = CheckFileExist(self.WorkspaceDir, IncludeFile[0],
> > ContainerFile, TAB_COMPONENTS, '', IncludeFile[2])
> > - for NewItem in open(Filename, 'r').readlines():
> > - if CleanString(NewItem) == '':
> > - continue
> > - NewItems = []
> > - GetComponents(open(Filename, 'r').read(),
> > TAB_COMPONENTS, NewItems, TAB_COMMENT_SPLIT)
> > - for NewComponent in NewItems:
> > - MergeArches(Components,
> > self.GenComponent(NewComponent, Filename), Arch)
> > -
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] ==
> TAB_ARCH_COMMON.upper():
> > - Lib, Bo, Pcd = [], [], []
> > -
> > - SubLibSet = QueryDscItem(self.TblDsc,
> > MODEL_EFI_LIBRARY_CLASS, Record[3], self.FileID)
> > - for SubLib in SubLibSet:
> > - Lib.append(TAB_VALUE_SPLIT.join([SubLib[0], SubLib[4]]))
> > -
> > - SubBoSet = QueryDscItem(self.TblDsc,
> > MODEL_META_DATA_BUILD_OPTION, Record[3], self.FileID)
> > - for SubBo in SubBoSet:
> > - Bo.append(SubBo[0])
> > -
> > - SubPcdSet1 = QueryDscItem(self.TblDsc,
> > MODEL_PCD_FIXED_AT_BUILD, Record[3], self.FileID)
> > - SubPcdSet2 = QueryDscItem(self.TblDsc,
> > MODEL_PCD_PATCHABLE_IN_MODULE, Record[3], self.FileID)
> > - SubPcdSet3 = QueryDscItem(self.TblDsc,
> > MODEL_PCD_FEATURE_FLAG, Record[3], self.FileID)
> > - SubPcdSet4 = QueryDscItem(self.TblDsc,
> > MODEL_PCD_DYNAMIC_EX_DEFAULT, Record[3], self.FileID)
> > - SubPcdSet5 = QueryDscItem(self.TblDsc,
> > MODEL_PCD_DYNAMIC_DEFAULT, Record[3], self.FileID)
> > - for SubPcd in SubPcdSet1:
> > - Pcd.append([DataType.TAB_PCDS_FIXED_AT_BUILD,
> > SubPcd[0], SubPcd[3]])
> > - for SubPcd in SubPcdSet2:
> > - Pcd.append([DataType.TAB_PCDS_PATCHABLE_IN_MODULE,
> > SubPcd[0], SubPcd[3]])
> > - for SubPcd in SubPcdSet3:
> > - Pcd.append([DataType.TAB_PCDS_FEATURE_FLAG,
> SubPcd[0],
> > SubPcd[3]])
> > - for SubPcd in SubPcdSet4:
> > - Pcd.append([DataType.TAB_PCDS_DYNAMIC_EX, SubPcd[0],
> > SubPcd[3]])
> > - for SubPcd in SubPcdSet5:
> > - Pcd.append([DataType.TAB_PCDS_DYNAMIC, SubPcd[0],
> > SubPcd[3]])
> > - Item = [Record[0], Lib, Bo, Pcd]
> > - MergeArches(Components, self.GenComponent(Item,
> > ContainerFile), Arch)
> > -
> > - for Key in Components.keys():
> > - Key.SupArchList = Components[Key]
> > - self.Platform.Modules.ModuleList.append(Key)
> > -
> > - ## Get Component
> > - #
> > - # Get Component section defined in Dsc file
> > - #
> > - # @param Item: Contents includes a component block
> > - # @param ContainerFile: The file which describes the library class, used
> > for error report
> > - #
> > - # @retval PlatformModuleClass() An instance of PlatformModuleClass
> > - #
> > - def GenComponent(self, Item, ContainerFile, LineNo= -1):
> > - (InfFilename, ExecFilename) = GetExec(Item[0])
> > - LibraryClasses = Item[1]
> > - BuildOptions = Item[2]
> > - Pcds = Item[3]
> > - Component = PlatformModuleClass()
> > - Component.FilePath = NormPath(InfFilename)
> > - Component.ExecFilePath = NormPath(ExecFilename)
> > - CheckFileType(Component.FilePath, '.Inf', ContainerFile, 'component
> > name', Item[0], LineNo)
> > - CheckFileExist(self.WorkspaceDir, Component.FilePath, ContainerFile,
> > 'component', Item[0], LineNo)
> > - for Lib in LibraryClasses:
> > - List = GetSplitValueList(Lib)
> > - if len(List) != 2:
> > - RaiseParserError(Lib, 'LibraryClasses', ContainerFile,
> > '<ClassName>|<InfFilename>')
> > - LibName = List[0]
> > - LibFile = NormPath(List[1])
> > - if LibName == "" or LibName == "NULL":
> > - LibName = "NULL%d" % self._NullClassIndex
> > - self._NullClassIndex += 1
> > - CheckFileType(List[1], '.Inf', ContainerFile, 'library instance of
> > component ', Lib, LineNo)
> > - CheckFileExist(self.WorkspaceDir, LibFile, ContainerFile, 'library
> > instance of component', Lib, LineNo)
> > - Component.LibraryClasses.LibraryList.append(PlatformLibraryClass(LibName, LibFile))
> > - for BuildOption in BuildOptions:
> > - Key = GetBuildOption(BuildOption, ContainerFile)
> > - Component.ModuleSaBuildOption.BuildOptionList.append(BuildOptionClass(Key[0], Key[1], Key[2]))
> > - for Pcd in Pcds:
> > - Type = Pcd[0]
> > - List = GetSplitValueList(Pcd[1])
> > - PcdId = Pcd[2]
> > -
> > - TokenInfo = None
> > - #
> > - # For FeatureFlag
> > - #
> > - if Type == DataType.TAB_PCDS_FEATURE_FLAG:
> > - if len(List) != 2:
> > - RaiseParserError(Pcd[1], 'Components', ContainerFile,
> > '<PcdTokenSpaceGuidCName>.<PcdTokenName>|TRUE/FALSE')
> > -
> > - CheckPcdTokenInfo(List[0], 'Components', ContainerFile)
> > - TokenInfo = GetSplitValueList(List[0], DataType.TAB_SPLIT)
> > - Component.PcdBuildDefinitions.append(PcdClass(TokenInfo[1], '',
> > TokenInfo[0], '', '', List[1], Type, [], {}, []))
> > - #
> > - # For FixedAtBuild or PatchableInModule
> > - #
> > - if Type == DataType.TAB_PCDS_FIXED_AT_BUILD or Type ==
> > DataType.TAB_PCDS_PATCHABLE_IN_MODULE:
> > - List.append('')
> > - if len(List) != 3 and len(List) != 4:
> > - RaiseParserError(Pcd[1], 'Components', ContainerFile, '<PcdTokenSpaceGuidCName>.<PcdTokenName>|<Value>[|<MaxDatumSize>]')
> > -
> > - CheckPcdTokenInfo(List[0], 'Components', ContainerFile)
> > - TokenInfo = GetSplitValueList(List[0], DataType.TAB_SPLIT)
> > - Component.PcdBuildDefinitions.append(PcdClass(TokenInfo[1], '',
> > TokenInfo[0], '', List[2], List[1], Type, [], {}, []))
> > -
> > - #
> > - # For Dynamic or DynamicEx
> > - #
> > - if Type == DataType.TAB_PCDS_DYNAMIC or Type ==
> > DataType.TAB_PCDS_DYNAMIC_EX:
> > - if len(List) != 1:
> > - RaiseParserError(Pcd[1], 'Components', ContainerFile,
> > '<PcdTokenSpaceGuidCName>.<PcdTokenName>')
> > -
> > - CheckPcdTokenInfo(List[0], 'Components', ContainerFile)
> > - TokenInfo = GetSplitValueList(List[0], DataType.TAB_SPLIT)
> > - Component.PcdBuildDefinitions.append(PcdClass(TokenInfo[1], '',
> > TokenInfo[0], '', '', '', Type, [], {}, []))
> > -
> > - #
> > - # Add to PcdToken
> > - #
> > - self.PcdToken[PcdId] = (TokenInfo[0], TokenInfo[1])
> > -
> > - return Component
> > - #End of GenComponent
> > -
> > - ## Gen SkuInfoList
> > - #
> > - # Gen SkuInfoList section defined in Dsc file
> > - #
> > - # @param SkuNameList: Input value for SkuNameList
> > - # @param SkuInfo: Input value for SkuInfo
> > - # @param VariableName: Input value for VariableName
> > - # @param VariableGuid: Input value for VariableGuid
> > - # @param VariableOffset: Input value for VariableOffset
> > - # @param HiiDefaultValue: Input value for HiiDefaultValue
> > - # @param VpdOffset: Input value for VpdOffset
> > - # @param DefaultValue: Input value for DefaultValue
> > - #
> > - # @retval (False, SkuName) Not found in section SkuId Dsc file
> > - # @retval (True, SkuInfoList) Found in section SkuId of Dsc file
> > - #
> > - def GenSkuInfoList(self, SkuNameList, SkuInfo, VariableName='',
> > VariableGuid='', VariableOffset='', HiiDefaultValue='', VpdOffset='',
> > DefaultValue=''):
> > - SkuNameList = GetSplitValueList(SkuNameList)
> > - if SkuNameList is None or SkuNameList == [] or SkuNameList == ['']:
> > - SkuNameList = ['DEFAULT']
> > - SkuInfoList = {}
> > - for Item in SkuNameList:
> > - if Item not in SkuInfo:
> > - return False, Item
> > - Sku = SkuInfoClass(Item, SkuInfo[Item], VariableName,
> VariableGuid,
> > VariableOffset, HiiDefaultValue, VpdOffset, DefaultValue)
> > - SkuInfoList[Item] = Sku
> > -
> > - return True, SkuInfoList
> > -
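The (Status, result) protocol GenSkuInfoList uses above — return (False, bad_name) at the first SKU not defined in [SkuIds], otherwise (True, mapping) — can be sketched as follows. A plain split on '|' stands in for GetSplitValueList, and the fallback to 'DEFAULT' mirrors the removed method.

```python
def gen_sku_info_list(sku_names, known_skus):
    """Return (True, {name: id}) if every listed SKU name is defined,
    otherwise (False, first_undefined_name).

    Empty input falls back to the 'DEFAULT' SKU, as in the removed method.
    """
    names = [n.strip() for n in sku_names.split('|') if n.strip()] or ['DEFAULT']
    result = {}
    for name in names:
        if name not in known_skus:
            return False, name
        result[name] = known_skus[name]
    return True, result
```

The caller can then raise the "SKUID '%s' ... is not defined in section [SkuIds]" parser error on a False status, as the removed Pcd generators do.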
> > - ## Parse Include statement
> > - #
> > - # Get include file path
> > - #
> > - # 1. Insert a record into TblFile ???
> > - # 2. Insert a record into TblDsc
> > - # Value1: IncludeFilePath
> > - #
> > -    # @param LineValue:  The line of the include statement
> > - def ParseInclude(self, LineValue, StartLine, Table, FileID, Filename,
> > SectionName, Model, Arch):
> > - EdkLogger.debug(EdkLogger.DEBUG_2, "!include statement '%s'
> found
> > in section %s" % (LineValue, SectionName))
> > - SectionModel = Section[SectionName.upper()]
> > - IncludeFile =
> >
> CleanString(LineValue[LineValue.upper().find(DataType.TAB_INCLUDE.uppe
> > r() + ' ') + len(DataType.TAB_INCLUDE + ' ') : ])
> > - Table.Insert(Model, IncludeFile, '', '', Arch, SectionModel, FileID,
> > StartLine, -1, StartLine, -1, 0)
> > -
> > - ## Parse DEFINE statement
> > - #
> > - # Get DEFINE macros
> > - #
> > - # 1. Insert a record into TblDsc
> > - # Value1: Macro Name
> > - # Value2: Macro Value
> > - #
> > - def ParseDefine(self, LineValue, StartLine, Table, FileID, Filename,
> > SectionName, Model, Arch):
> > - EdkLogger.debug(EdkLogger.DEBUG_2, "DEFINE statement '%s' found
> in
> > section %s" % (LineValue, SectionName))
> > - SectionModel = Section[SectionName.upper()]
> > - Define =
> >
> GetSplitValueList(CleanString(LineValue[LineValue.upper().find(DataType.TA
> > B_DEFINE.upper() + ' ') + len(DataType.TAB_DEFINE + ' ') : ]),
> > TAB_EQUAL_SPLIT, 1)
> > - Table.Insert(Model, Define[0], Define[1], '', Arch, SectionModel,
> FileID,
> > StartLine, -1, StartLine, -1, 0)
> > -
> > - ## Parse Defines section
> > - #
> > - # Get one item in defines section
> > - #
> > - # Value1: Item Name
> > - # Value2: Item Value
> > - #
> > - def ParseDefinesSection(self, LineValue, StartLine, Table, FileID,
> Filename,
> > SectionName, Model, Arch):
> > - EdkLogger.debug(EdkLogger.DEBUG_2, "Parse '%s' found in section
> %s"
> > % (LineValue, SectionName))
> > - Defines = GetSplitValueList(LineValue, TAB_EQUAL_SPLIT, 1)
> > - if len(Defines) != 2:
> > - RaiseParserError(LineValue, SectionName, Filename, '', StartLine)
> > - self.TblDsc.Insert(Model, Defines[0], Defines[1], '', Arch, -1, FileID,
> > StartLine, -1, StartLine, -1, 0)
> > -
> > - ## Insert conditional statements
> > - #
> > - # Pop an item from IfDefList
> > - # Insert conditional statements to database
> > - #
> > - # @param Filename: Path of parsing file
> > - # @param IfDefList: A list stored current conditional statements
> > - # @param EndLine: The end line no
> > - # @param ArchList: Support arch list
> > - #
> > - def InsertConditionalStatement(self, Filename, FileID, BelongsToItem,
> > IfDefList, EndLine, ArchList):
> > - (Value1, Value2, Value3, Model, StartColumn, EndColumn, Enabled) =
> > ('', '', '', -1, -1, -1, 0)
> > - if IfDefList == []:
> > -            ErrorMsg = 'Unmatched conditional statement in file %s' % Filename
> > - EdkLogger.error("DSC File Parser", PARSER_ERROR, ErrorMsg,
> > Filename, RaiseError=EdkLogger.IsRaiseError)
> > - else:
> > - #
> > - # Get New Dsc item ID
> > - #
> > - DscID = self.TblDsc.GetCount() + 1
> > -
> > - #
> > - # Pop the conditional statements which is closed
> > - #
> > - PreviousIf = IfDefList.pop()
> > - EdkLogger.debug(EdkLogger.DEBUG_5, 'Previous IfDef: ' +
> > str(PreviousIf))
> > -
> > - #
> > - # !ifdef and !ifndef
> > - #
> > - if PreviousIf[2] in
> > (MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF):
> > - Value1 = PreviousIf[0]
> > - Model = PreviousIf[2]
> > - self.TblDsc.Insert(Model, Value1, Value2, Value3, ArchList,
> > BelongsToItem, self.FileID, PreviousIf[1], StartColumn, EndLine,
> EndColumn,
> > Enabled)
> > - #
> > - # !if and !elseif
> > - #
> > - elif PreviousIf[2] in
> > (MODEL_META_DATA_CONDITIONAL_STATEMENT_IF, Model):
> > - List = PreviousIf[0].split(' ')
> > - Value1, Value2, Value3 = '', '==', '0'
> > - if len(List) == 3:
> > - Value1 = List[0]
> > - Value2 = List[1]
> > - Value3 = List[2]
> > - Value3 = SplitString(Value3)
> > - if len(List) == 1:
> > - Value1 = List[0]
> > - Model = PreviousIf[2]
> > - self.TblDsc.Insert(Model, Value1, Value2, Value3, ArchList,
> > BelongsToItem, self.FileID, PreviousIf[1], StartColumn, EndLine,
> EndColumn,
> > Enabled)
> > - #
> > - # !else
> > - #
> > - elif PreviousIf[2] in
> > (MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE, Model):
> > - Value1 = PreviousIf[0].strip()
> > - Model = PreviousIf[2]
> > - self.TblDsc.Insert(Model, Value1, Value2, Value3, ArchList,
> > BelongsToItem, self.FileID, PreviousIf[1], StartColumn, EndLine,
> EndColumn,
> > Enabled)
> > -
> > - ## Load Dsc file
> > - #
> > - # Load the file if it exists
> > - #
> > - # @param Filename: Input value for filename of Dsc file
> > - #
> > - def LoadDscFile(self, Filename):
> > - #
> > - # Insert a record for file
> > - #
> > - Filename = NormPath(Filename)
> > - self.Identification.FileFullPath = Filename
> > - (self.Identification.FileRelativePath, self.Identification.FileName) =
> > os.path.split(Filename)
> > - self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_DSC)
> > -
> > - #
> > - # Init DscTable
> > - #
> > - #self.TblDsc.Table = "Dsc%s" % FileID
> > - #self.TblDsc.Create()
> > -
> > - #
> > -        # Init common data
> > - #
> > - IfDefList, SectionItemList, CurrentSection, ArchList, ThirdList,
> > IncludeFiles = \
> > - [], [], TAB_UNKNOWN, [], [], []
> > - LineNo = 0
> > -
> > - #
> > - # Parse file content
> > - #
> > - IsFindBlockComment = False
> > - ReservedLine = ''
> > - for Line in open(Filename, 'r'):
> > - LineNo = LineNo + 1
> > - #
> > - # Remove comment block
> > - #
> > - if Line.find(TAB_COMMENT_EDK_START) > -1:
> > - ReservedLine = GetSplitList(Line, TAB_COMMENT_EDK_START,
> 1)[0]
> > - IsFindBlockComment = True
> > - if Line.find(TAB_COMMENT_EDK_END) > -1:
> > - Line = ReservedLine + GetSplitList(Line,
> TAB_COMMENT_EDK_END,
> > 1)[1]
> > - ReservedLine = ''
> > - IsFindBlockComment = False
> > - if IsFindBlockComment:
> > - continue
> > -
> > - #
> > - # Remove comments at tail and remove spaces again
> > - #
> > - Line = CleanString(Line)
> > - if Line == '':
> > - continue
> > -
> > - #
> > - # Find a new section tab
> > - # First insert previous section items
> > - # And then parse the content of the new section
> > - #
> > - if Line.startswith(TAB_SECTION_START) and
> > Line.endswith(TAB_SECTION_END):
> > - #
> > - # Insert items data of previous section
> > - #
> > - self.InsertSectionItemsIntoDatabase(self.FileID, Filename,
> > CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList)
> > - #
> > - # Parse the new section
> > - #
> > - SectionItemList = []
> > - ArchList = []
> > - ThirdList = []
> > -
> > - CurrentSection = ''
> > - LineList =
> GetSplitValueList(Line[len(TAB_SECTION_START):len(Line)
> > - len(TAB_SECTION_END)], TAB_COMMA_SPLIT)
> > - for Item in LineList:
> > - ItemList = GetSplitValueList(Item, TAB_SPLIT)
> > - if CurrentSection == '':
> > - CurrentSection = ItemList[0]
> > - else:
> > - if CurrentSection != ItemList[0]:
> > - EdkLogger.error("Parser", PARSER_ERROR, "Different
> section
> > names '%s' and '%s' are found in one section definition, this is not allowed."
> > % (CurrentSection, ItemList[0]), File=Filename, Line=LineNo,
> > RaiseError=EdkLogger.IsRaiseError)
> > - if CurrentSection.upper() not in self.KeyList:
> > - RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
> > - CurrentSection = TAB_UNKNOWN
> > - continue
> > - ItemList.append('')
> > - ItemList.append('')
> > - if len(ItemList) > 5:
> > - RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
> > - else:
> > - if ItemList[1] != '' and ItemList[1].upper() not in
> > ARCH_LIST_FULL:
> > - EdkLogger.error("Parser", PARSER_ERROR, "Invalid Arch
> > definition '%s' found" % ItemList[1], File=Filename, Line=LineNo,
> > RaiseError=EdkLogger.IsRaiseError)
> > - ArchList.append(ItemList[1].upper())
> > - ThirdList.append(ItemList[2])
> > -
> > - continue
> > -
> > - #
> > - # Not in any defined section
> > - #
> > - if CurrentSection == TAB_UNKNOWN:
> > - ErrorMsg = "%s is not in any defined section" % Line
> > - EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg,
> File=Filename,
> > Line=LineNo, RaiseError=EdkLogger.IsRaiseError)
> > -
> > - #
> > - # Add a section item
> > - #
> > - SectionItemList.append([Line, LineNo])
> > - # End of parse
> > - #End of For
> > -
> > - #
> > - # Insert items data of last section
> > - #
> > - self.InsertSectionItemsIntoDatabase(self.FileID, Filename,
> > CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList)
> > -
> > - #
> > - # Parse conditional statements
> > - #
> > - self.ParseConditionalStatement()
> > -
> > - #
> > - # Replace all DEFINE macros with its actual values
> > - #
> > - #ParseDefineMacro2(self.TblDsc, self.RecordSet,
> > GlobalData.gGlobalDefines)
> > - ParseDefineMacro(self.TblDsc, GlobalData.gGlobalDefines)
> > -
> > -
> > - ## ParseConditionalStatement
> > - #
> > -    # Search all conditional statements and disable non-matching records
> > - #
> > - def ParseConditionalStatement(self):
> > - #
> > -        # Disable all !if/!elif/!ifdef statements without a corresponding DEFINE
> > - #
> > - SqlCommand = """select A.StartLine, A.EndLine from %s as A
> > - where A.Model in (%s, %s, %s)
> > - and A.Enabled = 0
> > - and A.BelongsToFile = %s
> > - and A.Value1 not in (select B.Value1 from %s as B
> > - where B.Model = %s
> > - and B.Enabled = 0
> > - and A.StartLine > B.StartLine
> > - and A.Arch = B.Arch
> > - and A.BelongsToItem = B.BelongsToItem
> > - and A.BelongsToFile = B.BelongsToFile) """ % \
> > - (self.TblDsc.Table, \
> > - MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF, \
> > - self.FileID, \
> > - self.TblDsc.Table, \
> > - MODEL_META_DATA_DEFINE)
> > - RecordSet = self.TblDsc.Exec(SqlCommand)
> > - for Record in RecordSet:
> > - SqlCommand = """Update %s set Enabled = -1 where StartLine >= %s
> > and EndLine <= %s""" % (self.TblDsc.Table, Record[0], Record[1])
> > - self.TblDsc.Exec(SqlCommand)
> > -
> > - #
> > -        # Disable !ifndef statements with a corresponding DEFINE
> > - #
> > - SqlCommand = """select A.StartLine, A.EndLine from %s as A
> > - where A.Model = %s
> > - and A.Enabled = 0
> > - and A.BelongsToFile = %s
> > - and A.Value1 in (select B.Value1 from %s as B
> > - where B.Model = %s
> > - and B.Enabled = 0
> > - and A.StartLine > B.StartLine
> > - and A.Arch = B.Arch
> > - and A.BelongsToItem = B.BelongsToItem
> > - and A.BelongsToFile = B.BelongsToFile)""" % \
> > - (self.TblDsc.Table, \
> > - MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF, \
> > - self.FileID, \
> > - self.TblDsc.Table, \
> > - MODEL_META_DATA_DEFINE)
> > - RecordSet = self.TblDsc.Exec(SqlCommand)
> > - for Record in RecordSet:
> > - SqlCommand = """Update %s set Enabled = -1 where StartLine >= %s
> > and EndLine <= %s""" % (self.TblDsc.Table, Record[0], Record[1])
> > - EdkLogger.debug(4, "SqlCommand: %s" % SqlCommand)
> > - self.Cur.execute(SqlCommand)
> > -
> > - #
> > -        # Disable !if, !elif and !else statements with unmatched values
> > - #
> > - SqlCommand = """select A.Model, A.Value1, A.Value2, A.Value3,
> > A.StartLine, A.EndLine, B.Value2 from %s as A join %s as B
> > - where A.Model in (%s, %s)
> > - and A.Enabled = 0
> > - and A.BelongsToFile = %s
> > - and B.Enabled = 0
> > - and B.Model = %s
> > - and A.Value1 = B.Value1
> > - and A.StartLine > B.StartLine
> > - and A.BelongsToItem = B.BelongsToItem
> > - and A.BelongsToFile = B.BelongsToFile""" % \
> > - (self.TblDsc.Table, self.TblDsc.Table, \
> > - MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE, \
> > - self.FileID, MODEL_META_DATA_DEFINE)
> > - RecordSet = self.TblDsc.Exec(SqlCommand)
> > - DisabledList = []
> > - for Record in RecordSet:
> > - if Record[0] ==
> MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:
> > - if not self.Compare(Record[6], Record[2], Record[3]):
> > - SqlCommand = """Update %s set Enabled = -1 where StartLine
> >=
> > %s and EndLine <= %s""" % (self.TblDsc.Table, Record[4], Record[5])
> > - self.TblDsc.Exec(SqlCommand)
> > - else:
> > - DisabledList.append(Record[1])
> > - continue
> > - if Record[0] ==
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE and Record[1] in
> > DisabledList:
> > - SqlCommand = """Update %s set Enabled = -1 where StartLine >=
> > %s and EndLine <= %s""" % (self.TblDsc.Table, Record[4], Record[5])
> > - self.TblDsc.Exec(SqlCommand)
> > -
> > - ## Compare
> > - #
> > - # Compare two values
> > - # @param Value1:
> > - # @param CompareType:
> > - # @param Value2:
> > - #
> > - def Compare(self, Value1, CompareType, Value2):
> > - Command = """Value1 %s Value2""" % CompareType
> > - return eval(Command)
> > -
> > - ## First time to insert records to database
> > - #
> > - # Insert item data of a section to database
> > - # @param FileID: The ID of belonging file
> > - # @param Filename: The name of belonging file
> > -    # @param CurrentSection:  The name of current section
> > - # @param SectionItemList: A list of items of the section
> > - # @param ArchList: A list of arches
> > - # @param ThirdList: A list of third parameters, ModuleType for
> > LibraryClass and SkuId for Dynamic Pcds
> > - # @param IfDefList: A list of all conditional statements
> > - #
> > - def InsertSectionItemsIntoDatabase(self, FileID, Filename,
> > CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList):
> > - #
> > - # Insert each item data of a section
> > - #
> > - for Index in range(0, len(ArchList)):
> > - Arch = ArchList[Index]
> > - Third = ThirdList[Index]
> > - if Arch == '':
> > - Arch = TAB_ARCH_COMMON.upper()
> > -
> > - Model = Section[CurrentSection.upper()]
> > - #Records = self.RecordSet[Model]
> > -
> > - for SectionItem in SectionItemList:
> > - BelongsToItem, EndLine, EndColumn = -1, -1, -1
> > - LineValue, StartLine, EndLine = SectionItem[0], SectionItem[1],
> > SectionItem[1]
> > -
> > -
> > - EdkLogger.debug(4, "Parsing %s ..." % LineValue)
> > - #
> > - # Parse '!ifdef'
> > - #
> > - if LineValue.upper().find(TAB_IF_DEF.upper()) > -1:
> > - IfDefList.append((LineValue[len(TAB_IF_N_DEF):].strip(),
> > StartLine, MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF))
> > - continue
> > -
> > - #
> > - # Parse '!ifndef'
> > - #
> > - if LineValue.upper().find(TAB_IF_N_DEF.upper()) > -1:
> > - IfDefList.append((LineValue[len(TAB_IF_N_DEF):].strip(),
> > StartLine, MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF))
> > - continue
> > -
> > - #
> > - # Parse '!endif'
> > - #
> > - if LineValue.upper().find(TAB_END_IF.upper()) > -1:
> > - self.InsertConditionalStatement(Filename, FileID, Model,
> > IfDefList, StartLine, Arch)
> > - continue
> > - #
> > - # Parse '!if'
> > - #
> > - if LineValue.upper().find(TAB_IF.upper()) > -1:
> > - IfDefList.append((LineValue[len(TAB_IF):].strip(), StartLine,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_IF))
> > - continue
> > -
> > - #
> > - # Parse '!elseif'
> > - #
> > - if LineValue.upper().find(TAB_ELSE_IF.upper()) > -1:
> > - self.InsertConditionalStatement(Filename, FileID, Model,
> > IfDefList, StartLine - 1, Arch)
> > - IfDefList.append((LineValue[len(TAB_ELSE_IF):].strip(),
> StartLine,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_IF))
> > - continue
> > -
> > - #
> > - # Parse '!else'
> > - #
> > - if LineValue.upper().find(TAB_ELSE.upper()) > -1:
> > - Key = IfDefList[-1][0].split(' ' , 1)[0].strip()
> > - self.InsertConditionalStatement(Filename, FileID, Model,
> > IfDefList, StartLine, Arch)
> > - IfDefList.append((Key, StartLine,
> > MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE))
> > - continue
> > -
> > - #
> > - # Parse !include statement first
> > - #
> > - if LineValue.upper().find(DataType.TAB_INCLUDE.upper() + ' ') > -
> 1:
> > - self.ParseInclude(LineValue, StartLine, self.TblDsc, FileID,
> > Filename, CurrentSection, MODEL_META_DATA_INCLUDE, Arch)
> > - continue
> > -
> > - #
> > - # And then parse DEFINE statement
> > - #
> > - if LineValue.upper().find(DataType.TAB_DEFINE.upper() + ' ') > -1:
> > - self.ParseDefine(LineValue, StartLine, self.TblDsc, FileID,
> > Filename, CurrentSection, MODEL_META_DATA_DEFINE, Arch)
> > - continue
> > -
> > - #
> > - # At last parse other sections
> > - #
> > - if CurrentSection == TAB_LIBRARY_CLASSES or CurrentSection in
> > TAB_PCD_DYNAMIC_TYPE_LIST or CurrentSection in
> > TAB_PCD_DYNAMIC_EX_TYPE_LIST:
> > - ID = self.TblDsc.Insert(Model, LineValue, Third, '', Arch, -1, FileID,
> > StartLine, -1, StartLine, -1, 0)
> > - #Records.append([LineValue, Arch, StartLine, ID, Third])
> > - continue
> > - elif CurrentSection != TAB_COMPONENTS:
> > - ID = self.TblDsc.Insert(Model, LineValue, '', '', Arch, -1, FileID,
> > StartLine, -1, StartLine, -1, 0)
> > - #Records.append([LineValue, Arch, StartLine, ID, Third])
> > - continue
> > -
> > - #
> > - # Parse COMPONENT section
> > - #
> > - if CurrentSection == TAB_COMPONENTS:
> > - Components = []
> > - GetComponent(SectionItemList, Components)
> > - for Component in Components:
> > - EdkLogger.debug(4, "Parsing component %s ..." % Component)
> > - DscItmeID =
> > self.TblDsc.Insert(MODEL_META_DATA_COMPONENT, Component[0], '', '',
> > Arch, -1, FileID, StartLine, -1, StartLine, -1, 0)
> > - for Item in Component[1]:
> > - List = GetSplitValueList(Item, MaxSplit=2)
> > - LibName, LibIns = '', ''
> > - if len(List) == 2:
> > - LibName = List[0]
> > - LibIns = List[1]
> > - else:
> > - LibName = List[0]
> > - self.TblDsc.Insert(MODEL_EFI_LIBRARY_CLASS, LibName,
> LibIns,
> > '', Arch, DscItmeID, FileID, StartLine, -1, StartLine, -1, 0)
> > - for Item in Component[2]:
> > - self.TblDsc.Insert(MODEL_META_DATA_BUILD_OPTION,
> Item,
> > '', '', Arch, DscItmeID, FileID, StartLine, -1, StartLine, -1, 0)
> > - for Item in Component[3]:
> > - Model = Section[Item[0].upper()]
> > - self.TblDsc.Insert(Model, Item[1], '', '', Arch, DscItmeID, FileID,
> > StartLine, -1, StartLine, -1, 0)
> > -
> > - ## Show detailed information of Dsc
> > - #
> > - # Print all members and their values of Dsc class
> > - #
> > - def ShowDsc(self):
> > - print TAB_SECTION_START + TAB_INF_DEFINES + TAB_SECTION_END
> > - printDict(self.Defines.DefinesDictionary)
> > -
> > - for Key in self.KeyList:
> > - for Arch in DataType.ARCH_LIST_FULL:
> > - Command = "printList(TAB_SECTION_START + '" + \
> > - Key + DataType.TAB_SPLIT + Arch + \
> > - "' + TAB_SECTION_END, self.Contents[arch]." + Key + ')'
> > - eval(Command)
> > -
> > - ## Show detailed information of Platform
> > - #
> > - # Print all members and their values of Platform class
> > - #
> > - def ShowPlatform(self):
> > - M = self.Platform
> > - for Arch in M.Header.keys():
> > - print '\nArch =', Arch
> > - print 'Filename =', M.Header[Arch].FileName
> > - print 'FullPath =', M.Header[Arch].FullPath
> > - print 'BaseName =', M.Header[Arch].Name
> > - print 'Guid =', M.Header[Arch].Guid
> > - print 'Version =', M.Header[Arch].Version
> > - print 'DscSpecification =', M.Header[Arch].DscSpecification
> > - print 'SkuId =', M.Header[Arch].SkuIdName
> > - print 'SupArchList =', M.Header[Arch].SupArchList
> > - print 'BuildTargets =', M.Header[Arch].BuildTargets
> > - print 'OutputDirectory =', M.Header[Arch].OutputDirectory
> > - print 'BuildNumber =', M.Header[Arch].BuildNumber
> > - print 'MakefileName =', M.Header[Arch].MakefileName
> > - print 'BsBaseAddress =', M.Header[Arch].BsBaseAddress
> > - print 'RtBaseAddress =', M.Header[Arch].RtBaseAddress
> > - print 'Define =', M.Header[Arch].Define
> > - print 'Fdf =', M.FlashDefinitionFile.FilePath
> > - print '\nBuildOptions =', M.BuildOptions, M.BuildOptions.IncludeFiles
> > - for Item in M.BuildOptions.BuildOptionList:
> > - print '\t', 'ToolChainFamily =', Item.ToolChainFamily, 'ToolChain =',
> > Item.ToolChain, 'Option =', Item.Option, 'Arch =', Item.SupArchList
> > - print '\nSkuIds =', M.SkuInfos.SkuInfoList, M.SkuInfos.IncludeFiles
> > - print '\nLibraries =', M.Libraries, M.Libraries.IncludeFiles
> > - for Item in M.Libraries.LibraryList:
> > - print '\t', Item.FilePath, Item.SupArchList, Item.Define
> > - print '\nLibraryClasses =', M.LibraryClasses,
> > M.LibraryClasses.IncludeFiles
> > - for Item in M.LibraryClasses.LibraryList:
> > - print '\t', Item.Name, Item.FilePath, Item.SupModuleList,
> > Item.SupArchList, Item.Define
> > - print '\nPcds =', M.DynamicPcdBuildDefinitions
> > - for Item in M.DynamicPcdBuildDefinitions:
> > - print '\tCname=', Item.CName, 'TSG=',
> Item.TokenSpaceGuidCName,
> > 'Value=', Item.DefaultValue, 'Token=', Item.Token, 'Type=',
> Item.ItemType,
> > 'Datum=', Item.DatumType, 'Size=', Item.MaxDatumSize, 'Arch=',
> > Item.SupArchList, Item.SkuInfoList
> > - for Sku in Item.SkuInfoList.values():
> > - print '\t\t', str(Sku)
> > - print '\nComponents =', M.Modules.ModuleList,
> > M.Modules.IncludeFiles
> > - for Item in M.Modules.ModuleList:
> > - print '\t', Item.FilePath, Item.ExecFilePath, Item.SupArchList
> > - for Lib in Item.LibraryClasses.LibraryList:
> > - print '\t\tLib:', Lib.Name, Lib.FilePath
> > - for Bo in Item.ModuleSaBuildOption.BuildOptionList:
> > - print '\t\tBuildOption:', Bo.ToolChainFamily, Bo.ToolChain,
> > Bo.Option
> > - for Pcd in Item.PcdBuildDefinitions:
> > - print '\t\tPcd:', Pcd.CName, Pcd.TokenSpaceGuidCName,
> > Pcd.MaxDatumSize, Pcd.DefaultValue, Pcd.ItemType
> > -
> > -##
> > -#
> > -# This acts like the main() function for the script, unless it is 'import'ed into
> > another
> > -# script.
> > -#
> > -if __name__ == '__main__':
> > - EdkLogger.Initialize()
> > - EdkLogger.SetLevel(EdkLogger.DEBUG_0)
> > -
> > - W = os.getenv('WORKSPACE')
> > - F = os.path.join(W, 'Nt32Pkg/Nt32Pkg.dsc')
> > -
> > - Db = Database.Database('Dsc.db')
> > - Db.InitDatabase()
> > -
> > - P = Dsc(os.path.normpath(F), True, True, W, Db)
> > - P.ShowPlatform()
> > -
> > - Db.Close()
> > diff --git a/BaseTools/Source/Python/Common/FdfClassObject.py
> > b/BaseTools/Source/Python/Common/FdfClassObject.py
> > deleted file mode 100644
> > index 9a7d6494d331..000000000000
> > --- a/BaseTools/Source/Python/Common/FdfClassObject.py
> > +++ /dev/null
> > @@ -1,106 +0,0 @@
> > -## @file
> > -# This file is used to define each component of FDF file
> > -#
> > -# Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
> > -# This program and the accompanying materials
> > -# are licensed and made available under the terms and conditions of the
> > BSD License
> > -# which accompanies this distribution. The full text of the license may be
> > found at
> > -# http://opensource.org/licenses/bsd-license.php
> > -#
> > -# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS"
> > BASIS,
> > -# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER
> > EXPRESS OR IMPLIED.
> > -#
> > -
> > -##
> > -# Import Modules
> > -#
> > -from FdfParserLite import FdfParser
> > -from Table.TableFdf import TableFdf
> > -from CommonDataClass.DataClass import MODEL_FILE_FDF, MODEL_PCD,
> > MODEL_META_DATA_COMPONENT
> > -from String import NormPath
> > -
> > -
> > -## Fdf
> > -#
> > -# This class defined the structure used in Fdf object
> > -#
> > -# @param Filename:      Input value for filename of Fdf file, default is
> None
> > -# @param WorkspaceDir: Input value for current workspace directory,
> > default is None
> > -#
> > -class Fdf(object):
> > - def __init__(self, Filename = None, IsToDatabase = False, WorkspaceDir
> =
> > None, Database = None):
> > - self.WorkspaceDir = WorkspaceDir
> > - self.IsToDatabase = IsToDatabase
> > -
> > - self.Cur = Database.Cur
> > - self.TblFile = Database.TblFile
> > - self.TblFdf = Database.TblFdf
> > - self.FileID = -1
> > - self.FileList = {}
> > -
> > - #
> > - # Load Fdf file if filename is not None
> > - #
> > - if Filename is not None:
> > - self.LoadFdfFile(Filename)
> > -
> > - #
> > - # Insert a FDF file record into database
> > - #
> > - def InsertFile(self, Filename):
> > - FileID = -1
> > - Filename = NormPath(Filename)
> > - if Filename not in self.FileList:
> > - FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_FDF)
> > - self.FileList[Filename] = FileID
> > -
> > - return self.FileList[Filename]
> > -
> > -
> > - ## Load Fdf file
> > - #
> > - # Load the file if it exists
> > - #
> > - # @param Filename: Input value for filename of Fdf file
> > - #
> > - def LoadFdfFile(self, Filename):
> > - FileList = []
> > - #
> > - # Parse Fdf file
> > - #
> > - Filename = NormPath(Filename)
> > - Fdf = FdfParser(Filename)
> > - Fdf.ParseFile()
> > -
> > - #
> > - # Insert inf file and pcd information
> > - #
> > - if self.IsToDatabase:
> > - (Model, Value1, Value2, Value3, Arch, BelongsToItem,
> BelongsToFile,
> > StartLine, StartColumn, EndLine, EndColumn, Enabled) = \
> > - (0, '', '', '', 'COMMON', -1, -1, -1, -1, -1, -1, 0)
> > - for Index in range(0, len(Fdf.Profile.PcdDict)):
> > - pass
> > - for Key in Fdf.Profile.PcdDict.keys():
> > - Model = MODEL_PCD
> > - Value1 = ''
> > - Value2 = ".".join((Key[1], Key[0]))
> > - FileName = Fdf.Profile.PcdFileLineDict[Key][0]
> > - StartLine = Fdf.Profile.PcdFileLineDict[Key][1]
> > - BelongsToFile = self.InsertFile(FileName)
> > - self.TblFdf.Insert(Model, Value1, Value2, Value3, Arch,
> > BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine,
> EndColumn,
> > Enabled)
> > - for Index in range(0, len(Fdf.Profile.InfList)):
> > - Model = MODEL_META_DATA_COMPONENT
> > - Value1 = Fdf.Profile.InfList[Index]
> > - Value2 = ''
> > - FileName = Fdf.Profile.InfFileLineList[Index][0]
> > - StartLine = Fdf.Profile.InfFileLineList[Index][1]
> > - BelongsToFile = self.InsertFile(FileName)
> > - self.TblFdf.Insert(Model, Value1, Value2, Value3, Arch,
> > BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine,
> EndColumn,
> > Enabled)
> > -
> > -##
> > -#
> > -# This acts like the main() function for the script, unless it is 'import'ed into
> > another
> > -# script.
> > -#
> > -if __name__ == '__main__':
> > - pass
> > diff --git a/BaseTools/Source/Python/Common/InfClassObject.py
> > b/BaseTools/Source/Python/Common/InfClassObject.py
> > deleted file mode 100644
> > index 7a5ba4eb84ce..000000000000
> > --- a/BaseTools/Source/Python/Common/InfClassObject.py
> > +++ /dev/null
> > @@ -1,1105 +0,0 @@
> > -## @file
> > -# This file is used to define each component of INF file
> > -#
> > -# Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
> > -# This program and the accompanying materials
> > -# are licensed and made available under the terms and conditions of the
> > BSD License
> > -# which accompanies this distribution. The full text of the license may be
> > found at
> > -# http://opensource.org/licenses/bsd-license.php
> > -#
> > -# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS"
> > BASIS,
> > -# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER
> > EXPRESS OR IMPLIED.
> > -#
> > -
> > -##
> > -# Import Modules
> > -#
> > -import Common.LongFilePathOs as os
> > -import re
> > -import EdkLogger
> > -from CommonDataClass.CommonClass import LibraryClassClass
> > -from CommonDataClass.ModuleClass import *
> > -from String import *
> > -from DataType import *
> > -from Identification import *
> > -from Dictionary import *
> > -from BuildToolError import *
> > -from Misc import sdict
> > -import GlobalData
> > -from Table.TableInf import TableInf
> > -import Database
> > -from Parsing import *
> > -from Common.LongFilePathSupport import OpenLongFilePath as open
> > -
> > -#
> > -# Global variable
> > -#
> > -Section = {TAB_UNKNOWN.upper() : MODEL_UNKNOWN,
> > - TAB_INF_DEFINES.upper() : MODEL_META_DATA_HEADER,
> > - TAB_BUILD_OPTIONS.upper() :
> MODEL_META_DATA_BUILD_OPTION,
> > - TAB_INCLUDES.upper() : MODEL_EFI_INCLUDE,
> > - TAB_LIBRARIES.upper() : MODEL_EFI_LIBRARY_INSTANCE,
> > - TAB_LIBRARY_CLASSES.upper() : MODEL_EFI_LIBRARY_CLASS,
> > - TAB_PACKAGES.upper() : MODEL_META_DATA_PACKAGE,
> > - TAB_NMAKE.upper() : MODEL_META_DATA_NMAKE,
> > - TAB_INF_FIXED_PCD.upper() : MODEL_PCD_FIXED_AT_BUILD,
> > - TAB_INF_PATCH_PCD.upper() :
> > MODEL_PCD_PATCHABLE_IN_MODULE,
> > - TAB_INF_FEATURE_PCD.upper() : MODEL_PCD_FEATURE_FLAG,
> > - TAB_INF_PCD_EX.upper() : MODEL_PCD_DYNAMIC_EX,
> > - TAB_INF_PCD.upper() : MODEL_PCD_DYNAMIC,
> > - TAB_SOURCES.upper() : MODEL_EFI_SOURCE_FILE,
> > - TAB_GUIDS.upper() : MODEL_EFI_GUID,
> > - TAB_PROTOCOLS.upper() : MODEL_EFI_PROTOCOL,
> > - TAB_PPIS.upper() : MODEL_EFI_PPI,
> > - TAB_DEPEX.upper() : MODEL_EFI_DEPEX,
> > - TAB_BINARIES.upper() : MODEL_EFI_BINARY_FILE,
> > - TAB_USER_EXTENSIONS.upper() :
> > MODEL_META_DATA_USER_EXTENSION
> > - }
> > -
> > -gComponentType2ModuleType = {
> > - "LIBRARY" : "BASE",
> > - "SECURITY_CORE" : "SEC",
> > - "PEI_CORE" : "PEI_CORE",
> > - "COMBINED_PEIM_DRIVER" : "PEIM",
> > - "PIC_PEIM" : "PEIM",
> > - "RELOCATABLE_PEIM" : "PEIM",
> > - "PE32_PEIM" : "PEIM",
> > - "BS_DRIVER" : "DXE_DRIVER",
> > - "RT_DRIVER" : "DXE_RUNTIME_DRIVER",
> > - "SAL_RT_DRIVER" : "DXE_SAL_DRIVER",
> > - "APPLICATION" : "UEFI_APPLICATION",
> > - "LOGO" : "BASE",
> > -}
> > -
> > -gNmakeFlagPattern = re.compile("(?:EBC_)?([A-
> > Z]+)_(?:STD_|PROJ_|ARCH_)?FLAGS(?:_DLL|_ASL|_EXE)?", re.UNICODE)
> > -gNmakeFlagName2ToolCode = {
> > - "C" : "CC",
> > - "LIB" : "SLINK",
> > - "LINK" : "DLINK",
> > -}
> > -
> > -class InfHeader(ModuleHeaderClass):
> > - _Mapping_ = {
> > - #
> > - # Required Fields
> > - #
> > - TAB_INF_DEFINES_BASE_NAME : "Name",
> > - TAB_INF_DEFINES_FILE_GUID : "Guid",
> > - TAB_INF_DEFINES_MODULE_TYPE : "ModuleType",
> > - TAB_INF_DEFINES_EFI_SPECIFICATION_VERSION :
> > "UefiSpecificationVersion",
> > - TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION :
> > "UefiSpecificationVersion",
> > - TAB_INF_DEFINES_EDK_RELEASE_VERSION :
> "EdkReleaseVersion",
> > - #
> > - # Optional Fields
> > - #
> > - TAB_INF_DEFINES_INF_VERSION : "InfVersion",
> > - TAB_INF_DEFINES_BINARY_MODULE : "BinaryModule",
> > - TAB_INF_DEFINES_COMPONENT_TYPE : "ComponentType",
> > - TAB_INF_DEFINES_MAKEFILE_NAME : "MakefileName",
> > - TAB_INF_DEFINES_BUILD_NUMBER : "BuildNumber",
> > - TAB_INF_DEFINES_BUILD_TYPE : "BuildType",
> > - TAB_INF_DEFINES_FFS_EXT : "FfsExt",
> > - TAB_INF_DEFINES_FV_EXT : "FvExt",
> > - TAB_INF_DEFINES_SOURCE_FV : "SourceFv",
> > - TAB_INF_DEFINES_VERSION_NUMBER : "VersionNumber",
> > - TAB_INF_DEFINES_VERSION_STRING : "VersionString",
> > - TAB_INF_DEFINES_VERSION : "Version",
> > - TAB_INF_DEFINES_PCD_IS_DRIVER : "PcdIsDriver",
> > - TAB_INF_DEFINES_TIANO_EDK_FLASHMAP_H :
> > "TianoEdkFlashMap_h",
> > - TAB_INF_DEFINES_SHADOW : "Shadow",
> > -# TAB_INF_DEFINES_LIBRARY_CLASS : "LibraryClass",
> > -# TAB_INF_DEFINES_ENTRY_POINT : "ExternImages",
> > -# TAB_INF_DEFINES_UNLOAD_IMAGE : "ExternImages",
> > -# TAB_INF_DEFINES_CONSTRUCTOR : ,
> > -# TAB_INF_DEFINES_DESTRUCTOR : ,
> > -# TAB_INF_DEFINES_DEFINE : "Define",
> > -# TAB_INF_DEFINES_SPEC : "Specification",
> > -# TAB_INF_DEFINES_CUSTOM_MAKEFILE : "CustomMakefile",
> > -# TAB_INF_DEFINES_MACRO :
> > - }
> > -
> > - def __init__(self):
> > - ModuleHeaderClass.__init__(self)
> > - self.VersionNumber = ''
> > - self.VersionString = ''
> > - #print self.__dict__
> > - def __setitem__(self, key, value):
> > - self.__dict__[self._Mapping_[key]] = value
> > - def __getitem__(self, key):
> > - return self.__dict__[self._Mapping_[key]]
> > - ## "in" test support
> > - def __contains__(self, key):
> > - return key in self._Mapping_
> > -
> > -## Inf
> > -#
> > -# This class defined the structure used in Inf object
> > -#
> > -# @param Filename:      Input value for filename of Inf file, default is
> > None
> > -# @param IsMergeAllArches: Input value for IsMergeAllArches
> > -# True is to merge all arches
> > -#                       False is not to merge all arches
> > -# default is False
> > -# @param IsToModule: Input value for IsToModule
> > -# True is to transfer to ModuleObject automatically
> > -# False is not to transfer to ModuleObject automatically
> > -# default is False
> > -# @param WorkspaceDir: Input value for current workspace directory,
> > default is None
> > -#
> > -# @var Identification: To store value for Identification, it is a structure as
> > Identification
> > -# @var UserExtensions: To store value for UserExtensions
> > -# @var Module: To store value for Module, it is a structure as ModuleClass
> > -# @var WorkspaceDir: To store value for WorkspaceDir
> > -# @var KeyList: To store value for KeyList, a list for all Keys used in Inf
> > -#
> > -class Inf(object):
> > - def __init__(self, Filename=None, IsToDatabase=False, IsToModule=False, WorkspaceDir=None, Database=None, SupArchList=DataType.ARCH_LIST):
> > - self.Identification = Identification()
> > - self.Module = ModuleClass()
> > - self.UserExtensions = ''
> > - self.WorkspaceDir = WorkspaceDir
> > - self.SupArchList = SupArchList
> > - self.IsToDatabase = IsToDatabase
> > -
> > - self.Cur = Database.Cur
> > - self.TblFile = Database.TblFile
> > - self.TblInf = Database.TblInf
> > - self.FileID = -1
> > - #self.TblInf = TableInf(Database.Cur)
> > -
> > - self.KeyList = [
> > - TAB_SOURCES, TAB_BUILD_OPTIONS, TAB_BINARIES, TAB_INCLUDES, TAB_GUIDS,
> > - TAB_PROTOCOLS, TAB_PPIS, TAB_LIBRARY_CLASSES, TAB_PACKAGES, TAB_LIBRARIES,
> > - TAB_INF_FIXED_PCD, TAB_INF_PATCH_PCD, TAB_INF_FEATURE_PCD, TAB_INF_PCD,
> > - TAB_INF_PCD_EX, TAB_DEPEX, TAB_NMAKE, TAB_INF_DEFINES
> > - ]
> > - #
> > - # Upper all KEYs to ignore case sensitive when parsing
> > - #
> > - self.KeyList = map(lambda c: c.upper(), self.KeyList)
> > -
> > - #
> > - # Init RecordSet
> > - #
> > - self.RecordSet = {}
> > - for Key in self.KeyList:
> > - self.RecordSet[Section[Key]] = []
> > -
> > - #
> > - # Load Inf file if filename is not None
> > - #
> > - if Filename is not None:
> > - self.LoadInfFile(Filename)
> > -
> > - #
> > - # Transfer to Module Object if IsToModule is True
> > - #
> > - if IsToModule:
> > - self.InfToModule()
> > -
> > - ## Transfer to Module Object
> > - #
> > - # Transfer all contents of an Inf file to a standard Module Object
> > - #
> > - def InfToModule(self):
> > - #
> > - # Init global information for the file
> > - #
> > - ContainerFile = self.Identification.FileFullPath
> > -
> > - #
> > - # Generate Package Header
> > - #
> > - self.GenModuleHeader(ContainerFile)
> > -
> > - #
> > - # Generate BuildOptions
> > - #
> > - self.GenBuildOptions(ContainerFile)
> > -
> > - #
> > - # Generate Includes
> > - #
> > - self.GenIncludes(ContainerFile)
> > -
> > - #
> > - # Generate Libraries
> > - #
> > - self.GenLibraries(ContainerFile)
> > -
> > - #
> > - # Generate LibraryClasses
> > - #
> > - self.GenLibraryClasses(ContainerFile)
> > -
> > - #
> > - # Generate Packages
> > - #
> > - self.GenPackages(ContainerFile)
> > -
> > - #
> > - # Generate Nmakes
> > - #
> > - self.GenNmakes(ContainerFile)
> > -
> > - #
> > - # Generate Pcds
> > - #
> > - self.GenPcds(ContainerFile)
> > -
> > - #
> > - # Generate Sources
> > - #
> > - self.GenSources(ContainerFile)
> > -
> > - #
> > - # Generate UserExtensions
> > - #
> > - self.GenUserExtensions(ContainerFile)
> > -
> > - #
> > - # Generate Guids
> > - #
> > - self.GenGuidProtocolPpis(DataType.TAB_GUIDS, ContainerFile)
> > -
> > - #
> > - # Generate Protocols
> > - #
> > - self.GenGuidProtocolPpis(DataType.TAB_PROTOCOLS, ContainerFile)
> > -
> > - #
> > - # Generate Ppis
> > - #
> > - self.GenGuidProtocolPpis(DataType.TAB_PPIS, ContainerFile)
> > -
> > - #
> > - # Generate Depexes
> > - #
> > - self.GenDepexes(ContainerFile)
> > -
> > - #
> > - # Generate Binaries
> > - #
> > - self.GenBinaries(ContainerFile)
> > -
> > - ## Parse [Defines] section
> > - #
> > - # Parse [Defines] section into InfDefines object
> > - #
> > - # @param InfFile The path of the INF file
> > - # @param Section The title of "Defines" section
> > - # @param Lines The content of "Defines" section
> > - #
> > - def ParseDefines(self, InfFile, Section, Lines):
> > - TokenList = Section.split(TAB_SPLIT)
> > - if len(TokenList) == 3:
> > - RaiseParserError(Section, "Defines", InfFile, "[xx.yy.%s] format (with platform) is not supported")
> > - if len(TokenList) == 2:
> > - Arch = TokenList[1].upper()
> > - else:
> > - Arch = TAB_ARCH_COMMON
> > -
> > - if Arch not in self.Defines:
> > - self.Defines[Arch] = InfDefines()
> > - GetSingleValueOfKeyFromLines(Lines, self.Defines[Arch].DefinesDictionary,
> > - TAB_COMMENT_SPLIT, TAB_EQUAL_SPLIT, False, None)
> > -
> > - ## Load Inf file
> > - #
> > - # Load the file if it exists
> > - #
> > - # @param Filename: Input value for filename of Inf file
> > - #
> > - def LoadInfFile(self, Filename):
> > - #
> > - # Insert a record for file
> > - #
> > - Filename = NormPath(Filename)
> > - self.Identification.FileFullPath = Filename
> > - (self.Identification.FileRelativePath, self.Identification.FileName) = os.path.split(Filename)
> > - self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_INF)
> > -
> > - #
> > - # Init InfTable
> > - #
> > - #self.TblInf.Table = "Inf%s" % self.FileID
> > - #self.TblInf.Create()
> > -
> > - #
> > - # Init common datas
> > - #
> > - IfDefList, SectionItemList, CurrentSection, ArchList, ThirdList, IncludeFiles = \
> > - [], [], TAB_UNKNOWN, [], [], []
> > - LineNo = 0
> > -
> > - #
> > - # Parse file content
> > - #
> > - IsFindBlockComment = False
> > - ReservedLine = ''
> > - for Line in open(Filename, 'r'):
> > - LineNo = LineNo + 1
> > - #
> > - # Remove comment block
> > - #
> > - if Line.find(TAB_COMMENT_EDK_START) > -1:
> > - ReservedLine = GetSplitList(Line, TAB_COMMENT_EDK_START, 1)[0]
> > - IsFindBlockComment = True
> > - if Line.find(TAB_COMMENT_EDK_END) > -1:
> > - Line = ReservedLine + GetSplitList(Line, TAB_COMMENT_EDK_END, 1)[1]
> > - ReservedLine = ''
> > - IsFindBlockComment = False
> > - if IsFindBlockComment:
> > - continue
> > -
> > - #
> > - # Remove comments at tail and remove spaces again
> > - #
> > - Line = CleanString(Line)
> > - if Line == '':
> > - continue
> > -
> > - #
> > - # Find a new section tab
> > - # First insert previous section items
> > - # And then parse the content of the new section
> > - #
> > - if Line.startswith(TAB_SECTION_START) and Line.endswith(TAB_SECTION_END):
> > - if Line[1:3] == "--":
> > - continue
> > - Model = Section[CurrentSection.upper()]
> > - #
> > - # Insert items data of previous section
> > - #
> > - InsertSectionItemsIntoDatabase(self.TblInf, self.FileID, Filename, Model, CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList, self.RecordSet)
> > - #
> > - # Parse the new section
> > - #
> > - SectionItemList = []
> > - ArchList = []
> > - ThirdList = []
> > -
> > - CurrentSection = ''
> > - LineList = GetSplitValueList(Line[len(TAB_SECTION_START):len(Line) - len(TAB_SECTION_END)], TAB_COMMA_SPLIT)
> > - for Item in LineList:
> > - ItemList = GetSplitValueList(Item, TAB_SPLIT)
> > - if CurrentSection == '':
> > - CurrentSection = ItemList[0]
> > - else:
> > - if CurrentSection != ItemList[0]:
> > - EdkLogger.error("Parser", PARSER_ERROR, "Different section names '%s' and '%s' are found in one section definition, this is not allowed." % (CurrentSection, ItemList[0]), File=Filename, Line=LineNo, RaiseError=EdkLogger.IsRaiseError)
> > - if CurrentSection.upper() not in self.KeyList:
> > - RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
> > - CurrentSection = TAB_UNKNOWN
> > - continue
> > - ItemList.append('')
> > - ItemList.append('')
> > - if len(ItemList) > 5:
> > - RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
> > - else:
> > - if ItemList[1] != '' and ItemList[1].upper() not in ARCH_LIST_FULL:
> > - EdkLogger.error("Parser", PARSER_ERROR, "Invalid Arch definition '%s' found" % ItemList[1], File=Filename, Line=LineNo, RaiseError=EdkLogger.IsRaiseError)
> > - ArchList.append(ItemList[1].upper())
> > - ThirdList.append(ItemList[2])
> > -
> > - continue
> > -
> > - #
> > - # Not in any defined section
> > - #
> > - if CurrentSection == TAB_UNKNOWN:
> > - ErrorMsg = "%s is not in any defined section" % Line
> > - EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg, File=Filename, Line=LineNo, RaiseError=EdkLogger.IsRaiseError)
> > -
> > - #
> > - # Add a section item
> > - #
> > - SectionItemList.append([Line, LineNo])
> > - # End of parse
> > - #End of For
> > -
> > - #
> > - # Insert items data of last section
> > - #
> > - Model = Section[CurrentSection.upper()]
> > - InsertSectionItemsIntoDatabase(self.TblInf, self.FileID, Filename, Model, CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList, self.RecordSet)
> > -
> > - #
> > - # Replace all DEFINE macros with its actual values
> > - #
> > - ParseDefineMacro2(self.TblInf, self.RecordSet, GlobalData.gGlobalDefines)
> > -
> > - ## Show detailed information of Module
> > - #
> > - # Print all members and their values of Module class
> > - #
> > - def ShowModule(self):
> > - M = self.Module
> > - for Arch in M.Header.keys():
> > - print '\nArch =', Arch
> > - print 'Filename =', M.Header[Arch].FileName
> > - print 'FullPath =', M.Header[Arch].FullPath
> > - print 'BaseName =', M.Header[Arch].Name
> > - print 'Guid =', M.Header[Arch].Guid
> > - print 'Version =', M.Header[Arch].Version
> > - print 'InfVersion =', M.Header[Arch].InfVersion
> > - print 'UefiSpecificationVersion =', M.Header[Arch].UefiSpecificationVersion
> > - print 'EdkReleaseVersion =', M.Header[Arch].EdkReleaseVersion
> > - print 'ModuleType =', M.Header[Arch].ModuleType
> > - print 'BinaryModule =', M.Header[Arch].BinaryModule
> > - print 'ComponentType =', M.Header[Arch].ComponentType
> > - print 'MakefileName =', M.Header[Arch].MakefileName
> > - print 'BuildNumber =', M.Header[Arch].BuildNumber
> > - print 'BuildType =', M.Header[Arch].BuildType
> > - print 'FfsExt =', M.Header[Arch].FfsExt
> > - print 'FvExt =', M.Header[Arch].FvExt
> > - print 'SourceFv =', M.Header[Arch].SourceFv
> > - print 'PcdIsDriver =', M.Header[Arch].PcdIsDriver
> > - print 'TianoEdkFlashMap_h =', M.Header[Arch].TianoEdkFlashMap_h
> > - print 'Shadow =', M.Header[Arch].Shadow
> > - print 'LibraryClass =', M.Header[Arch].LibraryClass
> > - for Item in M.Header[Arch].LibraryClass:
> > - print Item.LibraryClass, DataType.TAB_VALUE_SPLIT.join(Item.SupModuleList)
> > - print 'CustomMakefile =', M.Header[Arch].CustomMakefile
> > - print 'Define =', M.Header[Arch].Define
> > - print 'Specification =', M.Header[Arch].Specification
> > - for Item in self.Module.ExternImages:
> > - print '\nEntry_Point = %s, UnloadImage = %s' % (Item.ModuleEntryPoint, Item.ModuleUnloadImage)
> > - for Item in self.Module.ExternLibraries:
> > - print 'Constructor = %s, Destructor = %s' % (Item.Constructor, Item.Destructor)
> > - print '\nBuildOptions =', M.BuildOptions
> > - for Item in M.BuildOptions:
> > - print Item.ToolChainFamily, Item.ToolChain, Item.Option, Item.SupArchList
> > - print '\nIncludes =', M.Includes
> > - for Item in M.Includes:
> > - print Item.FilePath, Item.SupArchList
> > - print '\nLibraries =', M.Libraries
> > - for Item in M.Libraries:
> > - print Item.Library, Item.SupArchList
> > - print '\nLibraryClasses =', M.LibraryClasses
> > - for Item in M.LibraryClasses:
> > - print Item.LibraryClass, Item.RecommendedInstance, Item.FeatureFlag, Item.SupModuleList, Item.SupArchList, Item.Define
> > - print '\nPackageDependencies =', M.PackageDependencies
> > - for Item in M.PackageDependencies:
> > - print Item.FilePath, Item.SupArchList, Item.FeatureFlag
> > - print '\nNmake =', M.Nmake
> > - for Item in M.Nmake:
> > - print Item.Name, Item.Value, Item.SupArchList
> > - print '\nPcds =', M.PcdCodes
> > - for Item in M.PcdCodes:
> > - print '\tCName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, Item.SupArchList
> > - print '\nSources =', M.Sources
> > - for Source in M.Sources:
> > - print Source.SourceFile, 'Fam=', Source.ToolChainFamily, 'Pcd=', Source.FeatureFlag, 'Tag=', Source.TagName, 'ToolCode=', Source.ToolCode, Source.SupArchList
> > - print '\nUserExtensions =', M.UserExtensions
> > - for UserExtension in M.UserExtensions:
> > - print UserExtension.UserID, UserExtension.Identifier, UserExtension.Content
> > - print '\nGuids =', M.Guids
> > - for Item in M.Guids:
> > - print Item.CName, Item.SupArchList, Item.FeatureFlag
> > - print '\nProtocols =', M.Protocols
> > - for Item in M.Protocols:
> > - print Item.CName, Item.SupArchList, Item.FeatureFlag
> > - print '\nPpis =', M.Ppis
> > - for Item in M.Ppis:
> > - print Item.CName, Item.SupArchList, Item.FeatureFlag
> > - print '\nDepex =', M.Depex
> > - for Item in M.Depex:
> > - print Item.Depex, Item.SupArchList, Item.Define
> > - print '\nBinaries =', M.Binaries
> > - for Binary in M.Binaries:
> > - print 'Type=', Binary.FileType, 'Target=', Binary.Target, 'Name=', Binary.BinaryFile, 'FeatureFlag=', Binary.FeatureFlag, 'SupArchList=', Binary.SupArchList
> > -
> > - ## Convert [Defines] section content to ModuleHeaderClass
> > - #
> > - # Convert [Defines] section content to ModuleHeaderClass
> > - #
> > - # @param Defines The content under [Defines] section
> > - # @param ModuleHeader An object of ModuleHeaderClass
> > - # @param Arch The supported ARCH
> > - #
> > - def GenModuleHeader(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate ModuleHeader ...")
> > - File = self.Identification.FileFullPath
> > - #
> > - # Update all defines item in database
> > - #
> > - RecordSet = self.RecordSet[MODEL_META_DATA_HEADER]
> > - for Record in RecordSet:
> > - ValueList = GetSplitValueList(Record[0], TAB_EQUAL_SPLIT)
> > - if len(ValueList) != 2:
> > - RaiseParserError(Record[0], 'Defines', ContainerFile, '<Key> = <Value>', Record[2])
> > - ID, Value1, Value2, Arch, LineNo = Record[3], ValueList[0], ValueList[1], Record[1], Record[2]
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblInf.Table, ConvertToSqlString2(Value1), ConvertToSqlString2(Value2), ID)
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - for Arch in DataType.ARCH_LIST:
> > - ModuleHeader = InfHeader()
> > - ModuleHeader.FileName = self.Identification.FileName
> > - ModuleHeader.FullPath = self.Identification.FileFullPath
> > - DefineList = QueryDefinesItem2(self.TblInf, Arch, self.FileID)
> > -
> > - NotProcessedDefineList = []
> > - for D in DefineList:
> > - if D[0] in ModuleHeader:
> > - ModuleHeader[D[0]] = GetSplitValueList(D[1])[0]
> > - else:
> > - NotProcessedDefineList.append(D)
> > -
> > - if ModuleHeader.ComponentType == "LIBRARY":
> > - Lib = LibraryClassClass()
> > - Lib.LibraryClass = ModuleHeader.Name
> > - Lib.SupModuleList = DataType.SUP_MODULE_LIST
> > - ModuleHeader.LibraryClass.append(Lib)
> > -
> > - # we need to make some key defines resolved first
> > - for D in NotProcessedDefineList:
> > - if D[0] == TAB_INF_DEFINES_LIBRARY_CLASS:
> > - List = GetSplitValueList(D[1], DataType.TAB_VALUE_SPLIT, 1)
> > - Lib = LibraryClassClass()
> > - Lib.LibraryClass = CleanString(List[0])
> > - if len(List) == 1:
> > - Lib.SupModuleList = DataType.SUP_MODULE_LIST
> > - elif len(List) == 2:
> > - Lib.SupModuleList = GetSplitValueList(CleanString(List[1]), ' ')
> > - ModuleHeader.LibraryClass.append(Lib)
> > - elif D[0] == TAB_INF_DEFINES_CUSTOM_MAKEFILE:
> > - List = D[1].split(DataType.TAB_VALUE_SPLIT)
> > - if len(List) == 2:
> > - ModuleHeader.CustomMakefile[CleanString(List[0])] = CleanString(List[1])
> > - else:
> > - RaiseParserError(D[1], 'CUSTOM_MAKEFILE of Defines', File, 'CUSTOM_MAKEFILE=<Family>|<Filename>', D[2])
> > - elif D[0] == TAB_INF_DEFINES_ENTRY_POINT:
> > - Image = ModuleExternImageClass()
> > - Image.ModuleEntryPoint = CleanString(D[1])
> > - self.Module.ExternImages.append(Image)
> > - elif D[0] == TAB_INF_DEFINES_UNLOAD_IMAGE:
> > - Image = ModuleExternImageClass()
> > - Image.ModuleUnloadImage = CleanString(D[1])
> > - self.Module.ExternImages.append(Image)
> > - elif D[0] == TAB_INF_DEFINES_CONSTRUCTOR:
> > - LibraryClass = ModuleExternLibraryClass()
> > - LibraryClass.Constructor = CleanString(D[1])
> > - self.Module.ExternLibraries.append(LibraryClass)
> > - elif D[0] == TAB_INF_DEFINES_DESTRUCTOR:
> > - LibraryClass = ModuleExternLibraryClass()
> > - LibraryClass.Destructor = CleanString(D[1])
> > - self.Module.ExternLibraries.append(LibraryClass)
> > - elif D[0] == TAB_INF_DEFINES_DEFINE:
> > - List = D[1].split(DataType.TAB_EQUAL_SPLIT)
> > - if len(List) != 2:
> > - RaiseParserError(Item, 'DEFINE of Defines', File, 'DEFINE <Word> = <Word>', D[2])
> > - else:
> > - ModuleHeader.Define[CleanString(List[0])] = CleanString(List[1])
> > - elif D[0] == TAB_INF_DEFINES_SPEC:
> > - List = D[1].split(DataType.TAB_EQUAL_SPLIT)
> > - if len(List) != 2:
> > - RaiseParserError(Item, 'SPEC of Defines', File, 'SPEC <Word> = <Version>', D[2])
> > - else:
> > - ModuleHeader.Specification[CleanString(List[0])] = CleanString(List[1])
> > -
> > - #
> > - # Get version of INF
> > - #
> > - if ModuleHeader.InfVersion != "":
> > - # EdkII inf
> > - VersionNumber = ModuleHeader.VersionNumber
> > - VersionString = ModuleHeader.VersionString
> > - if len(VersionNumber) > 0 and len(VersionString) == 0:
> > - EdkLogger.warn(2000, 'VERSION_NUMBER depricated; INF file %s should be modified to use VERSION_STRING instead.' % self.Identification.FileFullPath)
> > - ModuleHeader.Version = VersionNumber
> > - if len(VersionString) > 0:
> > - if len(VersionNumber) > 0:
> > - EdkLogger.warn(2001, 'INF file %s defines both VERSION_NUMBER and VERSION_STRING, using VERSION_STRING' % self.Identification.FileFullPath)
> > - ModuleHeader.Version = VersionString
> > - else:
> > - # Edk inf
> > - ModuleHeader.InfVersion = "0x00010000"
> > - if ModuleHeader.ComponentType in gComponentType2ModuleType:
> > - ModuleHeader.ModuleType = gComponentType2ModuleType[ModuleHeader.ComponentType]
> > - elif ModuleHeader.ComponentType != '':
> > - EdkLogger.error("Parser", PARSER_ERROR, "Unsupported Edk component type [%s]" % ModuleHeader.ComponentType, ExtraData=File, RaiseError=EdkLogger.IsRaiseError)
> > -
> > - self.Module.Header[Arch] = ModuleHeader
> > -
> > -
> > - ## GenBuildOptions
> > - #
> > - # Gen BuildOptions of Inf
> > - # [<Family>:]<ToolFlag>=Flag
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenBuildOptions(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_BUILD_OPTIONS)
> > - BuildOptions = {}
> > - #
> > - # Get all BuildOptions
> > - #
> > - RecordSet = self.RecordSet[MODEL_META_DATA_BUILD_OPTION]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (Family, ToolChain, Flag) = GetBuildOption(Record[0], ContainerFile, Record[2])
> > - MergeArches(BuildOptions, (Family, ToolChain, Flag), Arch)
> > - #
> > - # Update to Database
> > - #
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s', Value3 = '%s'
> > - where ID = %s""" % (self.TblInf.Table, ConvertToSqlString2(Family), ConvertToSqlString2(ToolChain), ConvertToSqlString2(Flag), Record[3])
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - for Key in BuildOptions.keys():
> > - BuildOption = BuildOptionClass(Key[0], Key[1], Key[2])
> > - BuildOption.SupArchList = BuildOptions[Key]
> > - self.Module.BuildOptions.append(BuildOption)
> > -
> > - ## GenIncludes
> > - #
> > - # Gen Includes of Inf
> > - #
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenIncludes(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_INCLUDES)
> > - Includes = sdict()
> > - #
> > - # Get all Includes
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_INCLUDE]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - MergeArches(Includes, Record[0], Arch)
> > -
> > - for Key in Includes.keys():
> > - Include = IncludeClass()
> > - Include.FilePath = NormPath(Key)
> > - Include.SupArchList = Includes[Key]
> > - self.Module.Includes.append(Include)
> > -
> > - ## GenLibraries
> > - #
> > - # Gen Libraries of Inf
> > - #
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenLibraries(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_LIBRARIES)
> > - Libraries = sdict()
> > - #
> > - # Get all Includes
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_LIBRARY_INSTANCE]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - MergeArches(Libraries, Record[0], Arch)
> > -
> > - for Key in Libraries.keys():
> > - Library = ModuleLibraryClass()
> > - # replace macro and remove file extension
> > - Library.Library = Key.rsplit('.', 1)[0]
> > - Library.SupArchList = Libraries[Key]
> > - self.Module.Libraries.append(Library)
> > -
> > - ## GenLibraryClasses
> > - #
> > - # Get LibraryClass of Inf
> > - # <LibraryClassKeyWord>|<LibraryInstance>
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenLibraryClasses(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_LIBRARY_CLASSES)
> > - LibraryClasses = {}
> > - #
> > - # Get all LibraryClasses
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_LIBRARY_CLASS]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (LibClassName, LibClassIns, Pcd, SupModelList) = GetLibraryClassOfInf([Record[0], Record[4]], ContainerFile, self.WorkspaceDir, Record[2])
> > - MergeArches(LibraryClasses, (LibClassName, LibClassIns, Pcd, SupModelList), Arch)
> > - #
> > - # Update to Database
> > - #
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s', Value3 = '%s'
> > - where ID = %s""" % (self.TblInf.Table, ConvertToSqlString2(LibClassName), ConvertToSqlString2(LibClassIns), ConvertToSqlString2(SupModelList), Record[3])
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - for Key in LibraryClasses.keys():
> > - KeyList = Key[0].split(DataType.TAB_VALUE_SPLIT)
> > - LibraryClass = LibraryClassClass()
> > - LibraryClass.LibraryClass = Key[0]
> > - LibraryClass.RecommendedInstance = NormPath(Key[1])
> > - LibraryClass.FeatureFlag = Key[2]
> > - LibraryClass.SupArchList = LibraryClasses[Key]
> > - LibraryClass.SupModuleList = GetSplitValueList(Key[3])
> > - self.Module.LibraryClasses.append(LibraryClass)
> > -
> > - ## GenPackages
> > - #
> > - # Gen Packages of Inf
> > - #
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenPackages(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_PACKAGES)
> > - Packages = {}
> > - #
> > - # Get all Packages
> > - #
> > - RecordSet = self.RecordSet[MODEL_META_DATA_PACKAGE]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (Package, Pcd) = GetPackage(Record[0], ContainerFile, self.WorkspaceDir, Record[2])
> > - MergeArches(Packages, (Package, Pcd), Arch)
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblInf.Table, ConvertToSqlString2(Package), ConvertToSqlString2(Pcd), Record[3])
> > - self.TblInf.Exec(SqlCommand)
> > -
> > -
> > - for Key in Packages.keys():
> > - Package = ModulePackageDependencyClass()
> > - Package.FilePath = NormPath(Key[0])
> > - Package.SupArchList = Packages[Key]
> > - Package.FeatureFlag = Key[1]
> > - self.Module.PackageDependencies.append(Package)
> > -
> > - ## GenNmakes
> > - #
> > - # Gen Nmakes of Inf
> > - #
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenNmakes(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_NMAKE)
> > - Nmakes = sdict()
> > - #
> > - # Get all Nmakes
> > - #
> > - RecordSet = self.RecordSet[MODEL_META_DATA_NMAKE]
> > -
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - MergeArches(Nmakes, Record[0], Arch)
> > -
> > - for Key in Nmakes.keys():
> > - List = GetSplitValueList(Key, DataType.TAB_EQUAL_SPLIT, MaxSplit=1)
> > - if len(List) != 2:
> > - RaiseParserError(Key, 'Nmake', ContainerFile, '<MacroName> = <Value>')
> > - continue
> > - Nmake = ModuleNmakeClass()
> > - Nmake.Name = List[0]
> > - Nmake.Value = List[1]
> > - Nmake.SupArchList = Nmakes[Key]
> > - self.Module.Nmake.append(Nmake)
> > -
> > - # convert Edk format to EdkII format
> > - if Nmake.Name == "IMAGE_ENTRY_POINT":
> > - Image = ModuleExternImageClass()
> > - Image.ModuleEntryPoint = Nmake.Value
> > - self.Module.ExternImages.append(Image)
> > - elif Nmake.Name == "DPX_SOURCE":
> > - Source = ModuleSourceFileClass(NormPath(Nmake.Value), "", "", "", "", Nmake.SupArchList)
> > - self.Module.Sources.append(Source)
> > - else:
> > - ToolList = gNmakeFlagPattern.findall(Nmake.Name)
> > - if len(ToolList) == 0 or len(ToolList) != 1:
> > - EdkLogger.warn("\nParser", "Don't know how to do with MACRO: %s" % Nmake.Name,
> > - ExtraData=ContainerFile)
> > - else:
> > - if ToolList[0] in gNmakeFlagName2ToolCode:
> > - Tool = gNmakeFlagName2ToolCode[ToolList[0]]
> > - else:
> > - Tool = ToolList[0]
> > - BuildOption = BuildOptionClass("MSFT", "*_*_*_%s_FLAGS" % Tool, Nmake.Value)
> > - BuildOption.SupArchList = Nmake.SupArchList
> > - self.Module.BuildOptions.append(BuildOption)
> > -
> > - ## GenPcds
> > - #
> > - # Gen Pcds of Inf
> > - # <TokenSpaceGuidCName>.<PcdCName>[|<Value>]
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenPcds(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_PCDS)
> > - Pcds = {}
> > - PcdToken = {}
> > -
> > - #
> > - # Get all Guids
> > - #
> > - RecordSet1 = self.RecordSet[MODEL_PCD_FIXED_AT_BUILD]
> > - RecordSet2 = self.RecordSet[MODEL_PCD_PATCHABLE_IN_MODULE]
> > - RecordSet3 = self.RecordSet[MODEL_PCD_FEATURE_FLAG]
> > - RecordSet4 = self.RecordSet[MODEL_PCD_DYNAMIC_EX]
> > - RecordSet5 = self.RecordSet[MODEL_PCD_DYNAMIC]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet1:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - if self.Module.Header[Arch].LibraryClass != {}:
> > - pass
> > - (TokenGuidCName, TokenName, Value, Type) = GetPcdOfInf(Record[0], TAB_PCDS_FIXED_AT_BUILD, ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet2:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, Type) = GetPcdOfInf(Record[0], TAB_PCDS_PATCHABLE_IN_MODULE, ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet3:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, Type) = GetPcdOfInf(Record[0], TAB_PCDS_FEATURE_FLAG, ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet4:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, Type) = GetPcdOfInf(Record[0], TAB_PCDS_DYNAMIC_EX, ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - for Record in RecordSet5:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (TokenGuidCName, TokenName, Value, Type) = GetPcdOfInf(Record[0], "", ContainerFile, Record[2])
> > - MergeArches(Pcds, (TokenGuidCName, TokenName, Value, Type), Arch)
> > - PcdToken[Record[3]] = (TokenGuidCName, TokenName)
> > - #
> > - # Update to database
> > - #
> > - if self.IsToDatabase:
> > - for Key in PcdToken.keys():
> > - SqlCommand = """update %s set Value2 = '%s' where ID = %s""" % (self.TblInf.Table, ".".join((PcdToken[Key][0], PcdToken[Key][1])), Key)
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - for Key in Pcds.keys():
> > - Pcd = PcdClass()
> > - Pcd.CName = Key[1]
> > - Pcd.TokenSpaceGuidCName = Key[0]
> > - Pcd.DefaultValue = Key[2]
> > - Pcd.ItemType = Key[3]
> > - Pcd.SupArchList = Pcds[Key]
> > - self.Module.PcdCodes.append(Pcd)
> > -
> > - ## GenSources
> > - #
> > - # Gen Sources of Inf
> > - # <Filename>[|<Family>[|<TagName>[|<ToolCode>[|<PcdFeatureFlag>]]]]
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenSources(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_SOURCES)
> > - Sources = {}
> > -
> > - #
> > - # Get all Nmakes
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_SOURCE_FILE]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (Filename, Family, TagName, ToolCode, Pcd) = GetSource(Record[0], ContainerFile, self.Identification.FileRelativePath, Record[2])
> > - MergeArches(Sources, (Filename, Family, TagName, ToolCode, Pcd), Arch)
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s', Value3 = '%s', Value4 = '%s', Value5 = '%s'
> > - where ID = %s""" % (self.TblInf.Table, ConvertToSqlString2(Filename), ConvertToSqlString2(Family), ConvertToSqlString2(TagName), ConvertToSqlString2(ToolCode), ConvertToSqlString2(Pcd), Record[3])
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - for Key in Sources.keys():
> > - Source = ModuleSourceFileClass(Key[0], Key[2], Key[3], Key[1], Key[4], Sources[Key])
> > - self.Module.Sources.append(Source)
> > -
> > - ## GenUserExtensions
> > - #
> > - # Gen UserExtensions of Inf
> > - #
> > - def GenUserExtensions(self, ContainerFile):
> > -# #
> > -# # UserExtensions
> > -# #
> > -# if self.UserExtensions != '':
> > -# UserExtension = UserExtensionsClass()
> > -# Lines = self.UserExtensions.splitlines()
> > -# List = GetSplitValueList(Lines[0], DataType.TAB_SPLIT, 2)
> > -# if len(List) != 3:
> > -# RaiseParserError(Lines[0], 'UserExtensions', File, "UserExtensions.UserId.'Identifier'")
> > -# else:
> > -# UserExtension.UserID = List[1]
> > -# UserExtension.Identifier = List[2][0:-1].replace("'", '').replace('\"', '')
> > -# for Line in Lines[1:]:
> > -# UserExtension.Content = UserExtension.Content + CleanString(Line) + '\n'
> > -# self.Module.UserExtensions.append(UserExtension)
> > - pass
> > -
> > - ## GenDepexes
> > - #
> > - # Gen Depex of Inf
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenDepexes(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_DEPEX)
> > - Depex = {}
> > - #
> > - # Get all Depexes
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_DEPEX]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - Line = ''
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - Line = Line + Record[0] + ' '
> > - if Line != '':
> > - MergeArches(Depex, Line, Arch)
> > -
> > - for Key in Depex.keys():
> > - Dep = ModuleDepexClass()
> > - Dep.Depex = Key
> > - Dep.SupArchList = Depex[Key]
> > - self.Module.Depex.append(Dep)
> > -
> > - ## GenBinaries
> > - #
> > - # Gen Binary of Inf
> > - #
> > - # <FileType>|<Filename>|<Target>[|<TokenSpaceGuidCName>.<PcdCName>]
> > - #
> > - # @param ContainerFile: The Dec file full path
> > - #
> > - def GenBinaries(self, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % TAB_BINARIES)
> > - Binaries = {}
> > -
> > - #
> > - # Get all Guids
> > - #
> > - RecordSet = self.RecordSet[MODEL_EFI_BINARY_FILE]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (FileType, Filename, Target, Pcd) = GetBinary(Record[0],
> > ContainerFile, self.Identification.FileRelativePath, Record[2])
> > - MergeArches(Binaries, (FileType, Filename, Target, Pcd), Arch)
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s',
> > Value3 = '%s', Value4 = '%s'
> > - where ID = %s""" % (self.TblInf.Table,
> > ConvertToSqlString2(FileType), ConvertToSqlString2(Filename),
> > ConvertToSqlString2(Target), ConvertToSqlString2(Pcd), Record[3])
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - for Key in Binaries.keys():
> > - Binary = ModuleBinaryFileClass(NormPath(Key[1]), Key[0], Key[2],
> > Key[3], Binaries[Key])
> > - self.Module.Binaries.append(Binary)
> > -
> > - ## GenGuids
> > - #
> > - # Gen Guids of Inf
> > - # <CName>=<GuidValue>
> > - #
> > - # @param ContainerFile: The Inf file full path
> > - #
> > - def GenGuidProtocolPpis(self, Type, ContainerFile):
> > - EdkLogger.debug(2, "Generate %s ..." % Type)
> > - Lists = {}
> > - #
> > - # Get all Items
> > - #
> > - RecordSet = self.RecordSet[Section[Type.upper()]]
> > -
> > - #
> > - # Go through each arch
> > - #
> > - for Arch in self.SupArchList:
> > - for Record in RecordSet:
> > - if Record[1] == Arch or Record[1] == TAB_ARCH_COMMON:
> > - (Name, Value) = GetGuidsProtocolsPpisOfInf(Record[0], Type,
> > ContainerFile, Record[2])
> > - MergeArches(Lists, (Name, Value), Arch)
> > - if self.IsToDatabase:
> > - SqlCommand = """update %s set Value1 = '%s', Value2 = '%s'
> > - where ID = %s""" % (self.TblInf.Table,
> > ConvertToSqlString2(Name), ConvertToSqlString2(Value), Record[3])
> > - self.TblInf.Exec(SqlCommand)
> > -
> > - ListMember = None
> > - if Type == TAB_GUIDS:
> > - ListMember = self.Module.Guids
> > - elif Type == TAB_PROTOCOLS:
> > - ListMember = self.Module.Protocols
> > - elif Type == TAB_PPIS:
> > - ListMember = self.Module.Ppis
> > -
> > - for Key in Lists.keys():
> > - ListClass = GuidProtocolPpiCommonClass()
> > - ListClass.CName = Key[0]
> > - ListClass.SupArchList = Lists[Key]
> > - ListClass.FeatureFlag = Key[1]
> > - ListMember.append(ListClass)
> > -
> > -##
> > -#
> > -# This acts like the main() function for the script, unless it is 'import'ed into
> > another
> > -# script.
> > -#
> > -if __name__ == '__main__':
> > - EdkLogger.Initialize()
> > - EdkLogger.SetLevel(EdkLogger.DEBUG_0)
> > -
> > - W = os.getenv('WORKSPACE')
> > - F = os.path.join(W,
> > 'MdeModulePkg/Application/HelloWorld/HelloWorld.inf')
> > -
> > - Db = Database.Database('Inf.db')
> > - Db.InitDatabase()
> > -
> > - P = Inf(os.path.normpath(F), True, True, W, Db)
> > - P.ShowModule()
> > -
> > - Db.Close()
> > --
> > 2.16.2.windows.1
Thread overview: 9+ messages
2018-04-04 15:01 [PATCH v1 0/2] BaseTools: remove files not needed Jaben Carsey
2018-04-04 15:01 ` [PATCH v1 1/2] BaseTools: copy a dictionary from InfClassObject to BuildReport Jaben Carsey
2018-04-10 6:32 ` Zhu, Yonghong
2018-04-04 15:01 ` [PATCH v1 2/2] BaseTools: Remove unneeded files Jaben Carsey
2018-04-08 8:16 ` Zhu, Yonghong
2018-04-09 21:11 ` Carsey, Jaben
2018-04-10 0:35 ` Zhu, Yonghong
2018-04-10 6:39 ` Zhu, Yonghong
2018-04-10 14:26 ` Carsey, Jaben [this message]