* [PATCH v2 00/20] BaseTools: One step toward python3
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
v2 changes:
- Rebase to the current git HEAD (821807bcefb9a36e598d71a8004fae5aab2052a0)
- Apply "futurize -f libfuturize.fixes.fix_absolute_import" and
refactor some python scripts to break the circular imports.
This patch series is also available in
https://github.com/lcp/edk2/tree/python3-futurize-v2
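For reference, the fix_absolute_import conversion mentioned above follows
the usual python-future pattern inside a package module; a minimal sketch
(the module name below is only illustrative, not lifted from a specific
BaseTools file):

  from __future__ import absolute_import

  # python2-only implicit relative import:
  #   import FdfParser
  # explicit relative import, valid on both python2 and python3:
  from . import FdfParser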
Since python2 will reach EOL in 2020, we started to evaluate the impact of
removing python2. As expected, the OVMF build failed the test. It's
actually a task noted in the wiki page:
https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support
Maybe it's time to convert the python scripts gradually.
This patchset doesn't make the python scripts in BaseTools compatible
with python3 immediately. It aims to apply the trivial and safe conversions
and replacements that make some statements compatible with both python2 and
python3, so that we can deal with the difficult cases later.
With the help of "futurize" from python-future, it's easier to refactor
the statements. This patchset is basically equivalent to "futurize -1"
plus "StringIO.StringIO => io.BytesIO".
For the "io.BytesIO" change, it MIGHT introduce slow down to the build
time since io.BytesIO is slower than StringIO.StringIO in python2(*).
For a quick test, I built OVMF with the following command based on
8ab0bd2397c9d3922e0c7dbb1aa6f7e08799079f:
$ rm -rf Build && make -C BaseTools/ clean
$ time ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
-D NETWORK_IP6_ENABLE \
-D HTTP_BOOT_ENABLE \
-D TLS_ENABLE
Before io.BytesIO:
Build total time: 00:03:56
real 4m22.991s
user 3m55.874s
sys 0m27.250s
After io.BytesIO:
Build total time: 00:03:57
real 4m23.953s
user 3m57.526s
sys 0m27.192s
The difference is only 1 second, so I would say the impact is negligible.
The next step will be fixing the relative imports and maybe applying more
futurize fixes. We won't get there soon, but at least we are moving...
(*) https://stackoverflow.com/questions/37462075/confusing-about-stringio-cstringio-and-byteio
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
Gary Lin (20):
BaseTools: Refactor python except statements
BaseTools: Refactor python print statements
BaseTools: Remove the old python "not-equal"
BaseTools: Use the python3-range functions
BaseTools: Remove tuple parameter in python scripts
BaseTools: Remove the deprecated has_key()
BaseTools: Import reduce() from functools
BaseTools: Replace StandardError with Exception
BaseTools: Remove types.TypeType
BaseTools: Refactor python raise statement
BaseTools: Adjust the spaces around commas and colons
BaseTools: Migrate to the new octal literal
BaseTools: Unify long int and int in python scripts
BaseTools: Adjust old python2 idioms
BaseTools: Replace StringIO.StringIO with io.BytesIO
BaseTools: Treat GenFds.py and build.py as python modules
BaseTools: Adopt absolute import for python scripts
BaseTools: Move OverrideAttribs to OptRomInfStatement.py
BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py
BaseTools: Move ImageBinDict to GenFdsGlobalVariable.py
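For illustration, a small python2/python3-compatible snippet combining
several of the idioms the patches above convert to (the names and values
are made up, not taken from the series):

  from __future__ import print_function    # print() instead of the statement

  FvSizes = {'FVMAIN': 0o10000}             # new octal literal (was 010000)
  if 'FVMAIN' in FvSizes:                   # "in" instead of has_key()
      if FvSizes['FVMAIN'] != 0:            # "!=" instead of the old "<>"
          print("FVMAIN size: 0x%x" % FvSizes['FVMAIN'])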
BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py | 5 +-
BaseTools/BinWrappers/PosixLike/GenFds | 2 +-
BaseTools/BinWrappers/PosixLike/build | 2 +-
BaseTools/Scripts/BinToPcd.py | 46 +++--
BaseTools/Scripts/ConvertMasmToNasm.py | 1 +
BaseTools/Scripts/ConvertUni.py | 5 -
BaseTools/Scripts/MemoryProfileSymbolGen.py | 22 +-
BaseTools/Scripts/PatchCheck.py | 7 +-
BaseTools/Scripts/RunMakefile.py | 2 +-
BaseTools/Scripts/SmiHandlerProfileSymbolGen.py | 20 +-
BaseTools/Scripts/UpdateBuildVersions.py | 18 +-
BaseTools/Source/Python/AutoGen/AutoGen.py | 98 ++++-----
BaseTools/Source/Python/AutoGen/BuildEngine.py | 38 ++--
BaseTools/Source/Python/AutoGen/GenC.py | 12 +-
BaseTools/Source/Python/AutoGen/GenDepex.py | 8 +-
BaseTools/Source/Python/AutoGen/GenMake.py | 11 +-
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 149 +++++++-------
BaseTools/Source/Python/AutoGen/GenVar.py | 166 +++++++--------
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 1 -
BaseTools/Source/Python/AutoGen/InfSectionParser.py | 1 +
BaseTools/Source/Python/AutoGen/StrGather.py | 8 +-
BaseTools/Source/Python/AutoGen/UniClassObject.py | 18 +-
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 10 +-
BaseTools/Source/Python/BPDG/BPDG.py | 8 +-
BaseTools/Source/Python/BPDG/GenVpd.py | 28 +--
BaseTools/Source/Python/Common/DataType.py | 4 +-
BaseTools/Source/Python/Common/Database.py | 8 +-
BaseTools/Source/Python/Common/DecClassObject.py | 56 ++---
BaseTools/Source/Python/Common/Dictionary.py | 14 +-
BaseTools/Source/Python/Common/DscClassObject.py | 91 +++++----
BaseTools/Source/Python/Common/EdkIIWorkspace.py | 28 +--
BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py | 152 +++++++-------
BaseTools/Source/Python/Common/EdkLogger.py | 3 +-
BaseTools/Source/Python/Common/Expression.py | 86 ++++----
BaseTools/Source/Python/Common/FdfClassObject.py | 6 +-
BaseTools/Source/Python/Common/FdfParserLite.py | 47 ++---
BaseTools/Source/Python/Common/InfClassObject.py | 134 ++++++------
BaseTools/Source/Python/Common/LongFilePathOs.py | 5 +-
BaseTools/Source/Python/Common/MigrationUtilities.py | 4 +-
BaseTools/Source/Python/Common/Misc.py | 79 ++++----
BaseTools/Source/Python/Common/Parsing.py | 6 +-
BaseTools/Source/Python/Common/RangeExpression.py | 32 +--
BaseTools/Source/Python/Common/String.py | 16 +-
BaseTools/Source/Python/Common/TargetTxtClassObject.py | 24 ++-
BaseTools/Source/Python/Common/ToolDefClassObject.py | 12 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 23 ++-
BaseTools/Source/Python/CommonDataClass/ModuleClass.py | 3 +-
BaseTools/Source/Python/CommonDataClass/PackageClass.py | 3 +-
BaseTools/Source/Python/CommonDataClass/PlatformClass.py | 3 +-
BaseTools/Source/Python/Ecc/CParser.py | 178 ++++++++--------
BaseTools/Source/Python/Ecc/Check.py | 10 +-
BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 82 ++++----
BaseTools/Source/Python/Ecc/Configuration.py | 5 +-
BaseTools/Source/Python/Ecc/Database.py | 7 +-
BaseTools/Source/Python/Ecc/Ecc.py | 25 +--
BaseTools/Source/Python/Ecc/Exception.py | 6 +-
BaseTools/Source/Python/Ecc/FileProfile.py | 5 +-
BaseTools/Source/Python/Ecc/MetaDataParser.py | 8 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 5 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 44 ++--
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py | 5 +-
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 9 +-
BaseTools/Source/Python/Ecc/c.py | 28 +--
BaseTools/Source/Python/Eot/CParser.py | 178 ++++++++--------
BaseTools/Source/Python/Eot/CodeFragmentCollector.py | 72 +++----
BaseTools/Source/Python/Eot/Eot.py | 15 +-
BaseTools/Source/Python/Eot/FileProfile.py | 3 +-
BaseTools/Source/Python/Eot/FvImage.py | 28 +--
BaseTools/Source/Python/Eot/InfParserLite.py | 13 +-
BaseTools/Source/Python/Eot/Parser.py | 5 +-
BaseTools/Source/Python/Eot/Report.py | 3 +-
BaseTools/Source/Python/Eot/c.py | 32 +--
BaseTools/Source/Python/GenFds/AprioriSection.py | 12 +-
BaseTools/Source/Python/GenFds/Capsule.py | 22 +-
BaseTools/Source/Python/GenFds/CapsuleData.py | 11 +-
BaseTools/Source/Python/GenFds/CompressSection.py | 7 +-
BaseTools/Source/Python/GenFds/DataSection.py | 7 +-
BaseTools/Source/Python/GenFds/DepexSection.py | 7 +-
BaseTools/Source/Python/GenFds/EfiSection.py | 13 +-
BaseTools/Source/Python/GenFds/Fd.py | 32 +--
BaseTools/Source/Python/GenFds/FdfParser.py | 100 ++++-----
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 16 +-
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 35 ++--
BaseTools/Source/Python/GenFds/Fv.py | 34 ++--
BaseTools/Source/Python/GenFds/FvImageSection.py | 15 +-
BaseTools/Source/Python/GenFds/GenFds.py | 126 ++----------
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 108 +++++++++-
BaseTools/Source/Python/GenFds/GuidSection.py | 11 +-
BaseTools/Source/Python/GenFds/OptRomFileStatement.py | 3 +-
BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 30 ++-
BaseTools/Source/Python/GenFds/OptionRom.py | 23 +--
BaseTools/Source/Python/GenFds/Region.py | 17 +-
BaseTools/Source/Python/GenFds/RuleComplexFile.py | 3 +-
BaseTools/Source/Python/GenFds/RuleSimpleFile.py | 3 +-
BaseTools/Source/Python/GenFds/Section.py | 3 +-
BaseTools/Source/Python/GenFds/UiSection.py | 7 +-
BaseTools/Source/Python/GenFds/VerSection.py | 7 +-
BaseTools/Source/Python/GenFds/Vtf.py | 3 +-
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 9 +-
BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py | 1 +
BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 32 +--
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 30 +--
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 36 ++--
BaseTools/Source/Python/Table/TableDataModel.py | 3 +-
BaseTools/Source/Python/Table/TableDec.py | 3 +-
BaseTools/Source/Python/Table/TableDsc.py | 3 +-
BaseTools/Source/Python/Table/TableEotReport.py | 5 +-
BaseTools/Source/Python/Table/TableFdf.py | 3 +-
BaseTools/Source/Python/Table/TableFile.py | 3 +-
BaseTools/Source/Python/Table/TableFunction.py | 3 +-
BaseTools/Source/Python/Table/TableIdentifier.py | 5 +-
BaseTools/Source/Python/Table/TableInf.py | 3 +-
BaseTools/Source/Python/Table/TablePcd.py | 5 +-
BaseTools/Source/Python/Table/TableQuery.py | 3 +-
BaseTools/Source/Python/Table/TableReport.py | 3 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 39 ++--
BaseTools/Source/Python/Trim/Trim.py | 25 +--
BaseTools/Source/Python/UPT/Core/DependencyRules.py | 12 +-
BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py | 4 +-
BaseTools/Source/Python/UPT/Core/FileHook.py | 2 +-
BaseTools/Source/Python/UPT/Core/IpiDb.py | 6 +-
BaseTools/Source/Python/UPT/Core/PackageFile.py | 12 +-
BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py | 15 +-
BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py | 42 ++--
BaseTools/Source/Python/UPT/InstallPkg.py | 2 +-
BaseTools/Source/Python/UPT/InventoryWs.py | 2 +-
BaseTools/Source/Python/UPT/Library/CommentParsing.py | 5 +-
BaseTools/Source/Python/UPT/Library/ExpressionValidate.py | 11 +-
BaseTools/Source/Python/UPT/Library/Misc.py | 11 +-
BaseTools/Source/Python/UPT/Library/ParserValidate.py | 2 +-
BaseTools/Source/Python/UPT/Library/Parsing.py | 6 +-
BaseTools/Source/Python/UPT/Library/String.py | 5 +-
BaseTools/Source/Python/UPT/Library/UniClassObject.py | 20 +-
BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py | 4 +-
BaseTools/Source/Python/UPT/MkPkg.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py | 6 +-
BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py | 4 +-
BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py | 4 +-
BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py | 4 +-
BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py | 4 +-
BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py | 4 +-
BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py | 3 +-
BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py | 4 +-
BaseTools/Source/Python/UPT/Parser/DecParserMisc.py | 1 +
BaseTools/Source/Python/UPT/Parser/InfSectionParser.py | 3 +-
BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 57 +++---
BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py | 3 +-
BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py | 3 +-
BaseTools/Source/Python/UPT/ReplacePkg.py | 2 +-
BaseTools/Source/Python/UPT/RmPkg.py | 2 +-
BaseTools/Source/Python/UPT/TestInstall.py | 4 +-
BaseTools/Source/Python/UPT/UPT.py | 9 +-
BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py | 5 +-
BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py | 10 +-
BaseTools/Source/Python/UPT/Xml/CommonXml.py | 2 +-
BaseTools/Source/Python/UPT/Xml/IniToXml.py | 1 +
BaseTools/Source/Python/UPT/Xml/XmlParser.py | 25 +--
BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py | 3 +-
BaseTools/Source/Python/Workspace/BuildClassObject.py | 2 +-
BaseTools/Source/Python/Workspace/DecBuildData.py | 14 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 213 ++++++++++----------
BaseTools/Source/Python/Workspace/InfBuildData.py | 6 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 75 +++----
BaseTools/Source/Python/Workspace/MetaFileTable.py | 15 +-
BaseTools/Source/Python/Workspace/WorkspaceCommon.py | 5 +-
BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 7 +-
BaseTools/Source/Python/build/BuildReport.py | 19 +-
BaseTools/Source/Python/build/build.py | 38 ++--
BaseTools/Tests/CheckPythonSyntax.py | 2 +-
BaseTools/Tests/TestTools.py | 13 +-
BaseTools/Tests/TianoCompress.py | 6 +-
BaseTools/gcc/mingw-gcc-build.py | 112 +++++-----
175 files changed, 2092 insertions(+), 1927 deletions(-)
--
2.16.1
* [PATCH v2 01/20] BaseTools: Refactor python except statements
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Convert "except ... ," to "except ... as" to be compatible with python3.
Based on "futurize -f lib2to3.fixes.fix_except"
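The "as" form is accepted by python 2.6 and later as well as python3, so
the conversion is safe; a generic illustration (the path is hypothetical,
not a hunk from this patch):

  try:
      Fd = open('Conf/target.txt', 'r')    # hypothetical file
      Fd.close()
  # python2-only spelling, a SyntaxError on python3:
  #     except IOError, X:
  except IOError as X:                     # valid on python2.6+ and python3
      print(str(X))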
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Scripts/UpdateBuildVersions.py | 12 +-
BaseTools/Source/Python/AutoGen/AutoGen.py | 2 +-
BaseTools/Source/Python/AutoGen/GenDepex.py | 2 +-
BaseTools/Source/Python/AutoGen/GenMake.py | 2 +-
BaseTools/Source/Python/AutoGen/UniClassObject.py | 4 +-
BaseTools/Source/Python/Common/Expression.py | 16 +--
BaseTools/Source/Python/Common/FdfParserLite.py | 6 +-
BaseTools/Source/Python/Common/Misc.py | 8 +-
BaseTools/Source/Python/Common/RangeExpression.py | 6 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 2 +-
BaseTools/Source/Python/Ecc/CParser.py | 142 ++++++++++----------
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 2 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 14 +-
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 2 +-
BaseTools/Source/Python/Ecc/c.py | 2 +-
BaseTools/Source/Python/Eot/CParser.py | 142 ++++++++++----------
BaseTools/Source/Python/Eot/FvImage.py | 2 +-
BaseTools/Source/Python/GenFds/FdfParser.py | 10 +-
BaseTools/Source/Python/GenFds/GenFds.py | 4 +-
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 2 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 2 +-
BaseTools/Source/Python/Trim/Trim.py | 4 +-
BaseTools/Source/Python/UPT/Core/DependencyRules.py | 4 +-
BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py | 4 +-
BaseTools/Source/Python/UPT/Core/IpiDb.py | 2 +-
BaseTools/Source/Python/UPT/Core/PackageFile.py | 12 +-
BaseTools/Source/Python/UPT/InstallPkg.py | 2 +-
BaseTools/Source/Python/UPT/InventoryWs.py | 2 +-
BaseTools/Source/Python/UPT/Library/CommentParsing.py | 2 +-
BaseTools/Source/Python/UPT/Library/ExpressionValidate.py | 8 +-
BaseTools/Source/Python/UPT/Library/UniClassObject.py | 8 +-
BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py | 2 +-
BaseTools/Source/Python/UPT/MkPkg.py | 2 +-
BaseTools/Source/Python/UPT/ReplacePkg.py | 2 +-
BaseTools/Source/Python/UPT/RmPkg.py | 2 +-
BaseTools/Source/Python/UPT/TestInstall.py | 4 +-
BaseTools/Source/Python/UPT/UPT.py | 4 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 8 +-
BaseTools/Source/Python/Workspace/InfBuildData.py | 2 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 12 +-
BaseTools/Source/Python/Workspace/MetaFileTable.py | 4 +-
BaseTools/Source/Python/build/BuildReport.py | 4 +-
BaseTools/Source/Python/build/build.py | 10 +-
BaseTools/Tests/CheckPythonSyntax.py | 2 +-
BaseTools/gcc/mingw-gcc-build.py | 2 +-
45 files changed, 248 insertions(+), 244 deletions(-)
diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index e62030aa9f0f..cff2e2263a8a 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -90,7 +90,8 @@ def ShellCommandResults(CmdLine, Opt):
sys.stderr.flush()
returnValue = err_val.returncode
- except IOError as (errno, strerror):
+ except IOError as err_arg:
+ (errno, strerror) = err_arg.args
file_list.close()
if not Opt.silent:
sys.stderr.write("I/O ERROR : %s : %s\n" % (str(errno), strerror))
@@ -100,7 +101,8 @@ def ShellCommandResults(CmdLine, Opt):
sys.stderr.flush()
returnValue = errno
- except OSError as (errno, strerror):
+ except OSError as err_arg:
+ (errno, strerror) = err_arg.args
file_list.close()
if not Opt.silent:
sys.stderr.write("OS ERROR : %s : %s\n" % (str(errno), strerror))
@@ -210,13 +212,15 @@ def RevertCmd(Filename, Opt):
sys.stderr.write("Subprocess ERROR : %s\n" % err_val)
sys.stderr.flush()
- except IOError as (errno, strerror):
+ except IOError as err_arg:
+ (errno, strerror) = err_arg.args
if not Opt.silent:
sys.stderr.write("I/O ERROR : %d : %s\n" % (str(errno), strerror))
sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
sys.stderr.flush()
- except OSError as (errno, strerror):
+ except OSError as err_arg:
+ (errno, strerror) = err_arg.args
if not Opt.silent:
sys.stderr.write("OS ERROR : %d : %s\n" % (str(errno), strerror))
sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 405bfa145a22..5f8694b66f35 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -2401,7 +2401,7 @@ class PlatformAutoGen(AutoGen):
if ToPcd.DefaultValue:
try:
ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self._GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
File=self.MetaFile)
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 7aa22bd944a0..98a43db7a4e5 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -448,7 +448,7 @@ def Main():
os.utime(Option.OutputFile, None)
else:
Dpx.Generate()
- except BaseException, X:
+ except BaseException as X:
EdkLogger.quiet("")
if Option != None and Option.debug != None:
EdkLogger.quiet(traceback.format_exc())
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 7d3374a49373..3f98a34d81ec 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1027,7 +1027,7 @@ cleanlib:
else:
try:
Fd = open(F.Path, 'r')
- except BaseException, X:
+ except BaseException as X:
EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
FileContent = Fd.read()
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 856d19cda270..2711fc104f52 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -242,7 +242,7 @@ class UniFileClassObject(object):
if len(Lang) != 3:
try:
FileIn = self.OpenUniFile(LongFilePath(File.Path))
- except UnicodeError, X:
+ except UnicodeError as X:
EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File);
except:
EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File);
@@ -393,7 +393,7 @@ class UniFileClassObject(object):
try:
FileIn = self.OpenUniFile(LongFilePath(File.Path))
- except UnicodeError, X:
+ except UnicodeError as X:
EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File.Path);
except:
EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File.Path);
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index b8c48460ff6d..e40677558a68 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -231,7 +231,7 @@ class ValueExpression(object):
}
try:
Val = eval(EvalStr, {}, Dict)
- except Exception, Excpt:
+ except Exception as Excpt:
raise BadExpression(str(Excpt))
if Operator in ['and', 'or']:
@@ -350,7 +350,7 @@ class ValueExpression(object):
continue
try:
Val = self.Eval(Op, Val, EvalFunc())
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
Val = Warn.result
return Val
@@ -389,7 +389,7 @@ class ValueExpression(object):
Op += ' ' + self._Token
try:
Val = self.Eval(Op, Val, self._RelExpr())
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
Val = Warn.result
return Val
@@ -415,14 +415,14 @@ class ValueExpression(object):
Val = self._UnaryExpr()
try:
return self.Eval('not', Val)
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
return Warn.result
if self._IsOperator(["~"]):
Val = self._UnaryExpr()
try:
return self.Eval('~', Val)
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
return Warn.result
return self._IdenExpr()
@@ -744,7 +744,7 @@ class ValueExpressionEx(ValueExpression):
elif self.PcdType in ['UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN'] and (PcdValue.startswith("'") or \
PcdValue.startswith('"') or PcdValue.startswith("L'") or PcdValue.startswith('L"') or PcdValue.startswith('{')):
raise BadExpression
- except WrnExpression, Value:
+ except WrnExpression as Value:
PcdValue = Value.result
except BadExpression:
if self.PcdType in ['UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN']:
@@ -904,8 +904,8 @@ if __name__ == '__main__':
try:
print ValueExpression(input)(True)
print ValueExpression(input)(False)
- except WrnExpression, Ex:
+ except WrnExpression as Ex:
print Ex.result
print str(Ex)
- except Exception, Ex:
+ except Exception as Ex:
print str(Ex)
diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/Source/Python/Common/FdfParserLite.py
index 7d129bfcab59..ac03c3fef5bb 100644
--- a/BaseTools/Source/Python/Common/FdfParserLite.py
+++ b/BaseTools/Source/Python/Common/FdfParserLite.py
@@ -1190,7 +1190,7 @@ class FdfParser(object):
# pass
- except Warning, X:
+ except Warning as X:
self.__UndoToken()
FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
X.message += '\nGot Token: \"%s\" from File %s\n' % (self.__Token, FileLineTuple[0]) + \
@@ -3659,7 +3659,7 @@ if __name__ == "__main__":
import sys
try:
test_file = sys.argv[1]
- except IndexError, v:
+ except IndexError as v:
print "Usage: %s filename" % sys.argv[0]
sys.exit(1)
@@ -3667,7 +3667,7 @@ if __name__ == "__main__":
try:
parser.ParseFile()
parser.CycleReferenceCheck()
- except Warning, X:
+ except Warning as X:
print X.message
else:
print "Success!"
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index b34cb4c3be60..eed86ec98e14 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -522,7 +522,7 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
Fd = open(File, "wb")
Fd.write(Content)
Fd.close()
- except IOError, X:
+ except IOError as X:
EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
return True
@@ -556,7 +556,7 @@ def DataRestore(File):
try:
Fd = open(File, 'rb')
Data = cPickle.load(Fd)
- except Exception, e:
+ except Exception as e:
EdkLogger.verbose("Failed to load [%s]\n\t%s" % (File, str(e)))
Data = None
finally:
@@ -1494,7 +1494,7 @@ def ParseDevPathValue (Value):
try:
p = subprocess.Popen(Cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
out, err = p.communicate()
- except Exception, X:
+ except Exception as X:
raise BadExpression("DevicePath: %s" % (str(X)) )
finally:
subprocess._cleanup()
@@ -1549,7 +1549,7 @@ def ParseFieldValue (Value):
Value = Value[1:-1]
try:
Value = "'" + uuid.UUID(Value).get_bytes_le() + "'"
- except ValueError, Message:
+ except ValueError as Message:
raise BadExpression('%s' % Message)
Value, Size = ParseFieldValue(Value)
return Value, 16
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index b6c929fd885b..10b6ac55242b 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -441,7 +441,7 @@ class RangeExpression(object):
Op = self._Token
try:
Val = self.Eval(Op, Val, EvalFunc())
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
Val = Warn.result
return Val
@@ -464,7 +464,7 @@ class RangeExpression(object):
Op += ' ' + self._Token
try:
Val = self.Eval(Op, Val, self._RelExpr())
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
Val = Warn.result
return Val
@@ -476,7 +476,7 @@ class RangeExpression(object):
Val = self._NeExpr()
try:
return self.Eval(Token, Val)
- except WrnExpression, Warn:
+ except WrnExpression as Warn:
self._WarnExcept = Warn
return Warn.result
return self._IdenExpr()
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 716155e96d29..14ccabe833db 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -246,7 +246,7 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
stdout=subprocess.PIPE,
stderr= subprocess.PIPE,
shell=True)
- except Exception, X:
+ except Exception as X:
EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData="%s" % (str(X)))
(out, error) = PopenObject.communicate()
print out
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index 41f2811430a0..18a7ff055740 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -173,7 +173,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -532,7 +532,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -809,7 +809,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -964,7 +964,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1092,7 +1092,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1162,7 +1162,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1216,7 +1216,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1263,7 +1263,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1432,7 +1432,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1465,7 +1465,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1589,7 +1589,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1636,7 +1636,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1699,7 +1699,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1742,7 +1742,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1861,7 +1861,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1921,7 +1921,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2003,7 +2003,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2158,7 +2158,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2223,7 +2223,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2275,7 +2275,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2322,7 +2322,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2464,7 +2464,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3056,7 +3056,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3206,7 +3206,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3462,7 +3462,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3528,7 +3528,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3617,7 +3617,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3825,7 +3825,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3881,7 +3881,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3971,7 +3971,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4219,7 +4219,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4570,7 +4570,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4690,7 +4690,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4770,7 +4770,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4835,7 +4835,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4933,7 +4933,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5012,7 +5012,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5103,7 +5103,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5203,7 +5203,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5355,7 +5355,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5583,7 +5583,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5644,7 +5644,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5691,7 +5691,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5789,7 +5789,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5995,7 +5995,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -6065,7 +6065,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -6100,7 +6100,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8135,7 +8135,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8170,7 +8170,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8217,7 +8217,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8285,7 +8285,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8355,7 +8355,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8415,7 +8415,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8475,7 +8475,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8535,7 +8535,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8595,7 +8595,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8669,7 +8669,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8743,7 +8743,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8817,7 +8817,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9058,7 +9058,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9155,7 +9155,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9228,7 +9228,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9301,7 +9301,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -12467,7 +12467,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -12560,7 +12560,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -14530,7 +14530,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16251,7 +16251,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16322,7 +16322,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16435,7 +16435,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16586,7 +16586,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16703,7 +16703,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index a27e98c9752f..a4057ceb1775 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -98,7 +98,7 @@ class Table(object):
SqlCommand = """drop table IF EXISTS %s""" % self.Table
try:
self.Cur.execute(SqlCommand)
- except Exception, e:
+ except Exception as e:
print "An error occurred when Drop a table:", e.args[0]
## Get count
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index ba478f9ecf10..2fef87c4180a 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -1183,7 +1183,7 @@ class DscParser(MetaFileParser):
try:
Processer[self._ItemType]()
- except EvaluationException, Excpt:
+ except EvaluationException as Excpt:
#
# Only catch expression evaluation error here. We need to report
# the precise number of line on which the error occurred
@@ -1192,7 +1192,7 @@ class DscParser(MetaFileParser):
# EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
# File=self._FileWithError, ExtraData=' '.join(self._ValueList),
# Line=self._LineIndex+1)
- except MacroException, Excpt:
+ except MacroException as Excpt:
EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
File=self._FileWithError, ExtraData=' '.join(self._ValueList),
Line=self._LineIndex+1)
@@ -1305,10 +1305,10 @@ class DscParser(MetaFileParser):
Macros.update(GlobalData.gGlobalDefines)
try:
Result = ValueExpression(self._ValueList[1], Macros)()
- except SymbolNotFound, Exc:
+ except SymbolNotFound as Exc:
EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
Result = False
- except WrnExpression, Excpt:
+ except WrnExpression as Excpt:
#
# Catch expression evaluation warning here. We need to report
# the precise number of line and return the evaluation result
@@ -1317,7 +1317,7 @@ class DscParser(MetaFileParser):
File=self._FileWithError, ExtraData=' '.join(self._ValueList),
Line=self._LineIndex+1)
Result = Excpt.result
- except BadExpression, Exc:
+ except BadExpression as Exc:
EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
Result = False
@@ -1437,13 +1437,13 @@ class DscParser(MetaFileParser):
PcdValue = ValueList[0]
try:
ValueList[0] = ValueExpression(PcdValue, self._Macros)(True)
- except WrnExpression, Value:
+ except WrnExpression as Value:
ValueList[0] = Value.result
else:
PcdValue = ValueList[-1]
try:
ValueList[-1] = ValueExpression(PcdValue, self._Macros)(True)
- except WrnExpression, Value:
+ except WrnExpression as Value:
ValueList[-1] = Value.result
if ValueList[-1] == 'True':
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index b93588eea61a..4ce8edf5573a 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -214,7 +214,7 @@ def XmlParseFile(FileName):
Dom = xml.dom.minidom.parse(XmlFile)
XmlFile.close()
return Dom
- except Exception, X:
+ except Exception as X:
print X
return ""
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 35b7405e550d..8a4b10727a07 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -2627,7 +2627,7 @@ if __name__ == '__main__':
# CollectSourceCodeDataIntoDB(sys.argv[1])
try:
test_file = sys.argv[1]
- except IndexError, v:
+ except IndexError as v:
print "Usage: %s filename" % sys.argv[0]
sys.exit(1)
MsgList = CheckFuncHeaderDoxygenComments(test_file)
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index 41f2811430a0..18a7ff055740 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -173,7 +173,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -532,7 +532,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -809,7 +809,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -964,7 +964,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1092,7 +1092,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1162,7 +1162,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1216,7 +1216,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1263,7 +1263,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1432,7 +1432,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1465,7 +1465,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1589,7 +1589,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1636,7 +1636,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1699,7 +1699,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1742,7 +1742,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1861,7 +1861,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -1921,7 +1921,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2003,7 +2003,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2158,7 +2158,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2223,7 +2223,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2275,7 +2275,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2322,7 +2322,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -2464,7 +2464,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3056,7 +3056,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3206,7 +3206,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3462,7 +3462,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3528,7 +3528,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3617,7 +3617,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3825,7 +3825,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3881,7 +3881,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -3971,7 +3971,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4219,7 +4219,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4570,7 +4570,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4690,7 +4690,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4770,7 +4770,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4835,7 +4835,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -4933,7 +4933,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5012,7 +5012,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5103,7 +5103,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5203,7 +5203,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5355,7 +5355,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5583,7 +5583,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5644,7 +5644,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5691,7 +5691,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5789,7 +5789,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -5995,7 +5995,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -6065,7 +6065,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -6100,7 +6100,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8135,7 +8135,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8170,7 +8170,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8217,7 +8217,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8285,7 +8285,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8355,7 +8355,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8415,7 +8415,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8475,7 +8475,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8535,7 +8535,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8595,7 +8595,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8669,7 +8669,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8743,7 +8743,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -8817,7 +8817,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9058,7 +9058,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9155,7 +9155,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9228,7 +9228,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -9301,7 +9301,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -12467,7 +12467,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -12560,7 +12560,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -14530,7 +14530,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16251,7 +16251,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16322,7 +16322,7 @@ class CParser(Parser):
retval.stop = self.input.LT(-1)
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16435,7 +16435,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16586,7 +16586,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
@@ -16703,7 +16703,7 @@ class CParser(Parser):
- except RecognitionException, re:
+ except RecognitionException as re:
self.reportError(re)
self.recover(self.input, re)
finally:
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 0f742c7d86c2..6696623aba68 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -1411,7 +1411,7 @@ def Main():
try:
Option = GetOptions()
build.main()
- except Exception, e:
+ except Exception as e:
print e
return 1
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 0190be884a33..15b2b792b2e1 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -938,7 +938,7 @@ class FdfParser:
return ValueExpression(Expression, MacroPcdDict)(True)
else:
return ValueExpression(Expression, MacroPcdDict)()
- except WrnExpression, Excpt:
+ except WrnExpression as Excpt:
#
# Catch expression evaluation warning here. We need to report
# the precise number of line and return the evaluation result
@@ -947,7 +947,7 @@ class FdfParser:
File=self.FileName, ExtraData=self.__CurrentLine(),
Line=Line)
return Excpt.result
- except Exception, Excpt:
+ except Exception as Excpt:
if hasattr(Excpt, 'Pcd'):
if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
Info = GlobalData.gPlatformOtherPcds[Excpt.Pcd]
@@ -1414,7 +1414,7 @@ class FdfParser:
while self.__GetFd() or self.__GetFv() or self.__GetFmp() or self.__GetCapsule() or self.__GetVtf() or self.__GetRule() or self.__GetOptionRom():
pass
- except Warning, X:
+ except Warning as X:
self.__UndoToken()
#'\n\tGot Token: \"%s\" from File %s\n' % (self.__Token, FileLineTuple[0]) + \
# At this point, the closest parent would be the included file itself
@@ -4817,7 +4817,7 @@ if __name__ == "__main__":
import sys
try:
test_file = sys.argv[1]
- except IndexError, v:
+ except IndexError as v:
print "Usage: %s filename" % sys.argv[0]
sys.exit(1)
@@ -4825,7 +4825,7 @@ if __name__ == "__main__":
try:
parser.ParseFile()
parser.CycleReferenceCheck()
- except Warning, X:
+ except Warning as X:
print str(X)
else:
print "Success!"
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index dcba9f24cb6b..bc2bb407560f 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -335,10 +335,10 @@ def main():
"""Display FV space info."""
GenFds.DisplayFvSpaceInfo(FdfParserObj)
- except FdfParser.Warning, X:
+ except FdfParser.Warning as X:
EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
ReturnCode = FORMAT_INVALID
- except FatalError, X:
+ except FatalError as X:
if Options.debug != None:
import traceback
EdkLogger.quiet(traceback.format_exc())
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 97e20753ae9b..1a5ef92afc1c 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -722,7 +722,7 @@ class GenFdsGlobalVariable:
try:
PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
- except Exception, X:
+ except Exception as X:
EdkLogger.error("GenFds", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
(out, error) = PopenObject.communicate()
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index bfdf763a7abc..882b016bf058 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -254,7 +254,7 @@ if __name__ == '__main__':
FileHandle.RWFile('#', '=', 0)
else:
FileHandle.RWFile('#', '=', 1)
- except Exception, e:
+ except Exception as e:
last_type, last_value, last_tb = sys.exc_info()
traceback.print_exception(last_type, last_value, last_tb)
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index d1e40b025caa..05ba86262133 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -667,7 +667,7 @@ def Main():
EdkLogger.SetLevel(CommandOptions.LogLevel + 1)
else:
EdkLogger.SetLevel(CommandOptions.LogLevel)
- except FatalError, X:
+ except FatalError as X:
return 1
try:
@@ -687,7 +687,7 @@ def Main():
if CommandOptions.OutputFile == None:
CommandOptions.OutputFile = os.path.splitext(InputFile)[0] + '.iii'
TrimPreprocessedFile(InputFile, CommandOptions.OutputFile, CommandOptions.ConvertHex, CommandOptions.TrimLong)
- except FatalError, X:
+ except FatalError as X:
import platform
import traceback
if CommandOptions != None and CommandOptions.LogLevel <= EdkLogger.DEBUG_9:
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 26c5a97da80f..3a7c9809e31a 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -394,7 +394,7 @@ def VerifyRemoveModuleDep(Path, DpPackagePathList):
return False
else:
return True
- except FatalError, ErrCode:
+ except FatalError as ErrCode:
if ErrCode.message == EDK1_INF_ERROR:
Logger.Warn("UPT",
ST.WRN_EDK1_INF_FOUND%Path)
@@ -446,7 +446,7 @@ def VerifyReplaceModuleDep(Path, DpPackagePathList, OtherPkgList):
return False
else:
return True
- except FatalError, ErrCode:
+ except FatalError as ErrCode:
if ErrCode.message == EDK1_INF_ERROR:
Logger.Warn("UPT",
ST.WRN_EDK1_INF_FOUND%Path)
diff --git a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
index 9c55e0ea88a7..81c67fb510a2 100644
--- a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
+++ b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
@@ -155,7 +155,7 @@ class DistributionPackageClass(object):
ModuleObj.GetName(), \
ModuleObj.GetCombinePath())] = ModuleObj
PackageObj.SetModuleDict(ModuleDict)
- except FatalError, ErrCode:
+ except FatalError as ErrCode:
if ErrCode.message == EDK1_INF_ERROR:
Logger.Warn("UPT",
ST.WRN_EDK1_INF_FOUND%Filename)
@@ -181,7 +181,7 @@ class DistributionPackageClass(object):
ModuleObj.GetName(),
ModuleObj.GetCombinePath())
self.ModuleSurfaceArea[ModuleKey] = ModuleObj
- except FatalError, ErrCode:
+ except FatalError as ErrCode:
if ErrCode.message == EDK1_INF_ERROR:
Logger.Error("UPT",
EDK1_INF_ERROR,
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index f147963288ad..baf687ef99ba 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -230,7 +230,7 @@ class IpiDatabase(object):
self._AddDp(DpObj.Header.GetGuid(), DpObj.Header.GetVersion(), \
NewDpPkgFileName, DpPkgFileName, RePackage)
- except sqlite3.IntegrityError, DetailMsg:
+ except sqlite3.IntegrityError as DetailMsg:
Logger.Error("UPT",
UPT_DB_UPDATE_ERROR,
ST.ERR_UPT_DB_UPDATE_ERROR,
diff --git a/BaseTools/Source/Python/UPT/Core/PackageFile.py b/BaseTools/Source/Python/UPT/Core/PackageFile.py
index 5fafd85bffbf..db4725b1a56d 100644
--- a/BaseTools/Source/Python/UPT/Core/PackageFile.py
+++ b/BaseTools/Source/Python/UPT/Core/PackageFile.py
@@ -51,7 +51,7 @@ class PackageFile:
self._Files = {}
for Filename in self._ZipFile.namelist():
self._Files[os.path.normpath(Filename)] = Filename
- except BaseException, Xstr:
+ except BaseException as Xstr:
Logger.Error("PackagingTool", FILE_OPEN_FAILURE,
ExtraData="%s (%s)" % (FileName, str(Xstr)))
@@ -106,7 +106,7 @@ class PackageFile:
ExtraData="[%s] in %s" % (Which, self._FileName))
try:
FileContent = self._ZipFile.read(self._Files[Which])
- except BaseException, Xstr:
+ except BaseException as Xstr:
Logger.Error("PackagingTool", FILE_DECOMPRESS_FAILURE,
ExtraData="[%s] in %s (%s)" % (Which, \
self._FileName, \
@@ -119,14 +119,14 @@ class PackageFile:
return
else:
ToFile = __FileHookOpen__(ToDest, 'wb')
- except BaseException, Xstr:
+ except BaseException as Xstr:
Logger.Error("PackagingTool", FILE_OPEN_FAILURE,
ExtraData="%s (%s)" % (ToDest, str(Xstr)))
try:
ToFile.write(FileContent)
ToFile.close()
- except BaseException, Xstr:
+ except BaseException as Xstr:
Logger.Error("PackagingTool", FILE_WRITE_FAILURE,
ExtraData="%s (%s)" % (ToDest, str(Xstr)))
@@ -228,7 +228,7 @@ class PackageFile:
return
Logger.Info("packing ..." + File)
self._ZipFile.write(File, ArcName)
- except BaseException, Xstr:
+ except BaseException as Xstr:
Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
ExtraData="%s (%s)" % (File, str(Xstr)))
@@ -242,7 +242,7 @@ class PackageFile:
if os.path.splitext(ArcName)[1].lower() == '.pkg':
Data = Data.encode('utf_8')
self._ZipFile.writestr(ArcName, Data)
- except BaseException, Xstr:
+ except BaseException as Xstr:
Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
ExtraData="%s (%s)" % (ArcName, str(Xstr)))
diff --git a/BaseTools/Source/Python/UPT/InstallPkg.py b/BaseTools/Source/Python/UPT/InstallPkg.py
index a8d0e1ec440a..e268f7892290 100644
--- a/BaseTools/Source/Python/UPT/InstallPkg.py
+++ b/BaseTools/Source/Python/UPT/InstallPkg.py
@@ -537,7 +537,7 @@ def Main(Options = None):
Options, Dep, WorkspaceDir, DataBase)
ReturnCode = 0
- except FatalError, XExcept:
+ except FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
diff --git a/BaseTools/Source/Python/UPT/InventoryWs.py b/BaseTools/Source/Python/UPT/InventoryWs.py
index 824e1c288947..cd92753a8d4b 100644
--- a/BaseTools/Source/Python/UPT/InventoryWs.py
+++ b/BaseTools/Source/Python/UPT/InventoryWs.py
@@ -92,7 +92,7 @@ def Main(Options = None):
DataBase = GlobalData.gDB
InventoryDistInstalled(DataBase)
ReturnCode = 0
- except FatalError, XExcept:
+ except FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
diff --git a/BaseTools/Source/Python/UPT/Library/CommentParsing.py b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
index e6d45103f94b..9cd7b60e16ab 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentParsing.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
@@ -217,7 +217,7 @@ def ParsePcdErrorCode (Value = None, ContainerFile = None, LineNum = None):
# To delete the tailing 'L'
#
return hex(ErrorCode)[:-1]
- except ValueError, XStr:
+ except ValueError as XStr:
if XStr:
pass
Logger.Error('Parser',
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index 090c7eb95716..ca21e6995217 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -297,7 +297,7 @@ class _LogicalExpressionParser(_ExprBase):
try:
if self.LogicalExpression() not in [self.ARITH, self.LOGICAL, self.REALLOGICAL, self.STRINGITEM]:
return False, ST.ERR_EXPR_LOGICAL % self.Token
- except _ExprError, XExcept:
+ except _ExprError as XExcept:
return False, XExcept.Error
self.SkipWhitespace()
if self.Index != self.Len:
@@ -327,7 +327,7 @@ class _ValidRangeExpressionParser(_ExprBase):
try:
if self.RangeExpression() not in [self.HEX, self.INT]:
return False, ST.ERR_EXPR_RANGE % self.Token
- except _ExprError, XExcept:
+ except _ExprError as XExcept:
return False, XExcept.Error
self.SkipWhitespace()
@@ -423,7 +423,7 @@ class _ValidListExpressionParser(_ExprBase):
try:
if self.ListExpression() not in [self.NUM]:
return False, ST.ERR_EXPR_LIST % self.Token
- except _ExprError, XExcept:
+ except _ExprError as XExcept:
return False, XExcept.Error
self.SkipWhitespace()
@@ -457,7 +457,7 @@ class _StringTestParser(_ExprBase):
return False, ST.ERR_EXPR_EMPTY
try:
self.StringTest()
- except _ExprError, XExcept:
+ except _ExprError as XExcept:
return False, XExcept.Error
return True, ''
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 1fbbf2e49887..b00bba1f8440 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -329,9 +329,9 @@ class UniFileClassObject(object):
if len(Lang) != 3:
try:
FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
- except UnicodeError, Xstr:
+ except UnicodeError as Xstr:
FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
- except UnicodeError, Xstr:
+ except UnicodeError as Xstr:
FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
except:
EdkLogger.Error("Unicode File Parser",
@@ -438,7 +438,7 @@ class UniFileClassObject(object):
try:
FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
- except UnicodeError, Xstr:
+ except UnicodeError as Xstr:
FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
except UnicodeError:
FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
@@ -1060,7 +1060,7 @@ class UniFileClassObject(object):
ExtraData=FilaPath)
try:
FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_8').readlines()
- except UnicodeError, Xstr:
+ except UnicodeError as Xstr:
FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16').readlines()
except UnicodeError:
FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16_le').readlines()
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index d7614b884990..fd02efb6bf04 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -224,6 +224,6 @@ def XmlParseFile(FileName):
Dom = xml.dom.minidom.parse(XmlFile)
XmlFile.close()
return Dom
- except BaseException, XExcept:
+ except BaseException as XExcept:
XmlFile.close()
Logger.Error('\nUPT', PARSER_ERROR, XExcept, File=FileName, RaiseError=True)
diff --git a/BaseTools/Source/Python/UPT/MkPkg.py b/BaseTools/Source/Python/UPT/MkPkg.py
index 87c84f0cc25b..99d6bcc19220 100644
--- a/BaseTools/Source/Python/UPT/MkPkg.py
+++ b/BaseTools/Source/Python/UPT/MkPkg.py
@@ -213,7 +213,7 @@ def Main(Options = None):
Logger.Quiet(ST.MSG_FINISH)
ReturnCode = 0
- except FatalError, XExcept:
+ except FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % \
diff --git a/BaseTools/Source/Python/UPT/ReplacePkg.py b/BaseTools/Source/Python/UPT/ReplacePkg.py
index efbf68a4ecc6..6f52b4f8f8e8 100644
--- a/BaseTools/Source/Python/UPT/ReplacePkg.py
+++ b/BaseTools/Source/Python/UPT/ReplacePkg.py
@@ -71,7 +71,7 @@ def Main(Options = None):
InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir, DataBase)
ReturnCode = 0
- except FatalError, XExcept:
+ except FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
diff --git a/BaseTools/Source/Python/UPT/RmPkg.py b/BaseTools/Source/Python/UPT/RmPkg.py
index ea842c11859f..6427a8f16c88 100644
--- a/BaseTools/Source/Python/UPT/RmPkg.py
+++ b/BaseTools/Source/Python/UPT/RmPkg.py
@@ -157,7 +157,7 @@ def Main(Options = None):
ReturnCode = 0
- except FatalError, XExcept:
+ except FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
diff --git a/BaseTools/Source/Python/UPT/TestInstall.py b/BaseTools/Source/Python/UPT/TestInstall.py
index 899cae56aa87..d8918737f907 100644
--- a/BaseTools/Source/Python/UPT/TestInstall.py
+++ b/BaseTools/Source/Python/UPT/TestInstall.py
@@ -68,12 +68,12 @@ def Main(Options=None):
else:
Logger.Quiet(ST.MSG_TEST_INSTALL_FAIL)
- except TE.FatalError, XExcept:
+ except TE.FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
- except Exception, x:
+ except Exception as x:
ReturnCode = TE.CODE_ERROR
Logger.Error(
"\nTestInstallPkg",
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 325b96bf560d..0bfcc44e3f19 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -179,7 +179,7 @@ def Main():
try:
GlobalData.gWORKSPACE, GlobalData.gPACKAGE_PATH = GetWorkspace()
- except FatalError, XExcept:
+ except FatalError as XExcept:
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
return XExcept.args[0]
@@ -294,7 +294,7 @@ def Main():
return OPTION_MISSING
ReturnCode = RunModule(Opt)
- except FatalError, XExcept:
+ except FatalError as XExcept:
ReturnCode = XExcept.args[0]
if Logger.GetLevel() <= Logger.DEBUG_9:
Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index ad5b267fd158..480ec3e6cfce 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -827,11 +827,11 @@ class DscBuildData(PlatformBuildClassObject):
DatumType = self._DecPcds[PcdCName, TokenSpaceGuid].DatumType
try:
ValueList[Index] = ValueExpressionEx(ValueList[Index], DatumType, self._GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
EdkLogger.error('Parser', FORMAT_INVALID, Value, File=self.MetaFile, Line=LineNo,
ExtraData="PCD [%s.%s] Value \"%s\" " % (
TokenSpaceGuid, PcdCName, ValueList[Index]))
- except EvaluationException, Excpt:
+ except EvaluationException as Excpt:
if hasattr(Excpt, 'Pcd'):
if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
EdkLogger.error('Parser', FORMAT_INVALID, "Cannot use this PCD (%s) in an expression as"
@@ -998,7 +998,7 @@ class DscBuildData(PlatformBuildClassObject):
if pcdvalue.startswith('H'):
try:
pcdvalue = ValueExpressionEx(pcdvalue[1:], PcdDatumType, self._GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
if Value.result > 1:
EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %
(TokenSpaceGuidCName, TokenCName, pcdvalue, Value))
@@ -1015,7 +1015,7 @@ class DscBuildData(PlatformBuildClassObject):
if pcdvalue.startswith('H'):
try:
pcdvalue = ValueExpressionEx(pcdvalue[1:], PcdDatumType, self._GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %
(TokenSpaceGuidCName, TokenCName, pcdvalue, Value))
pcdvalue = 'H' + pcdvalue
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 7ea9b56d5dec..67c08ee47841 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -1148,7 +1148,7 @@ class InfBuildData(ModuleBuildClassObject):
else:
try:
Pcd.DefaultValue = ValueExpressionEx(Pcd.DefaultValue, Pcd.DatumType, self.Guids)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(TokenSpaceGuid, PcdRealName, Pcd.DefaultValue, Value),
File=self.MetaFile, Line=LineNo)
break
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 57642de4ee73..17b7e7e1bd62 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -1327,7 +1327,7 @@ class DscParser(MetaFileParser):
self._InSubsection = False
try:
Processer[self._ItemType]()
- except EvaluationException, Excpt:
+ except EvaluationException as Excpt:
#
# Only catch expression evaluation error here. We need to report
# the precise number of line on which the error occurred
@@ -1349,7 +1349,7 @@ class DscParser(MetaFileParser):
EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
File=self._FileWithError, ExtraData=' '.join(self._ValueList),
Line=self._LineIndex + 1)
- except MacroException, Excpt:
+ except MacroException as Excpt:
EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
File=self._FileWithError, ExtraData=' '.join(self._ValueList),
Line=self._LineIndex + 1)
@@ -1447,10 +1447,10 @@ class DscParser(MetaFileParser):
Macros.update(GlobalData.gGlobalDefines)
try:
Result = ValueExpression(self._ValueList[1], Macros)()
- except SymbolNotFound, Exc:
+ except SymbolNotFound as Exc:
EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
Result = False
- except WrnExpression, Excpt:
+ except WrnExpression as Excpt:
#
# Catch expression evaluation warning here. We need to report
# the precise number of line and return the evaluation result
@@ -1591,7 +1591,7 @@ class DscParser(MetaFileParser):
if PcdValue and "." not in self._ValueList[0]:
try:
ValList[Index] = ValueExpression(PcdValue, self._Macros)(True)
- except WrnExpression, Value:
+ except WrnExpression as Value:
ValList[Index] = Value.result
except:
pass
@@ -1995,7 +1995,7 @@ class DecParser(MetaFileParser):
if PcdValue:
try:
ValueList[0] = ValueExpressionEx(ValueList[0], ValueList[1], self._GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
EdkLogger.error('Parser', FORMAT_INVALID, Value, ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
# check format of default value against the datum type
IsValid, Cause = CheckPcdDatum(ValueList[1], ValueList[0])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index d8549c9d66e6..92fcf6dd2b22 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -63,7 +63,7 @@ class MetaFileTable(Table):
# update the timestamp in database
self._FileIndexTable.SetFileTimeStamp(self.IdBase, TimeStamp)
return False
- except Exception, Exc:
+ except Exception as Exc:
EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc))
return False
return True
@@ -250,7 +250,7 @@ class PackageTable(MetaFileTable):
if comment.startswith("@Expression"):
comment = comment.replace("@Expression", "", 1)
expressions.append(comment.split("|")[1].strip())
- except Exception, Exc:
+ except Exception as Exc:
ValidType = ""
if oricomment.startswith("@ValidRange"):
ValidType = "@ValidRange"
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 53d0039c5149..e71c0abc25b9 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -650,7 +650,7 @@ class ModuleReport(object):
cmd = ["GenFw", "--rebase", str(0), "-o", Tempfile, DefaultEFIfile]
try:
PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
- except Exception, X:
+ except Exception as X:
EdkLogger.error("GenFw", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
EndOfProcedure = threading.Event()
EndOfProcedure.clear()
@@ -958,7 +958,7 @@ class PcdReport(object):
if DscDefaultValue != DscDefaultValBak:
try:
DscDefaultValue = ValueExpressionEx(DscDefaultValue, Pcd.DatumType, self._GuidDict)(True)
- except BadExpression, Value:
+ except BadExpression as Value:
EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" %(DscDefaultValue, Pcd.DatumType))
InfDefaultValue = None
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 24f9962c9d94..b8dc20b1fd22 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -546,7 +546,7 @@ class BuildTask:
EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join([Th.getName() for Th in threading.enumerate()]))
# avoid tense loop
time.sleep(0.1)
- except BaseException, X:
+ except BaseException as X:
#
# TRICK: hide the output of threads left runing, so that the user can
# catch the error message easily
@@ -1322,7 +1322,7 @@ class Build():
try:
#os.rmdir(AutoGenObject.BuildDir)
RemoveDirectory(AutoGenObject.BuildDir, True)
- except WindowsError, X:
+ except WindowsError as X:
EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
return True
@@ -1412,7 +1412,7 @@ class Build():
try:
#os.rmdir(AutoGenObject.BuildDir)
RemoveDirectory(AutoGenObject.BuildDir, True)
- except WindowsError, X:
+ except WindowsError as X:
EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
return True
@@ -2494,14 +2494,14 @@ def Main():
# All job done, no error found and no exception raised
#
BuildError = False
- except FatalError, X:
+ except FatalError as X:
if MyBuild != None:
# for multi-thread build exits safely
MyBuild.Relinquish()
if Option != None and Option.debug != None:
EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
ReturnCode = X.args[0]
- except Warning, X:
+ except Warning as X:
# error from Fdf parser
if MyBuild != None:
# for multi-thread build exits safely
diff --git a/BaseTools/Tests/CheckPythonSyntax.py b/BaseTools/Tests/CheckPythonSyntax.py
index 61a048ad5d05..a55b29de4713 100644
--- a/BaseTools/Tests/CheckPythonSyntax.py
+++ b/BaseTools/Tests/CheckPythonSyntax.py
@@ -29,7 +29,7 @@ class Tests(TestTools.BaseToolsTest):
def SingleFileTest(self, filename):
try:
py_compile.compile(filename, doraise=True)
- except Exception, e:
+ except Exception as e:
self.fail('syntax error: %s, Error is %s' % (filename, str(e)))
def MakePythonSyntaxCheckTests():
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 420b3dea80f7..858b4020ef9f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -337,7 +337,7 @@ class SourceFiles:
print '[KeyboardInterrupt]'
return False
- except Exception, e:
+ except Exception as e:
print e
if not completed: return False
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 02/20] BaseTools: Refactor python print statements
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
2018-02-01 8:35 ` [PATCH v2 01/20] BaseTools: Refactor python except statements Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 03/20] BaseTools: Remove the old python "not-equal" Gary Lin
` (18 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Refactor print statements to be compatible with python 3.
Based on "futurize -f libfuturize.fixes.fix_print_with_import"
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py | 3 +-
BaseTools/Scripts/BinToPcd.py | 37 +++---
BaseTools/Scripts/MemoryProfileSymbolGen.py | 14 +--
BaseTools/Scripts/SmiHandlerProfileSymbolGen.py | 20 +--
BaseTools/Source/Python/AutoGen/AutoGen.py | 5 +-
BaseTools/Source/Python/AutoGen/BuildEngine.py | 31 ++---
BaseTools/Source/Python/AutoGen/UniClassObject.py | 7 +-
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 5 +-
BaseTools/Source/Python/BPDG/BPDG.py | 3 +-
BaseTools/Source/Python/Common/DecClassObject.py | 39 +++---
BaseTools/Source/Python/Common/Dictionary.py | 7 +-
BaseTools/Source/Python/Common/DscClassObject.py | 67 +++++-----
BaseTools/Source/Python/Common/EdkIIWorkspace.py | 23 ++--
BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py | 133 ++++++++++----------
BaseTools/Source/Python/Common/Expression.py | 11 +-
BaseTools/Source/Python/Common/FdfParserLite.py | 29 ++---
BaseTools/Source/Python/Common/InfClassObject.py | 113 ++++++++---------
BaseTools/Source/Python/Common/RangeExpression.py | 5 +-
BaseTools/Source/Python/Common/TargetTxtClassObject.py | 13 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 3 +-
BaseTools/Source/Python/Ecc/CParser.py | 3 +-
BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 69 +++++-----
BaseTools/Source/Python/Ecc/Configuration.py | 5 +-
BaseTools/Source/Python/Ecc/Exception.py | 3 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 3 +-
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 5 +-
BaseTools/Source/Python/Ecc/c.py | 13 +-
BaseTools/Source/Python/Eot/CParser.py | 3 +-
BaseTools/Source/Python/Eot/CodeFragmentCollector.py | 61 ++++-----
BaseTools/Source/Python/Eot/FvImage.py | 13 +-
BaseTools/Source/Python/Eot/InfParserLite.py | 7 +-
BaseTools/Source/Python/Eot/c.py | 3 +-
BaseTools/Source/Python/GenFds/FdfParser.py | 7 +-
BaseTools/Source/Python/GenFds/GenFds.py | 3 +-
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 3 +-
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 7 +-
BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 23 ++--
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 15 +--
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 17 +--
BaseTools/Source/Python/TargetTool/TargetTool.py | 23 ++--
BaseTools/Source/Python/UPT/Library/ExpressionValidate.py | 3 +-
BaseTools/Source/Python/UPT/Library/UniClassObject.py | 9 +-
BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 51 ++++----
BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py | 5 +-
BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py | 9 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 11 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 3 +-
BaseTools/Source/Python/build/build.py | 3 +-
BaseTools/Tests/TestTools.py | 5 +-
BaseTools/Tests/TianoCompress.py | 5 +-
BaseTools/gcc/mingw-gcc-build.py | 99 +++++++--------
51 files changed, 553 insertions(+), 504 deletions(-)
diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index 69fd2d54413e..dd66c7111ac0 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -23,6 +23,7 @@
#
# ExceptionList if a tool takes an argument with a / add it to the exception list
#
+from __future__ import print_function
import sys
import os
import subprocess
@@ -86,7 +87,7 @@ if __name__ == "__main__":
ret = main(sys.argv[2:])
except:
- print "exiting: exception from " + sys.argv[0]
+ print("exiting: exception from " + sys.argv[0])
ret = 2
sys.exit(ret)
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 68a7ac652d70..c4e7b8a5c2e2 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -14,6 +14,7 @@
'''
BinToPcd
'''
+from __future__ import print_function
import sys
import argparse
@@ -98,7 +99,7 @@ if __name__ == '__main__':
Buffer = args.InputFile.read()
args.InputFile.close()
except:
- print 'BinToPcd: error: can not read binary input file'
+ print('BinToPcd: error: can not read binary input file')
sys.exit()
#
@@ -109,7 +110,7 @@ if __name__ == '__main__':
# If PcdName is None, then only a PCD value is being requested.
Pcd = ByteArray (Buffer)
if args.Verbose:
- print 'PcdToBin: Convert binary file to PCD Value'
+ print('PcdToBin: Convert binary file to PCD Value')
elif args.PcdType is None:
#
# If --type is neither VPD nor HII, then use PCD statement syntax that is
@@ -123,18 +124,18 @@ if __name__ == '__main__':
#
Pcd = ' %s|%s' % (args.PcdName, ByteArray (Buffer))
elif args.MaxSize < len(Buffer):
- print 'BinToPcd: error: argument --max-size is smaller than input file.'
+ print('BinToPcd: error: argument --max-size is smaller than input file.')
sys.exit()
else:
Pcd = ' %s|%s|VOID*|%d' % (args.PcdName, ByteArray (Buffer), args.MaxSize)
args.MaxSize = len(Buffer)
if args.Verbose:
- print 'PcdToBin: Convert binary file to PCD statement compatible with PCD sections:'
- print ' [PcdsFixedAtBuild]'
- print ' [PcdsPatchableInModule]'
- print ' [PcdsDynamicDefault]'
- print ' [PcdsDynamicExDefault]'
+ print('PcdToBin: Convert binary file to PCD statement compatible with PCD sections:')
+ print(' [PcdsFixedAtBuild]')
+ print(' [PcdsPatchableInModule]')
+ print(' [PcdsDynamicDefault]')
+ print(' [PcdsDynamicExDefault]')
elif args.PcdType == 'VPD':
if args.MaxSize is None:
#
@@ -143,7 +144,7 @@ if __name__ == '__main__':
#
args.MaxSize = len(Buffer)
if args.MaxSize < len(Buffer):
- print 'BinToPcd: error: argument --max-size is smaller than input file.'
+ print('BinToPcd: error: argument --max-size is smaller than input file.')
sys.exit()
if args.Offset is None:
#
@@ -157,15 +158,15 @@ if __name__ == '__main__':
#
Pcd = ' %s|%d|%d|%s' % (args.PcdName, args.Offset, args.MaxSize, ByteArray (Buffer))
if args.Verbose:
- print 'PcdToBin: Convert binary file to PCD statement compatible with PCD sections'
- print ' [PcdsDynamicVpd]'
- print ' [PcdsDynamicExVpd]'
+ print('PcdToBin: Convert binary file to PCD statement compatible with PCD sections')
+ print(' [PcdsDynamicVpd]')
+ print(' [PcdsDynamicExVpd]')
elif args.PcdType == 'HII':
if args.VariableGuid is None:
- print 'BinToPcd: error: argument --variable-guid is required for --type HII.'
+ print('BinToPcd: error: argument --variable-guid is required for --type HII.')
sys.exit()
if args.VariableName is None:
- print 'BinToPcd: error: argument --variable-name is required for --type HII.'
+ print('BinToPcd: error: argument --variable-name is required for --type HII.')
sys.exit()
if args.Offset is None:
#
@@ -174,9 +175,9 @@ if __name__ == '__main__':
args.Offset = 0
Pcd = ' %s|L"%s"|%s|%d|%s' % (args.PcdName, args.VariableName, args.VariableGuid, args.Offset, ByteArray (Buffer))
if args.Verbose:
- print 'PcdToBin: Convert binary file to PCD statement compatible with PCD sections'
- print ' [PcdsDynamicHii]'
- print ' [PcdsDynamicExHii]'
+ print('PcdToBin: Convert binary file to PCD statement compatible with PCD sections')
+ print(' [PcdsDynamicHii]')
+ print(' [PcdsDynamicExHii]')
#
# Write PCD value or PCD statement to the output file
@@ -189,4 +190,4 @@ if __name__ == '__main__':
# If output file is not specified or it can not be written, then write the
# PCD value or PCD statement to the console
#
- print Pcd
+ print(Pcd)
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 5709ad4641cb..3bc6a8897bcc 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -13,7 +13,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
##
-
+from __future__ import print_function
import os
import re
import sys
@@ -58,10 +58,10 @@ class Symbols:
try:
nmCommand = "nm"
nmLineOption = "-l"
- print "parsing (debug) - " + pdbName
+ print("parsing (debug) - " + pdbName)
os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
except :
- print 'ERROR: nm command not available. Please verify PATH'
+ print('ERROR: nm command not available. Please verify PATH')
return
#
@@ -111,11 +111,11 @@ class Symbols:
DIA2DumpCommand = "Dia2Dump.exe"
#DIA2SymbolOption = "-p"
DIA2LinesOption = "-l"
- print "parsing (pdb) - " + pdbName
+ print("parsing (pdb) - " + pdbName)
#os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
except :
- print 'ERROR: DIA2Dump command not available. Please verify PATH'
+ print('ERROR: DIA2Dump command not available. Please verify PATH')
return
#
@@ -254,12 +254,12 @@ def main():
try :
file = open(Options.inputfilename)
except Exception:
- print "fail to open " + Options.inputfilename
+ print("fail to open " + Options.inputfilename)
return 1
try :
newfile = open(Options.outputfilename, "w")
except Exception:
- print "fail to open " + Options.outputfilename
+ print("fail to open " + Options.outputfilename)
return 1
try:
diff --git a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
index f03278b64f8f..d0963a17e870 100644
--- a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
+++ b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
@@ -13,7 +13,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
##
-
+from __future__ import print_function
import os
import re
import sys
@@ -61,10 +61,10 @@ class Symbols:
try:
nmCommand = "nm"
nmLineOption = "-l"
- print "parsing (debug) - " + pdbName
+ print("parsing (debug) - " + pdbName)
os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
except :
- print 'ERROR: nm command not available. Please verify PATH'
+ print('ERROR: nm command not available. Please verify PATH')
return
#
@@ -103,11 +103,11 @@ class Symbols:
DIA2DumpCommand = "Dia2Dump.exe"
#DIA2SymbolOption = "-p"
DIA2LinesOption = "-l"
- print "parsing (pdb) - " + pdbName
+ print("parsing (pdb) - " + pdbName)
#os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
except :
- print 'ERROR: DIA2Dump command not available. Please verify PATH'
+ print('ERROR: DIA2Dump command not available. Please verify PATH')
return
#
@@ -235,14 +235,14 @@ def main():
try :
DOMTree = xml.dom.minidom.parse(Options.inputfilename)
except Exception:
- print "fail to open input " + Options.inputfilename
+ print("fail to open input " + Options.inputfilename)
return 1
if Options.guidreffilename is not None:
try :
guidreffile = open(Options.guidreffilename)
except Exception:
- print "fail to open guidref" + Options.guidreffilename
+ print("fail to open guidref" + Options.guidreffilename)
return 1
genGuidString(guidreffile)
guidreffile.close()
@@ -277,7 +277,7 @@ def main():
Handler = smiHandler.getElementsByTagName("Handler")
RVA = Handler[0].getElementsByTagName("RVA")
- print " Handler RVA: %s" % RVA[0].childNodes[0].data
+ print(" Handler RVA: %s" % RVA[0].childNodes[0].data)
if (len(RVA)) >= 1:
rvaName = RVA[0].childNodes[0].data
@@ -289,7 +289,7 @@ def main():
Caller = smiHandler.getElementsByTagName("Caller")
RVA = Caller[0].getElementsByTagName("RVA")
- print " Caller RVA: %s" % RVA[0].childNodes[0].data
+ print(" Caller RVA: %s" % RVA[0].childNodes[0].data)
if (len(RVA)) >= 1:
rvaName = RVA[0].childNodes[0].data
@@ -302,7 +302,7 @@ def main():
try :
newfile = open(Options.outputfilename, "w")
except Exception:
- print "fail to open output" + Options.outputfilename
+ print("fail to open output" + Options.outputfilename)
return 1
newfile.write(DOMTree.toprettyxml(indent = "\t", newl = "\n", encoding = "utf-8"))
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 5f8694b66f35..816dd9e86bd3 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -13,6 +13,7 @@
## Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import os.path as path
@@ -681,7 +682,7 @@ class WorkspaceAutoGen(AutoGen):
os.makedirs(self.BuildDir)
with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
for f in AllWorkSpaceMetaFiles:
- print >> file, f
+ print(f, file=file)
return True
def _GenPkgLevelHash(self, Pkg):
@@ -4553,7 +4554,7 @@ class ModuleAutoGen(AutoGen):
os.remove (self.GetTimeStampPath())
with open(self.GetTimeStampPath(), 'w+') as file:
for f in FileSet:
- print >> file, f
+ print(f, file=file)
Module = property(_GetModule)
Name = property(_GetBaseName)
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 63ed47d94bcb..46685967d1ee 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import copy
@@ -625,19 +626,19 @@ if __name__ == '__main__':
EdkLogger.Initialize()
if len(sys.argv) > 1:
Br = BuildRule(sys.argv[1])
- print str(Br[".c", "DXE_DRIVER", "IA32", "MSFT"][1])
- print
- print str(Br[".c", "DXE_DRIVER", "IA32", "INTEL"][1])
- print
- print str(Br[".c", "DXE_DRIVER", "IA32", "GCC"][1])
- print
- print str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1])
- print
- print str(Br[".h", "ACPI_TABLE", "IA32", "INTEL"][1])
- print
- print str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1])
- print
- print str(Br[".s", "SEC", "IPF", "COMMON"][1])
- print
- print str(Br[".s", "SEC"][1])
+ print(str(Br[".c", "DXE_DRIVER", "IA32", "MSFT"][1]))
+ print()
+ print(str(Br[".c", "DXE_DRIVER", "IA32", "INTEL"][1]))
+ print()
+ print(str(Br[".c", "DXE_DRIVER", "IA32", "GCC"][1]))
+ print()
+ print(str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1]))
+ print()
+ print(str(Br[".h", "ACPI_TABLE", "IA32", "INTEL"][1]))
+ print()
+ print(str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1]))
+ print()
+ print(str(Br[".s", "SEC", "IPF", "COMMON"][1]))
+ print()
+ print(str(Br[".s", "SEC"][1]))
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 2711fc104f52..264cf1546566 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -16,6 +16,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os, codecs, re
import distutils.util
import Common.EdkLogger as EdkLogger
@@ -684,12 +685,12 @@ class UniFileClassObject(object):
# Show the instance itself
#
def ShowMe(self):
- print self.LanguageDef
+ print(self.LanguageDef)
#print self.OrderedStringList
for Item in self.OrderedStringList:
- print Item
+ print(Item)
for Member in self.OrderedStringList[Item]:
- print str(Member)
+ print(str(Member))
# This acts like the main() function for the script, unless it is 'import'ed into another
# script.
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 92ede7a82324..53da9b881f25 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -14,6 +14,7 @@
# #
# Import Modules
#
+from __future__ import print_function
import os
from Common.RangeExpression import RangeExpression
from Common.Misc import *
@@ -345,6 +346,6 @@ if __name__ == "__main__":
test2 = TestObj(2)
testarr = [test1, test2]
- print TestObj(2) in testarr
- print TestObj(2) == test2
+ print(TestObj(2) in testarr)
+ print(TestObj(2) == test2)
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index b1e328ff3f11..9ab13a39e8bf 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -20,6 +20,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import sys
import encodings.ascii
@@ -132,7 +133,7 @@ def MyOptionParser():
#
def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
if os.path.exists(VpdFileName) and not Force:
- print "\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName
+ print("\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName)
choice = sys.stdin.readline()
if choice.strip().lower() not in ['y', 'yes', '']:
return
diff --git a/BaseTools/Source/Python/Common/DecClassObject.py b/BaseTools/Source/Python/Common/DecClassObject.py
index d7c70a7336a0..970e644318d0 100644
--- a/BaseTools/Source/Python/Common/DecClassObject.py
+++ b/BaseTools/Source/Python/Common/DecClassObject.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
from String import *
from DataType import *
@@ -517,31 +518,31 @@ class Dec(DecObject):
def ShowPackage(self):
M = self.Package
for Arch in M.Header.keys():
- print '\nArch =', Arch
- print 'Filename =', M.Header[Arch].FileName
- print 'FullPath =', M.Header[Arch].FullPath
- print 'BaseName =', M.Header[Arch].Name
- print 'Guid =', M.Header[Arch].Guid
- print 'Version =', M.Header[Arch].Version
- print 'DecSpecification =', M.Header[Arch].DecSpecification
- print '\nIncludes =', M.Includes
+ print('\nArch =', Arch)
+ print('Filename =', M.Header[Arch].FileName)
+ print('FullPath =', M.Header[Arch].FullPath)
+ print('BaseName =', M.Header[Arch].Name)
+ print('Guid =', M.Header[Arch].Guid)
+ print('Version =', M.Header[Arch].Version)
+ print('DecSpecification =', M.Header[Arch].DecSpecification)
+ print('\nIncludes =', M.Includes)
for Item in M.Includes:
- print Item.FilePath, Item.SupArchList
- print '\nGuids =', M.GuidDeclarations
+ print(Item.FilePath, Item.SupArchList)
+ print('\nGuids =', M.GuidDeclarations)
for Item in M.GuidDeclarations:
- print Item.CName, Item.Guid, Item.SupArchList
- print '\nProtocols =', M.ProtocolDeclarations
+ print(Item.CName, Item.Guid, Item.SupArchList)
+ print('\nProtocols =', M.ProtocolDeclarations)
for Item in M.ProtocolDeclarations:
- print Item.CName, Item.Guid, Item.SupArchList
- print '\nPpis =', M.PpiDeclarations
+ print(Item.CName, Item.Guid, Item.SupArchList)
+ print('\nPpis =', M.PpiDeclarations)
for Item in M.PpiDeclarations:
- print Item.CName, Item.Guid, Item.SupArchList
- print '\nLibraryClasses =', M.LibraryClassDeclarations
+ print(Item.CName, Item.Guid, Item.SupArchList)
+ print('\nLibraryClasses =', M.LibraryClassDeclarations)
for Item in M.LibraryClassDeclarations:
- print Item.LibraryClass, Item.RecommendedInstance, Item.SupModuleList, Item.SupArchList
- print '\nPcds =', M.PcdDeclarations
+ print(Item.LibraryClass, Item.RecommendedInstance, Item.SupModuleList, Item.SupArchList)
+ print('\nPcds =', M.PcdDeclarations)
for Item in M.PcdDeclarations:
- print 'CName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, 'Token=', Item.Token, 'DatumType=', Item.DatumType, Item.SupArchList
+ print('CName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, 'Token=', Item.Token, 'DatumType=', Item.DatumType, Item.SupArchList)
##
#
diff --git a/BaseTools/Source/Python/Common/Dictionary.py b/BaseTools/Source/Python/Common/Dictionary.py
index 1c33fefabf98..5f2cc8f31ffa 100644
--- a/BaseTools/Source/Python/Common/Dictionary.py
+++ b/BaseTools/Source/Python/Common/Dictionary.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import EdkLogger
from DataType import *
from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -58,7 +59,7 @@ def printDict(Dict):
KeyList = Dict.keys()
for Key in KeyList:
if Dict[Key] != '':
- print Key + ' = ' + str(Dict[Key])
+ print(Key + ' = ' + str(Dict[Key]))
## Print the dictionary
#
@@ -71,6 +72,6 @@ def printList(Key, List):
if type(List) == type([]):
if len(List) > 0:
if Key.find(TAB_SPLIT) != -1:
- print "\n" + Key
+ print("\n" + Key)
for Item in List:
- print Item
+ print(Item)
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index c2fa1c275a2d..3a27fbffc934 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import EdkLogger as EdkLogger
import Database
@@ -1365,7 +1366,7 @@ class Dsc(DscObject):
# Print all members and their values of Dsc class
#
def ShowDsc(self):
- print TAB_SECTION_START + TAB_INF_DEFINES + TAB_SECTION_END
+ print(TAB_SECTION_START + TAB_INF_DEFINES + TAB_SECTION_END)
printDict(self.Defines.DefinesDictionary)
for Key in self.KeyList:
@@ -1382,47 +1383,47 @@ class Dsc(DscObject):
def ShowPlatform(self):
M = self.Platform
for Arch in M.Header.keys():
- print '\nArch =', Arch
- print 'Filename =', M.Header[Arch].FileName
- print 'FullPath =', M.Header[Arch].FullPath
- print 'BaseName =', M.Header[Arch].Name
- print 'Guid =', M.Header[Arch].Guid
- print 'Version =', M.Header[Arch].Version
- print 'DscSpecification =', M.Header[Arch].DscSpecification
- print 'SkuId =', M.Header[Arch].SkuIdName
- print 'SupArchList =', M.Header[Arch].SupArchList
- print 'BuildTargets =', M.Header[Arch].BuildTargets
- print 'OutputDirectory =', M.Header[Arch].OutputDirectory
- print 'BuildNumber =', M.Header[Arch].BuildNumber
- print 'MakefileName =', M.Header[Arch].MakefileName
- print 'BsBaseAddress =', M.Header[Arch].BsBaseAddress
- print 'RtBaseAddress =', M.Header[Arch].RtBaseAddress
- print 'Define =', M.Header[Arch].Define
- print 'Fdf =', M.FlashDefinitionFile.FilePath
- print '\nBuildOptions =', M.BuildOptions, M.BuildOptions.IncludeFiles
+ print('\nArch =', Arch)
+ print('Filename =', M.Header[Arch].FileName)
+ print('FullPath =', M.Header[Arch].FullPath)
+ print('BaseName =', M.Header[Arch].Name)
+ print('Guid =', M.Header[Arch].Guid)
+ print('Version =', M.Header[Arch].Version)
+ print('DscSpecification =', M.Header[Arch].DscSpecification)
+ print('SkuId =', M.Header[Arch].SkuIdName)
+ print('SupArchList =', M.Header[Arch].SupArchList)
+ print('BuildTargets =', M.Header[Arch].BuildTargets)
+ print('OutputDirectory =', M.Header[Arch].OutputDirectory)
+ print('BuildNumber =', M.Header[Arch].BuildNumber)
+ print('MakefileName =', M.Header[Arch].MakefileName)
+ print('BsBaseAddress =', M.Header[Arch].BsBaseAddress)
+ print('RtBaseAddress =', M.Header[Arch].RtBaseAddress)
+ print('Define =', M.Header[Arch].Define)
+ print('Fdf =', M.FlashDefinitionFile.FilePath)
+ print('\nBuildOptions =', M.BuildOptions, M.BuildOptions.IncludeFiles)
for Item in M.BuildOptions.BuildOptionList:
- print '\t', 'ToolChainFamily =', Item.ToolChainFamily, 'ToolChain =', Item.ToolChain, 'Option =', Item.Option, 'Arch =', Item.SupArchList
- print '\nSkuIds =', M.SkuInfos.SkuInfoList, M.SkuInfos.IncludeFiles
- print '\nLibraries =', M.Libraries, M.Libraries.IncludeFiles
+ print('\t', 'ToolChainFamily =', Item.ToolChainFamily, 'ToolChain =', Item.ToolChain, 'Option =', Item.Option, 'Arch =', Item.SupArchList)
+ print('\nSkuIds =', M.SkuInfos.SkuInfoList, M.SkuInfos.IncludeFiles)
+ print('\nLibraries =', M.Libraries, M.Libraries.IncludeFiles)
for Item in M.Libraries.LibraryList:
- print '\t', Item.FilePath, Item.SupArchList, Item.Define
- print '\nLibraryClasses =', M.LibraryClasses, M.LibraryClasses.IncludeFiles
+ print('\t', Item.FilePath, Item.SupArchList, Item.Define)
+ print('\nLibraryClasses =', M.LibraryClasses, M.LibraryClasses.IncludeFiles)
for Item in M.LibraryClasses.LibraryList:
- print '\t', Item.Name, Item.FilePath, Item.SupModuleList, Item.SupArchList, Item.Define
- print '\nPcds =', M.DynamicPcdBuildDefinitions
+ print('\t', Item.Name, Item.FilePath, Item.SupModuleList, Item.SupArchList, Item.Define)
+ print('\nPcds =', M.DynamicPcdBuildDefinitions)
for Item in M.DynamicPcdBuildDefinitions:
- print '\tCname=', Item.CName, 'TSG=', Item.TokenSpaceGuidCName, 'Value=', Item.DefaultValue, 'Token=', Item.Token, 'Type=', Item.ItemType, 'Datum=', Item.DatumType, 'Size=', Item.MaxDatumSize, 'Arch=', Item.SupArchList, Item.SkuInfoList
+ print('\tCname=', Item.CName, 'TSG=', Item.TokenSpaceGuidCName, 'Value=', Item.DefaultValue, 'Token=', Item.Token, 'Type=', Item.ItemType, 'Datum=', Item.DatumType, 'Size=', Item.MaxDatumSize, 'Arch=', Item.SupArchList, Item.SkuInfoList)
for Sku in Item.SkuInfoList.values():
- print '\t\t', str(Sku)
- print '\nComponents =', M.Modules.ModuleList, M.Modules.IncludeFiles
+ print('\t\t', str(Sku))
+ print('\nComponents =', M.Modules.ModuleList, M.Modules.IncludeFiles)
for Item in M.Modules.ModuleList:
- print '\t', Item.FilePath, Item.ExecFilePath, Item.SupArchList
+ print('\t', Item.FilePath, Item.ExecFilePath, Item.SupArchList)
for Lib in Item.LibraryClasses.LibraryList:
- print '\t\tLib:', Lib.Name, Lib.FilePath
+ print('\t\tLib:', Lib.Name, Lib.FilePath)
for Bo in Item.ModuleSaBuildOption.BuildOptionList:
- print '\t\tBuildOption:', Bo.ToolChainFamily, Bo.ToolChain, Bo.Option
+ print('\t\tBuildOption:', Bo.ToolChainFamily, Bo.ToolChain, Bo.Option)
for Pcd in Item.PcdBuildDefinitions:
- print '\t\tPcd:', Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.MaxDatumSize, Pcd.DefaultValue, Pcd.ItemType
+ print('\t\tPcd:', Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.MaxDatumSize, Pcd.DefaultValue, Pcd.ItemType)
##
#
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspace.py b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
index f22a545b77ce..ed85e4ee0b06 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspace.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os, sys, time
from DataType import *
from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -39,7 +40,7 @@ class EdkIIWorkspace:
# Check environment valiable 'WORKSPACE'
#
if os.environ.get('WORKSPACE') == None:
- print 'ERROR: WORKSPACE not defined. Please run EdkSetup from the EDK II install directory.'
+ print('ERROR: WORKSPACE not defined. Please run EdkSetup from the EDK II install directory.')
return False
self.CurrentWorkingDir = os.getcwd()
@@ -76,18 +77,18 @@ class EdkIIWorkspace:
if self.PrintRunTime:
Seconds = int(time.time() - self.StartTime)
if Seconds < 60:
- print 'Run Time: %d seconds' % (Seconds)
+ print('Run Time: %d seconds' % (Seconds))
else:
Minutes = Seconds / 60
Seconds = Seconds % 60
if Minutes < 60:
- print 'Run Time: %d minutes %d seconds' % (Minutes, Seconds)
+ print('Run Time: %d minutes %d seconds' % (Minutes, Seconds))
else:
Hours = Minutes / 60
Minutes = Minutes % 60
- print 'Run Time: %d hours %d minutes %d seconds' % (Hours, Minutes, Seconds)
+ print('Run Time: %d hours %d minutes %d seconds' % (Hours, Minutes, Seconds))
if self.RunStatus != '':
- print self.RunStatus
+ print(self.RunStatus)
## Convert to a workspace relative filename
#
@@ -136,7 +137,7 @@ class EdkIIWorkspace:
#
def XmlParseFile (self, FileName):
if self.Verbose:
- print FileName
+ print(FileName)
return XmlParseFile (self.WorkspaceFile(FileName))
## Convert a XML section
@@ -150,7 +151,7 @@ class EdkIIWorkspace:
#
def XmlParseFileSection (self, FileName, SectionTag):
if self.Verbose:
- print FileName
+ print(FileName)
return XmlParseFileSection (self.WorkspaceFile(FileName), SectionTag)
## Save a XML file
@@ -164,7 +165,7 @@ class EdkIIWorkspace:
#
def XmlSaveFile (self, Dom, FileName):
if self.Verbose:
- print FileName
+ print(FileName)
return XmlSaveFile (Dom, self.WorkspaceFile(FileName))
## Convert Text File To Dictionary
@@ -182,7 +183,7 @@ class EdkIIWorkspace:
#
def ConvertTextFileToDictionary(self, FileName, Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter):
if self.Verbose:
- print FileName
+ print(FileName)
return ConvertTextFileToDictionary(self.WorkspaceFile(FileName), Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter)
## Convert Dictionary To Text File
@@ -200,7 +201,7 @@ class EdkIIWorkspace:
#
def ConvertDictionaryToTextFile(self, FileName, Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter):
if self.Verbose:
- print FileName
+ print(FileName)
return ConvertDictionaryToTextFile(self.WorkspaceFile(FileName), Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter)
## Convert Text File To Dictionary
@@ -317,4 +318,4 @@ def CreateFile(Directory, FileName, Mode='w'):
#
if __name__ == '__main__':
# Nothing to do here. Could do some unit tests
- pass
\ No newline at end of file
+ pass
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py b/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
index d6df01d4ce06..a2f7c94c1ca7 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os, string, copy, pdb, copy
import EdkLogger
import DataType
@@ -1568,89 +1569,89 @@ class WorkspaceBuild(object):
# Print each item of the workspacebuild with (Key = Value) pair
#
def ShowWorkspaceBuild(self):
- print self.DscDatabase
- print self.InfDatabase
- print self.DecDatabase
- print 'SupArchList', self.SupArchList
- print 'BuildTarget', self.BuildTarget
- print 'SkuId', self.SkuId
+ print(self.DscDatabase)
+ print(self.InfDatabase)
+ print(self.DecDatabase)
+ print('SupArchList', self.SupArchList)
+ print('BuildTarget', self.BuildTarget)
+ print('SkuId', self.SkuId)
for Arch in self.SupArchList:
- print Arch
- print 'Platform'
+ print(Arch)
+ print('Platform')
for Platform in self.Build[Arch].PlatformDatabase.keys():
P = self.Build[Arch].PlatformDatabase[Platform]
- print 'DescFilePath = ', P.DescFilePath
- print 'PlatformName = ', P.PlatformName
- print 'Guid = ', P.Guid
- print 'Version = ', P.Version
- print 'OutputDirectory = ', P.OutputDirectory
- print 'FlashDefinition = ', P.FlashDefinition
- print 'SkuIds = ', P.SkuIds
- print 'Modules = ', P.Modules
- print 'LibraryClasses = ', P.LibraryClasses
- print 'Pcds = ', P.Pcds
+ print('DescFilePath = ', P.DescFilePath)
+ print('PlatformName = ', P.PlatformName)
+ print('Guid = ', P.Guid)
+ print('Version = ', P.Version)
+ print('OutputDirectory = ', P.OutputDirectory)
+ print('FlashDefinition = ', P.FlashDefinition)
+ print('SkuIds = ', P.SkuIds)
+ print('Modules = ', P.Modules)
+ print('LibraryClasses = ', P.LibraryClasses)
+ print('Pcds = ', P.Pcds)
for item in P.Pcds.keys():
- print P.Pcds[item]
- print 'BuildOptions = ', P.BuildOptions
- print ''
+ print(P.Pcds[item])
+ print('BuildOptions = ', P.BuildOptions)
+ print('')
# End of Platform
- print 'package'
+ print('package')
for Package in self.Build[Arch].PackageDatabase.keys():
P = self.Build[Arch].PackageDatabase[Package]
- print 'DescFilePath = ', P.DescFilePath
- print 'PackageName = ', P.PackageName
- print 'Guid = ', P.Guid
- print 'Version = ', P.Version
- print 'Protocols = ', P.Protocols
- print 'Ppis = ', P.Ppis
- print 'Guids = ', P.Guids
- print 'Includes = ', P.Includes
- print 'LibraryClasses = ', P.LibraryClasses
- print 'Pcds = ', P.Pcds
+ print('DescFilePath = ', P.DescFilePath)
+ print('PackageName = ', P.PackageName)
+ print('Guid = ', P.Guid)
+ print('Version = ', P.Version)
+ print('Protocols = ', P.Protocols)
+ print('Ppis = ', P.Ppis)
+ print('Guids = ', P.Guids)
+ print('Includes = ', P.Includes)
+ print('LibraryClasses = ', P.LibraryClasses)
+ print('Pcds = ', P.Pcds)
for item in P.Pcds.keys():
- print P.Pcds[item]
- print ''
+ print(P.Pcds[item])
+ print('')
# End of Package
- print 'module'
+ print('module')
for Module in self.Build[Arch].ModuleDatabase.keys():
P = self.Build[Arch].ModuleDatabase[Module]
- print 'DescFilePath = ', P.DescFilePath
- print 'BaseName = ', P.BaseName
- print 'ModuleType = ', P.ModuleType
- print 'Guid = ', P.Guid
- print 'Version = ', P.Version
- print 'CustomMakefile = ', P.CustomMakefile
- print 'Specification = ', P.Specification
- print 'Shadow = ', P.Shadow
- print 'PcdIsDriver = ', P.PcdIsDriver
+ print('DescFilePath = ', P.DescFilePath)
+ print('BaseName = ', P.BaseName)
+ print('ModuleType = ', P.ModuleType)
+ print('Guid = ', P.Guid)
+ print('Version = ', P.Version)
+ print('CustomMakefile = ', P.CustomMakefile)
+ print('Specification = ', P.Specification)
+ print('Shadow = ', P.Shadow)
+ print('PcdIsDriver = ', P.PcdIsDriver)
for Lib in P.LibraryClass:
- print 'LibraryClassDefinition = ', Lib.LibraryClass, 'SupModList = ', Lib.SupModList
- print 'ModuleEntryPointList = ', P.ModuleEntryPointList
- print 'ModuleUnloadImageList = ', P.ModuleUnloadImageList
- print 'ConstructorList = ', P.ConstructorList
- print 'DestructorList = ', P.DestructorList
+ print('LibraryClassDefinition = ', Lib.LibraryClass, 'SupModList = ', Lib.SupModList)
+ print('ModuleEntryPointList = ', P.ModuleEntryPointList)
+ print('ModuleUnloadImageList = ', P.ModuleUnloadImageList)
+ print('ConstructorList = ', P.ConstructorList)
+ print('DestructorList = ', P.DestructorList)
- print 'Binaries = '
+ print('Binaries = ')
for item in P.Binaries:
- print item.BinaryFile, item.FeatureFlag, item.SupArchList
- print 'Sources = '
+ print(item.BinaryFile, item.FeatureFlag, item.SupArchList)
+ print('Sources = ')
for item in P.Sources:
- print item.SourceFile
- print 'LibraryClasses = ', P.LibraryClasses
- print 'Protocols = ', P.Protocols
- print 'Ppis = ', P.Ppis
- print 'Guids = ', P.Guids
- print 'Includes = ', P.Includes
- print 'Packages = ', P.Packages
- print 'Pcds = ', P.Pcds
+ print(item.SourceFile)
+ print('LibraryClasses = ', P.LibraryClasses)
+ print('Protocols = ', P.Protocols)
+ print('Ppis = ', P.Ppis)
+ print('Guids = ', P.Guids)
+ print('Includes = ', P.Includes)
+ print('Packages = ', P.Packages)
+ print('Pcds = ', P.Pcds)
for item in P.Pcds.keys():
- print P.Pcds[item]
- print 'BuildOptions = ', P.BuildOptions
- print 'Depex = ', P.Depex
- print ''
+ print(P.Pcds[item])
+ print('BuildOptions = ', P.BuildOptions)
+ print('Depex = ', P.Depex)
+ print('')
# End of Module
##
@@ -1659,12 +1660,12 @@ class WorkspaceBuild(object):
# script.
#
if __name__ == '__main__':
- print 'Start!', time.strftime('%H:%M:%S', time.localtime())
+ print('Start!', time.strftime('%H:%M:%S', time.localtime()))
EdkLogger.Initialize()
EdkLogger.SetLevel(EdkLogger.QUIET)
W = os.getenv('WORKSPACE')
Ewb = WorkspaceBuild('Nt32Pkg/Nt32Pkg.dsc', W)
Ewb.GenBuildDatabase({('PcdDevicePathSupportDevicePathFromText', 'gEfiMdeModulePkgTokenSpaceGuid') : 'KKKKKKKKKKKKKKKKKKKKK'}, ['Test.Inf'])
- print 'Done!', time.strftime('%H:%M:%S', time.localtime())
+ print('Done!', time.strftime('%H:%M:%S', time.localtime()))
Ewb.ShowWorkspaceBuild()
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index e40677558a68..145acfc072e7 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -12,6 +12,7 @@
## Import Modules
#
+from __future__ import print_function
from Common.GlobalData import *
from CommonDataClass.Exceptions import BadExpression
from CommonDataClass.Exceptions import WrnExpression
@@ -902,10 +903,10 @@ if __name__ == '__main__':
if input in 'qQ':
break
try:
- print ValueExpression(input)(True)
- print ValueExpression(input)(False)
+ print(ValueExpression(input)(True))
+ print(ValueExpression(input)(False))
except WrnExpression as Ex:
- print Ex.result
- print str(Ex)
+ print(Ex.result)
+ print(str(Ex))
except Exception as Ex:
- print str(Ex)
+ print(str(Ex))
diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/Source/Python/Common/FdfParserLite.py
index ac03c3fef5bb..f2741616c46f 100644
--- a/BaseTools/Source/Python/Common/FdfParserLite.py
+++ b/BaseTools/Source/Python/Common/FdfParserLite.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import re
import Common.LongFilePathOs as os
@@ -1269,8 +1270,8 @@ class FdfParser(object):
self.__UndoToken()
if not self.__IsToken("[FD.", True):
FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
- print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
- % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+ print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+ % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
raise Warning("expected [FD.] At Line ", self.FileName, self.CurrentLineNumber)
FdName = self.__GetUiName()
@@ -1837,8 +1838,8 @@ class FdfParser(object):
self.__UndoToken()
if not self.__IsToken("[FV.", True):
FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
- print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
- % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+ print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+ % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
raise Warning("Unknown Keyword At Line ", self.FileName, self.CurrentLineNumber)
FvName = self.__GetUiName()
@@ -2643,8 +2644,8 @@ class FdfParser(object):
self.__UndoToken()
if not self.__IsToken("[CAPSULE.", True):
FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
- print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
- % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+ print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+ % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
raise Warning("expected [Capsule.] At Line ", self.FileName, self.CurrentLineNumber)
CapsuleObj = CommonDataClass.FdfClass.CapsuleClassObject()
@@ -2766,8 +2767,8 @@ class FdfParser(object):
self.__UndoToken()
if not self.__IsToken("[Rule.", True):
FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
- print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
- % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+ print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+ % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
raise Warning("expected [Rule.] At Line ", self.FileName, self.CurrentLineNumber)
if not self.__SkipToToken("."):
@@ -3357,8 +3358,8 @@ class FdfParser(object):
self.__UndoToken()
if not self.__IsToken("[VTF.", True):
FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
- print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
- % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+ print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+ % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
raise Warning("expected [VTF.] At Line ", self.FileName, self.CurrentLineNumber)
if not self.__SkipToToken("."):
@@ -3650,7 +3651,7 @@ class FdfParser(object):
raise Warning(LogStr)
except Warning:
- print LogStr
+ print(LogStr)
finally:
return CycleRefExists
@@ -3660,7 +3661,7 @@ if __name__ == "__main__":
try:
test_file = sys.argv[1]
except IndexError as v:
- print "Usage: %s filename" % sys.argv[0]
+ print("Usage: %s filename" % sys.argv[0])
sys.exit(1)
parser = FdfParser(test_file)
@@ -3668,7 +3669,7 @@ if __name__ == "__main__":
parser.ParseFile()
parser.CycleReferenceCheck()
except Warning as X:
- print X.message
+ print(X.message)
else:
- print "Success!"
+ print("Success!")
diff --git a/BaseTools/Source/Python/Common/InfClassObject.py b/BaseTools/Source/Python/Common/InfClassObject.py
index f24e4e41a0c1..fe82ffd8eb4e 100644
--- a/BaseTools/Source/Python/Common/InfClassObject.py
+++ b/BaseTools/Source/Python/Common/InfClassObject.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import EdkLogger
@@ -447,79 +448,79 @@ class Inf(InfObject):
def ShowModule(self):
M = self.Module
for Arch in M.Header.keys():
- print '\nArch =', Arch
- print 'Filename =', M.Header[Arch].FileName
- print 'FullPath =', M.Header[Arch].FullPath
- print 'BaseName =', M.Header[Arch].Name
- print 'Guid =', M.Header[Arch].Guid
- print 'Version =', M.Header[Arch].Version
- print 'InfVersion =', M.Header[Arch].InfVersion
- print 'UefiSpecificationVersion =', M.Header[Arch].UefiSpecificationVersion
- print 'EdkReleaseVersion =', M.Header[Arch].EdkReleaseVersion
- print 'ModuleType =', M.Header[Arch].ModuleType
- print 'BinaryModule =', M.Header[Arch].BinaryModule
- print 'ComponentType =', M.Header[Arch].ComponentType
- print 'MakefileName =', M.Header[Arch].MakefileName
- print 'BuildNumber =', M.Header[Arch].BuildNumber
- print 'BuildType =', M.Header[Arch].BuildType
- print 'FfsExt =', M.Header[Arch].FfsExt
- print 'FvExt =', M.Header[Arch].FvExt
- print 'SourceFv =', M.Header[Arch].SourceFv
- print 'PcdIsDriver =', M.Header[Arch].PcdIsDriver
- print 'TianoEdkFlashMap_h =', M.Header[Arch].TianoEdkFlashMap_h
- print 'Shadow =', M.Header[Arch].Shadow
- print 'LibraryClass =', M.Header[Arch].LibraryClass
+ print('\nArch =', Arch)
+ print('Filename =', M.Header[Arch].FileName)
+ print('FullPath =', M.Header[Arch].FullPath)
+ print('BaseName =', M.Header[Arch].Name)
+ print('Guid =', M.Header[Arch].Guid)
+ print('Version =', M.Header[Arch].Version)
+ print('InfVersion =', M.Header[Arch].InfVersion)
+ print('UefiSpecificationVersion =', M.Header[Arch].UefiSpecificationVersion)
+ print('EdkReleaseVersion =', M.Header[Arch].EdkReleaseVersion)
+ print('ModuleType =', M.Header[Arch].ModuleType)
+ print('BinaryModule =', M.Header[Arch].BinaryModule)
+ print('ComponentType =', M.Header[Arch].ComponentType)
+ print('MakefileName =', M.Header[Arch].MakefileName)
+ print('BuildNumber =', M.Header[Arch].BuildNumber)
+ print('BuildType =', M.Header[Arch].BuildType)
+ print('FfsExt =', M.Header[Arch].FfsExt)
+ print('FvExt =', M.Header[Arch].FvExt)
+ print('SourceFv =', M.Header[Arch].SourceFv)
+ print('PcdIsDriver =', M.Header[Arch].PcdIsDriver)
+ print('TianoEdkFlashMap_h =', M.Header[Arch].TianoEdkFlashMap_h)
+ print('Shadow =', M.Header[Arch].Shadow)
+ print('LibraryClass =', M.Header[Arch].LibraryClass)
for Item in M.Header[Arch].LibraryClass:
- print Item.LibraryClass, DataType.TAB_VALUE_SPLIT.join(Item.SupModuleList)
- print 'CustomMakefile =', M.Header[Arch].CustomMakefile
- print 'Define =', M.Header[Arch].Define
- print 'Specification =', M.Header[Arch].Specification
+ print(Item.LibraryClass, DataType.TAB_VALUE_SPLIT.join(Item.SupModuleList))
+ print('CustomMakefile =', M.Header[Arch].CustomMakefile)
+ print('Define =', M.Header[Arch].Define)
+ print('Specification =', M.Header[Arch].Specification)
for Item in self.Module.ExternImages:
- print '\nEntry_Point = %s, UnloadImage = %s' % (Item.ModuleEntryPoint, Item.ModuleUnloadImage)
+ print('\nEntry_Point = %s, UnloadImage = %s' % (Item.ModuleEntryPoint, Item.ModuleUnloadImage))
for Item in self.Module.ExternLibraries:
- print 'Constructor = %s, Destructor = %s' % (Item.Constructor, Item.Destructor)
- print '\nBuildOptions =', M.BuildOptions
+ print('Constructor = %s, Destructor = %s' % (Item.Constructor, Item.Destructor))
+ print('\nBuildOptions =', M.BuildOptions)
for Item in M.BuildOptions:
- print Item.ToolChainFamily, Item.ToolChain, Item.Option, Item.SupArchList
- print '\nIncludes =', M.Includes
+ print(Item.ToolChainFamily, Item.ToolChain, Item.Option, Item.SupArchList)
+ print('\nIncludes =', M.Includes)
for Item in M.Includes:
- print Item.FilePath, Item.SupArchList
- print '\nLibraries =', M.Libraries
+ print(Item.FilePath, Item.SupArchList)
+ print('\nLibraries =', M.Libraries)
for Item in M.Libraries:
- print Item.Library, Item.SupArchList
- print '\nLibraryClasses =', M.LibraryClasses
+ print(Item.Library, Item.SupArchList)
+ print('\nLibraryClasses =', M.LibraryClasses)
for Item in M.LibraryClasses:
- print Item.LibraryClass, Item.RecommendedInstance, Item.FeatureFlag, Item.SupModuleList, Item.SupArchList, Item.Define
- print '\nPackageDependencies =', M.PackageDependencies
+ print(Item.LibraryClass, Item.RecommendedInstance, Item.FeatureFlag, Item.SupModuleList, Item.SupArchList, Item.Define)
+ print('\nPackageDependencies =', M.PackageDependencies)
for Item in M.PackageDependencies:
- print Item.FilePath, Item.SupArchList, Item.FeatureFlag
- print '\nNmake =', M.Nmake
+ print(Item.FilePath, Item.SupArchList, Item.FeatureFlag)
+ print('\nNmake =', M.Nmake)
for Item in M.Nmake:
- print Item.Name, Item.Value, Item.SupArchList
- print '\nPcds =', M.PcdCodes
+ print(Item.Name, Item.Value, Item.SupArchList)
+ print('\nPcds =', M.PcdCodes)
for Item in M.PcdCodes:
- print '\tCName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, Item.SupArchList
- print '\nSources =', M.Sources
+ print('\tCName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, Item.SupArchList)
+ print('\nSources =', M.Sources)
for Source in M.Sources:
- print Source.SourceFile, 'Fam=', Source.ToolChainFamily, 'Pcd=', Source.FeatureFlag, 'Tag=', Source.TagName, 'ToolCode=', Source.ToolCode, Source.SupArchList
- print '\nUserExtensions =', M.UserExtensions
+ print(Source.SourceFile, 'Fam=', Source.ToolChainFamily, 'Pcd=', Source.FeatureFlag, 'Tag=', Source.TagName, 'ToolCode=', Source.ToolCode, Source.SupArchList)
+ print('\nUserExtensions =', M.UserExtensions)
for UserExtension in M.UserExtensions:
- print UserExtension.UserID, UserExtension.Identifier, UserExtension.Content
- print '\nGuids =', M.Guids
+ print(UserExtension.UserID, UserExtension.Identifier, UserExtension.Content)
+ print('\nGuids =', M.Guids)
for Item in M.Guids:
- print Item.CName, Item.SupArchList, Item.FeatureFlag
- print '\nProtocols =', M.Protocols
+ print(Item.CName, Item.SupArchList, Item.FeatureFlag)
+ print('\nProtocols =', M.Protocols)
for Item in M.Protocols:
- print Item.CName, Item.SupArchList, Item.FeatureFlag
- print '\nPpis =', M.Ppis
+ print(Item.CName, Item.SupArchList, Item.FeatureFlag)
+ print('\nPpis =', M.Ppis)
for Item in M.Ppis:
- print Item.CName, Item.SupArchList, Item.FeatureFlag
- print '\nDepex =', M.Depex
+ print(Item.CName, Item.SupArchList, Item.FeatureFlag)
+ print('\nDepex =', M.Depex)
for Item in M.Depex:
- print Item.Depex, Item.SupArchList, Item.Define
- print '\nBinaries =', M.Binaries
+ print(Item.Depex, Item.SupArchList, Item.Define)
+ print('\nBinaries =', M.Binaries)
for Binary in M.Binaries:
- print 'Type=', Binary.FileType, 'Target=', Binary.Target, 'Name=', Binary.BinaryFile, 'FeatureFlag=', Binary.FeatureFlag, 'SupArchList=', Binary.SupArchList
+ print('Type=', Binary.FileType, 'Target=', Binary.Target, 'Name=', Binary.BinaryFile, 'FeatureFlag=', Binary.FeatureFlag, 'SupArchList=', Binary.SupArchList)
## Convert [Defines] section content to ModuleHeaderClass
#
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 10b6ac55242b..ee33ae3d3266 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -12,6 +12,7 @@
# # Import Modules
#
+from __future__ import print_function
from Common.GlobalData import *
from CommonDataClass.Exceptions import BadExpression
from CommonDataClass.Exceptions import WrnExpression
@@ -93,11 +94,11 @@ class RangeContainer(object):
self.__clean__()
def dump(self):
- print "----------------------"
+ print("----------------------")
rangelist = ""
for object in self.rangelist:
rangelist = rangelist + "[%d , %d]" % (object.start, object.end)
- print rangelist
+ print(rangelist)
class XOROperatorObject(object):
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 387e51523097..3408cff8d75e 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import EdkLogger
import DataType
@@ -148,7 +149,7 @@ class TargetTxtClassObject(object):
KeyList = Dict.keys()
for Key in KeyList:
if Dict[Key] != '':
- print Key + ' = ' + str(Dict[Key])
+ print(Key + ' = ' + str(Dict[Key]))
## Print the dictionary
#
@@ -161,9 +162,9 @@ class TargetTxtClassObject(object):
if type(List) == type([]):
if len(List) > 0:
if Key.find(TAB_SPLIT) != -1:
- print "\n" + Key
+ print("\n" + Key)
for Item in List:
- print Item
+ print(Item)
## TargetTxtDict
#
# Load target.txt in input Conf dir
@@ -185,6 +186,6 @@ def TargetTxtDict(ConfDir):
if __name__ == '__main__':
pass
Target = TargetTxtDict(os.getenv("WORKSPACE"))
- print Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
- print Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET]
- print Target.TargetTxtDictionary
+ print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER])
+ print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET])
+ print(Target.TargetTxtDictionary)
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 14ccabe833db..a6c1fb70bd7d 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -15,6 +15,7 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import Common.EdkLogger as EdkLogger
@@ -249,7 +250,7 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
except Exception as X:
EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData="%s" % (str(X)))
(out, error) = PopenObject.communicate()
- print out
+ print(out)
while PopenObject.returncode == None :
PopenObject.wait()
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index 18a7ff055740..2df8fc3e0c26 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -1,3 +1,4 @@
+from __future__ import print_function
# $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
from antlr3 import *
@@ -102,7 +103,7 @@ class CParser(Parser):
self.postfix_expression_stack = []
def printTokenInfo(self, line, offset, tokenText):
- print str(line)+ ',' + str(offset) + ':' + str(tokenText)
+ print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index 171600feebf9..7bdb3cc3aea5 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -16,6 +16,7 @@
# Import Modules
#
+from __future__ import print_function
import re
import Common.LongFilePathOs as os
import sys
@@ -567,58 +568,58 @@ class CodeFragmentCollector:
def PrintFragments(self):
- print '################# ' + self.FileName + '#####################'
+ print('################# ' + self.FileName + '#####################')
- print '/****************************************/'
- print '/*************** COMMENTS ***************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/*************** COMMENTS ***************/')
+ print('/****************************************/')
for comment in FileProfile.CommentList:
- print str(comment.StartPos) + comment.Content
+ print(str(comment.StartPos) + comment.Content)
- print '/****************************************/'
- print '/********* PREPROCESS DIRECTIVES ********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* PREPROCESS DIRECTIVES ********/')
+ print('/****************************************/')
for pp in FileProfile.PPDirectiveList:
- print str(pp.StartPos) + pp.Content
+ print(str(pp.StartPos) + pp.Content)
- print '/****************************************/'
- print '/********* VARIABLE DECLARATIONS ********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* VARIABLE DECLARATIONS ********/')
+ print('/****************************************/')
for var in FileProfile.VariableDeclarationList:
- print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
+ print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
- print '/****************************************/'
- print '/********* FUNCTION DEFINITIONS *********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* FUNCTION DEFINITIONS *********/')
+ print('/****************************************/')
for func in FileProfile.FunctionDefinitionList:
- print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
+ print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
- print '/****************************************/'
- print '/************ ENUMERATIONS **************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/************ ENUMERATIONS **************/')
+ print('/****************************************/')
for enum in FileProfile.EnumerationDefinitionList:
- print str(enum.StartPos) + enum.Content
+ print(str(enum.StartPos) + enum.Content)
- print '/****************************************/'
- print '/*********** STRUCTS/UNIONS *************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/*********** STRUCTS/UNIONS *************/')
+ print('/****************************************/')
for su in FileProfile.StructUnionDefinitionList:
- print str(su.StartPos) + su.Content
+ print(str(su.StartPos) + su.Content)
- print '/****************************************/'
- print '/********* PREDICATE EXPRESSIONS ********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* PREDICATE EXPRESSIONS ********/')
+ print('/****************************************/')
for predexp in FileProfile.PredicateExpressionList:
- print str(predexp.StartPos) + predexp.Content
+ print(str(predexp.StartPos) + predexp.Content)
- print '/****************************************/'
- print '/************** TYPEDEFS ****************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/************** TYPEDEFS ****************/')
+ print('/****************************************/')
for typedef in FileProfile.TypedefDefinitionList:
- print str(typedef.StartPos) + typedef.ToType
+ print(str(typedef.StartPos) + typedef.ToType)
if __name__ == "__main__":
collector = CodeFragmentCollector(sys.argv[1])
collector.PreprocessFile()
- print "For Test."
+ print("For Test.")
diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index b523858e1b1f..c3bbba09b744 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import Common.EdkLogger as EdkLogger
from Common.DataType import *
@@ -315,6 +316,6 @@ class Configuration(object):
self.__dict__[List[0]] = List[1]
def ShowMe(self):
- print self.Filename
+ print(self.Filename)
for Key in self.__dict__.keys():
- print Key, '=', self.__dict__[Key]
+ print(Key, '=', self.__dict__[Key])
diff --git a/BaseTools/Source/Python/Ecc/Exception.py b/BaseTools/Source/Python/Ecc/Exception.py
index b0882afa6289..bde41c3a4b57 100644
--- a/BaseTools/Source/Python/Ecc/Exception.py
+++ b/BaseTools/Source/Python/Ecc/Exception.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
from Xml.XmlRoutines import *
import Common.LongFilePathOs as os
@@ -84,4 +85,4 @@ class ExceptionCheck(object):
#
if __name__ == '__main__':
El = ExceptionCheck('C:\\Hess\\Project\\BuildTool\\src\\Ecc\\exception.xml')
- print El.ExceptionList
+ print(El.ExceptionList)
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index a4057ceb1775..5bb7759e2120 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import Common.EdkLogger as EdkLogger
@@ -99,7 +100,7 @@ class Table(object):
try:
self.Cur.execute(SqlCommand)
except Exception as e:
- print "An error occurred when Drop a table:", e.args[0]
+ print("An error occurred when Drop a table:", e.args[0])
## Get count
#
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index 4ce8edf5573a..eb76f4e6d54a 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import xml.dom.minidom
from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -215,7 +216,7 @@ def XmlParseFile(FileName):
XmlFile.close()
return Dom
except Exception as X:
- print X
+ print(X)
return ""
# This acts like the main() function for the script, unless it is 'import'ed
@@ -225,5 +226,5 @@ if __name__ == '__main__':
A = CreateXmlElement('AAA', 'CCC', [['AAA', '111'], ['BBB', '222']], [['A', '1'], ['B', '2']])
B = CreateXmlElement('ZZZ', 'CCC', [['XXX', '111'], ['YYY', '222']], [['A', '1'], ['B', '2']])
C = CreateXmlList('DDD', 'EEE', [A, B], ['FFF', 'GGG'])
- print C.toprettyxml(indent = " ")
+ print(C.toprettyxml(indent = " "))
pass
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 8a4b10727a07..7f83387c08c8 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -11,6 +11,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import print_function
import sys
import Common.LongFilePathOs as os
import re
@@ -2279,7 +2280,7 @@ def CheckDoxygenTripleForwardSlash(FullFileName):
for Result in ResultSet:
CommentSet.append(Result)
except:
- print 'Unrecognized chars in comment of file %s', FullFileName
+ print('Unrecognized chars in comment of file %s', FullFileName)
for Result in CommentSet:
@@ -2432,7 +2433,7 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
for Result in ResultSet:
CommentSet.append(Result)
except:
- print 'Unrecognized chars in comment of file %s', FullFileName
+ print('Unrecognized chars in comment of file %s', FullFileName)
# Func Decl check
SqlStatement = """ select Modifier, Name, StartLine, ID, Value
@@ -2463,7 +2464,7 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
for Result in ResultSet:
CommentSet.append(Result)
except:
- print 'Unrecognized chars in comment of file %s', FullFileName
+ print('Unrecognized chars in comment of file %s', FullFileName)
SqlStatement = """ select Modifier, Header, StartLine, ID, Name
from Function
@@ -2628,9 +2629,9 @@ if __name__ == '__main__':
try:
test_file = sys.argv[1]
except IndexError as v:
- print "Usage: %s filename" % sys.argv[0]
+ print("Usage: %s filename" % sys.argv[0])
sys.exit(1)
MsgList = CheckFuncHeaderDoxygenComments(test_file)
for Msg in MsgList:
- print Msg
- print 'Done!'
+ print(Msg)
+ print('Done!')
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index 18a7ff055740..2df8fc3e0c26 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -1,3 +1,4 @@
+from __future__ import print_function
# $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
from antlr3 import *
@@ -102,7 +103,7 @@ class CParser(Parser):
self.postfix_expression_stack = []
def printTokenInfo(self, line, offset, tokenText):
- print str(line)+ ',' + str(offset) + ':' + str(tokenText)
+ print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
diff --git a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
index bb78a0f882d5..5d5336bee463 100644
--- a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import re
import Common.LongFilePathOs as os
import sys
@@ -413,49 +414,49 @@ class CodeFragmentCollector:
#
def PrintFragments(self):
- print '################# ' + self.FileName + '#####################'
+ print('################# ' + self.FileName + '#####################')
- print '/****************************************/'
- print '/*************** ASSIGNMENTS ***************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/*************** ASSIGNMENTS ***************/')
+ print('/****************************************/')
for asign in FileProfile.AssignmentExpressionList:
- print str(asign.StartPos) + asign.Name + asign.Operator + asign.Value
+ print(str(asign.StartPos) + asign.Name + asign.Operator + asign.Value)
- print '/****************************************/'
- print '/********* PREPROCESS DIRECTIVES ********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* PREPROCESS DIRECTIVES ********/')
+ print('/****************************************/')
for pp in FileProfile.PPDirectiveList:
- print str(pp.StartPos) + pp.Content
+ print(str(pp.StartPos) + pp.Content)
- print '/****************************************/'
- print '/********* VARIABLE DECLARATIONS ********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* VARIABLE DECLARATIONS ********/')
+ print('/****************************************/')
for var in FileProfile.VariableDeclarationList:
- print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
+ print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
- print '/****************************************/'
- print '/********* FUNCTION DEFINITIONS *********/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/********* FUNCTION DEFINITIONS *********/')
+ print('/****************************************/')
for func in FileProfile.FunctionDefinitionList:
- print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
+ print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
- print '/****************************************/'
- print '/************ ENUMERATIONS **************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/************ ENUMERATIONS **************/')
+ print('/****************************************/')
for enum in FileProfile.EnumerationDefinitionList:
- print str(enum.StartPos) + enum.Content
+ print(str(enum.StartPos) + enum.Content)
- print '/****************************************/'
- print '/*********** STRUCTS/UNIONS *************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/*********** STRUCTS/UNIONS *************/')
+ print('/****************************************/')
for su in FileProfile.StructUnionDefinitionList:
- print str(su.StartPos) + su.Content
+ print(str(su.StartPos) + su.Content)
- print '/****************************************/'
- print '/************** TYPEDEFS ****************/'
- print '/****************************************/'
+ print('/****************************************/')
+ print('/************** TYPEDEFS ****************/')
+ print('/****************************************/')
for typedef in FileProfile.TypedefDefinitionList:
- print str(typedef.StartPos) + typedef.ToType
+ print(str(typedef.StartPos) + typedef.ToType)
##
#
@@ -464,4 +465,4 @@ class CodeFragmentCollector:
#
if __name__ == "__main__":
- print "For Test."
+ print("For Test.")
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 6696623aba68..9d8f0864dc41 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -13,6 +13,7 @@
## Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import sys
@@ -1190,17 +1191,17 @@ class PeImage:
self.Machine, self.NumberOfSections, self.SizeOfOptionalHeader = \
self._FileHeader.unpack_from(self._PeImageBuf, self.Offset + FileHeaderOffset)
- print "Machine=%x NumberOfSections=%x SizeOfOptionalHeader=%x" % (self.Machine, self.NumberOfSections, self.SizeOfOptionalHeader)
+ print("Machine=%x NumberOfSections=%x SizeOfOptionalHeader=%x" % (self.Machine, self.NumberOfSections, self.SizeOfOptionalHeader))
# optional header follows the FILE header
OptionalHeaderOffset = FileHeaderOffset + struct.calcsize(self._FileHeaderFormat)
Magic, self.SizeOfImage, SizeOfHeaders, self.Checksum, NumberOfRvaAndSizes = \
self._OptionalHeader32.unpack_from(self._PeImageBuf, self.Offset + OptionalHeaderOffset)
- print "Magic=%x SizeOfImage=%x SizeOfHeaders=%x, Checksum=%x, NumberOfRvaAndSizes=%x" % (Magic, self.SizeOfImage, SizeOfHeaders, self.Checksum, NumberOfRvaAndSizes)
+ print("Magic=%x SizeOfImage=%x SizeOfHeaders=%x, Checksum=%x, NumberOfRvaAndSizes=%x" % (Magic, self.SizeOfImage, SizeOfHeaders, self.Checksum, NumberOfRvaAndSizes))
PeImageSectionTableOffset = OptionalHeaderOffset + self.SizeOfOptionalHeader
PeSections = PeSectionTable(self._PeImageBuf, self.Offset + PeImageSectionTableOffset, self.NumberOfSections)
- print "%x" % PeSections.GetFileAddress(0x3920)
+ print("%x" % PeSections.GetFileAddress(0x3920))
## PeSectionTable() class
#
@@ -1215,7 +1216,7 @@ class PeSectionTable:
SectionHeader = PeSectionHeader(Buf, SectionHeaderOffset)
self._SectionList.append(SectionHeader)
SectionHeaderOffset += len(SectionHeader)
- print SectionHeader
+ print(SectionHeader)
def GetFileAddress(self, Rva):
for PeSection in self._SectionList:
@@ -1412,7 +1413,7 @@ def Main():
Option = GetOptions()
build.main()
except Exception as e:
- print e
+ print(e)
return 1
return 0
@@ -1435,7 +1436,7 @@ if __name__ == '__main__':
fv = FirmwareVolume("FVRECOVERY")
fv.frombuffer(buf, 0, len(buf))
#fv.Dispatch(None)
- print fv
+ print(fv)
elif FilePath.endswith(".efi"):
fd = open(FilePath, 'rb')
buf = array('B')
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index 6bb2c5f9f1d6..f624837f2587 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import Common.EdkLogger as EdkLogger
from Common.DataType import *
@@ -164,8 +165,8 @@ if __name__ == '__main__':
Db.InitDatabase()
P = EdkInfParser(os.path.normpath("C:\Framework\Edk\Sample\Platform\Nt32\Dxe\PlatformBds\PlatformBds.inf"), Db, '', '')
for Inf in P.Sources:
- print Inf
+ print(Inf)
for Item in P.Macros:
- print Item, P.Macros[Item]
+ print(Item, P.Macros[Item])
- Db.Close()
\ No newline at end of file
+ Db.Close()
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index 8199ce5ee73e..c70f62f393a9 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import sys
import Common.LongFilePathOs as os
import re
@@ -384,4 +385,4 @@ if __name__ == '__main__':
EdkLogger.SetLevel(EdkLogger.QUIET)
CollectSourceCodeDataIntoDB(sys.argv[1])
- print 'Done!'
+ print('Done!')
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 15b2b792b2e1..d4ba485bcdff 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -16,6 +16,7 @@
##
# Import Modules
#
+from __future__ import print_function
import re
import Fd
@@ -4818,7 +4819,7 @@ if __name__ == "__main__":
try:
test_file = sys.argv[1]
except IndexError as v:
- print "Usage: %s filename" % sys.argv[0]
+ print("Usage: %s filename" % sys.argv[0])
sys.exit(1)
parser = FdfParser(test_file)
@@ -4826,7 +4827,7 @@ if __name__ == "__main__":
parser.ParseFile()
parser.CycleReferenceCheck()
except Warning as X:
- print str(X)
+ print(str(X))
else:
- print "Success!"
+ print("Success!")
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index bc2bb407560f..4415b44ef77c 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
from optparse import OptionParser
import sys
import Common.LongFilePathOs as os
@@ -743,7 +744,7 @@ class GenFds :
ModuleDict = BuildDb.BuildObject[DscFile, 'COMMON', GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Modules
for Key in ModuleDict:
ModuleObj = BuildDb.BuildObject[Key, 'COMMON', GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
- print ModuleObj.BaseName + ' ' + ModuleObj.ModuleType
+ print(ModuleObj.BaseName + ' ' + ModuleObj.ModuleType)
def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 1a5ef92afc1c..393820651a11 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import sys
import subprocess
@@ -737,7 +738,7 @@ class GenFdsGlobalVariable:
GenFdsGlobalVariable.InfLogger (out)
GenFdsGlobalVariable.InfLogger (error)
if PopenObject.returncode != 0:
- print "###", cmd
+ print("###", cmd)
EdkLogger.error("GenFds", COMMAND_FAILURE, errorMess)
def VerboseLogger (msg):
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index fdad5a44dc3d..127385228fcf 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -17,6 +17,7 @@
#
#====================================== External Libraries ========================================
+from __future__ import print_function
import optparse
import Common.LongFilePathOs as os
import re
@@ -215,7 +216,7 @@ if __name__ == '__main__':
(options, args) = parser.parse_args()
if options.mapfile == None or options.efifile == None:
- print parser.get_usage()
+ print(parser.get_usage())
elif os.path.exists(options.mapfile) and os.path.exists(options.efifile):
list = parsePcdInfoFromMapFile(options.mapfile, options.efifile)
if list != None:
@@ -224,6 +225,6 @@ if __name__ == '__main__':
else:
generatePcdTable(list, options.mapfile.replace('.map', '.BinaryPcdTable.txt'))
else:
- print 'Fail to generate Patch PCD Table based on map file and efi file'
+ print('Fail to generate Patch PCD Table based on map file and efi file')
else:
- print 'Fail to generate Patch PCD Table for fail to find map file or efi file!'
+ print('Fail to generate Patch PCD Table for fail to find map file or efi file!')
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index de8575676cac..4f79d0f82967 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -19,6 +19,7 @@
'''
Pkcs7Sign
'''
+from __future__ import print_function
import os
import sys
@@ -113,14 +114,14 @@ if __name__ == '__main__':
try:
Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
except:
- print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
+ print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(1)
Version = Process.communicate()
if Process.returncode <> 0:
- print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
+ print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(Process.returncode)
- print Version[0]
+ print(Version[0])
#
# Read input file into a buffer and save input filename
@@ -134,7 +135,7 @@ if __name__ == '__main__':
#
OutputDir = os.path.dirname(args.OutputFile)
if not os.path.exists(OutputDir):
- print 'ERROR: The output path does not exist: %s' % OutputDir
+ print('ERROR: The output path does not exist: %s' % OutputDir)
sys.exit(1)
args.OutputFileName = args.OutputFile
@@ -170,7 +171,7 @@ if __name__ == '__main__':
args.SignerPrivateCertFile = open(args.SignerPrivateCertFileName, 'rb')
args.SignerPrivateCertFile.close()
except:
- print 'ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName)
+ print('ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName))
sys.exit(1)
#
@@ -196,7 +197,7 @@ if __name__ == '__main__':
args.OtherPublicCertFile = open(args.OtherPublicCertFileName, 'rb')
args.OtherPublicCertFile.close()
except:
- print 'ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName)
+ print('ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName))
sys.exit(1)
format = "%dsQ" % len(args.InputFileBuffer)
@@ -242,11 +243,11 @@ if __name__ == '__main__':
args.TrustedPublicCertFile = open(args.TrustedPublicCertFileName, 'rb')
args.TrustedPublicCertFile.close()
except:
- print 'ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName)
+ print('ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName))
sys.exit(1)
if not args.SignatureSizeStr:
- print "ERROR: please use the option --signature-size to specify the size of the signature data!"
+ print("ERROR: please use the option --signature-size to specify the size of the signature data!")
sys.exit(1)
else:
if args.SignatureSizeStr.upper().startswith('0X'):
@@ -254,10 +255,10 @@ if __name__ == '__main__':
else:
SignatureSize = (long)(args.SignatureSizeStr)
if SignatureSize < 0:
- print "ERROR: The value of option --signature-size can't be set to negative value!"
+ print("ERROR: The value of option --signature-size can't be set to negative value!")
sys.exit(1)
elif SignatureSize > len(args.InputFileBuffer):
- print "ERROR: The value of option --signature-size is exceed the size of the input file !"
+ print("ERROR: The value of option --signature-size is exceed the size of the input file !")
sys.exit(1)
args.SignatureBuffer = args.InputFileBuffer[0:SignatureSize]
@@ -277,7 +278,7 @@ if __name__ == '__main__':
Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate(input=args.SignatureBuffer)[0]
if Process.returncode <> 0:
- print 'ERROR: Verification failed'
+ print('ERROR: Verification failed')
os.remove (args.OutputFileName)
sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 95a636966c59..06ed2610271f 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -22,6 +22,7 @@
'''
Rsa2048Sha256GenerateKeys
'''
+from __future__ import print_function
import os
import sys
@@ -75,14 +76,14 @@ if __name__ == '__main__':
try:
Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
except:
- print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
+ print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(1)
Version = Process.communicate()
if Process.returncode <> 0:
- print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
+ print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(Process.returncode)
- print Version[0]
+ print(Version[0])
args.PemFileName = []
@@ -103,7 +104,7 @@ if __name__ == '__main__':
Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate()
if Process.returncode <> 0:
- print 'ERROR: RSA 2048 key generation failed'
+ print('ERROR: RSA 2048 key generation failed')
sys.exit(Process.returncode)
#
@@ -125,7 +126,7 @@ if __name__ == '__main__':
Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
if Process.returncode <> 0:
- print 'ERROR: Unable to extract public key from private key'
+ print('ERROR: Unable to extract public key from private key')
sys.exit(Process.returncode)
PublicKey = ''
for Index in range (0, len(PublicKeyHexString), 2):
@@ -138,7 +139,7 @@ if __name__ == '__main__':
Process.stdin.write (PublicKey)
PublicKeyHash = PublicKeyHash + Process.communicate()[0]
if Process.returncode <> 0:
- print 'ERROR: Unable to extract SHA 256 hash of public key'
+ print('ERROR: Unable to extract SHA 256 hash of public key')
sys.exit(Process.returncode)
#
@@ -171,4 +172,4 @@ if __name__ == '__main__':
# If verbose is enabled display the public key in C structure format
#
if args.Verbose:
- print 'PublicKeySha256 = ' + PublicKeyHashC
+ print('PublicKeySha256 = ' + PublicKeyHashC)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 1ae6ebb35886..99a5d8aa5a01 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -17,6 +17,7 @@
'''
Rsa2048Sha256Sign
'''
+from __future__ import print_function
import os
import sys
@@ -96,14 +97,14 @@ if __name__ == '__main__':
try:
Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
except:
- print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
+ print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(1)
Version = Process.communicate()
if Process.returncode <> 0:
- print 'ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH'
+ print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(Process.returncode)
- print Version[0]
+ print(Version[0])
#
# Read input file into a buffer and save input filename
@@ -117,7 +118,7 @@ if __name__ == '__main__':
#
OutputDir = os.path.dirname(args.OutputFile)
if not os.path.exists(OutputDir):
- print 'ERROR: The output path does not exist: %s' % OutputDir
+ print('ERROR: The output path does not exist: %s' % OutputDir)
sys.exit(1)
args.OutputFileName = args.OutputFile
@@ -144,7 +145,7 @@ if __name__ == '__main__':
args.PrivateKeyFile = open(args.PrivateKeyFileName, 'rb')
args.PrivateKeyFile.close()
except:
- print 'ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName)
+ print('ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName))
sys.exit(1)
#
@@ -202,14 +203,14 @@ if __name__ == '__main__':
# Verify that the Hash Type matches the expected SHA256 type
#
if uuid.UUID(bytes_le = Header.HashType) <> EFI_HASH_ALGORITHM_SHA256_GUID:
- print 'ERROR: unsupport hash GUID'
+ print('ERROR: unsupport hash GUID')
sys.exit(1)
#
# Verify the public key
#
if Header.PublicKey <> PublicKey:
- print 'ERROR: Public key in input file does not match public key from private key file'
+ print('ERROR: Public key in input file does not match public key from private key file')
sys.exit(1)
FullInputFileBuffer = args.InputFileBuffer
@@ -228,7 +229,7 @@ if __name__ == '__main__':
Process = subprocess.Popen('%s sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate(input=FullInputFileBuffer)
if Process.returncode <> 0:
- print 'ERROR: Verification failed'
+ print('ERROR: Verification failed')
os.remove (args.OutputFileName)
sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index 882b016bf058..ebed7a0ea7b8 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -12,6 +12,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import sys
import traceback
@@ -32,7 +33,7 @@ class TargetTool():
self.Arg = args[0]
self.FileName = os.path.normpath(os.path.join(self.WorkSpace, 'Conf', 'target.txt'))
if os.path.isfile(self.FileName) == False:
- print "%s does not exist." % self.FileName
+ print("%s does not exist." % self.FileName)
sys.exit(1)
self.TargetTxtDictionary = {
TAB_TAT_DEFINES_ACTIVE_PLATFORM : None,
@@ -84,14 +85,14 @@ class TargetTool():
errMsg = ''
for Key in KeyList:
if type(self.TargetTxtDictionary[Key]) == type([]):
- print "%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key]))
+ print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
elif self.TargetTxtDictionary[Key] == None:
errMsg += " Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep
else:
- print "%-30s = %s" % (Key, self.TargetTxtDictionary[Key])
+ print("%-30s = %s" % (Key, self.TargetTxtDictionary[Key]))
if errMsg != '':
- print os.linesep + 'Warning:' + os.linesep + errMsg
+ print(os.linesep + 'Warning:' + os.linesep + errMsg)
def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
try:
@@ -110,7 +111,7 @@ class TargetTool():
if Key not in existKeys:
existKeys.append(Key)
else:
- print "Warning: Found duplicate key item in original configuration files!"
+ print("Warning: Found duplicate key item in original configuration files!")
if Num == 0:
Line = "%-30s = \n" % Key
@@ -121,7 +122,7 @@ class TargetTool():
fw.write(Line)
for key in self.TargetTxtDictionary.keys():
if key not in existKeys:
- print "Warning: %s does not exist in original configuration file" % key
+ print("Warning: %s does not exist in original configuration file" % key)
Line = GetConfigureKeyValue(self, key)
if Line == None:
Line = "%-30s = " % key
@@ -224,25 +225,25 @@ if __name__ == '__main__':
EdkLogger.Initialize()
EdkLogger.SetLevel(EdkLogger.QUIET)
if os.getenv('WORKSPACE') == None:
- print "ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool"
+ print("ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool")
sys.exit(1)
(opt, args) = MyOptionParser()
if len(args) != 1 or (args[0].lower() != 'print' and args[0].lower() != 'clean' and args[0].lower() != 'set'):
- print "The number of args isn't 1 or the value of args is invalid."
+ print("The number of args isn't 1 or the value of args is invalid.")
sys.exit(1)
if opt.NUM != None and opt.NUM < 1:
- print "The MAX_CONCURRENT_THREAD_NUMBER must be larger than 0."
+ print("The MAX_CONCURRENT_THREAD_NUMBER must be larger than 0.")
sys.exit(1)
if opt.TARGET != None and len(opt.TARGET) > 1:
for elem in opt.TARGET:
if elem == '0':
- print "0 will clear the TARGET setting in target.txt and can't combine with other value."
+ print("0 will clear the TARGET setting in target.txt and can't combine with other value.")
sys.exit(1)
if opt.TARGET_ARCH != None and len(opt.TARGET_ARCH) > 1:
for elem in opt.TARGET_ARCH:
if elem == '0':
- print "0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value."
+ print("0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value.")
sys.exit(1)
try:
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index ca21e6995217..afa5b2407ec5 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -14,6 +14,7 @@
'''
ExpressionValidate
'''
+from __future__ import print_function
##
# Import Modules
@@ -566,7 +567,7 @@ def IsValidFeatureFlagExp(Token, Flag=False):
if __name__ == '__main__':
# print IsValidRangeExpr('LT 9')
- print _LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression()
+ print(_LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression())
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index b00bba1f8440..84958ae38cef 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -14,6 +14,7 @@
"""
Collect all defined strings in multiple uni files
"""
+from __future__ import print_function
##
# Import Modules
@@ -748,7 +749,7 @@ class UniFileClassObject(object):
EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
NewLines.append(Line)
else:
- print Line
+ print(Line)
EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
if StrName and not StrName.split()[1].startswith(u'STR_'):
@@ -1040,12 +1041,12 @@ class UniFileClassObject(object):
# Show the instance itself
#
def ShowMe(self):
- print self.LanguageDef
+ print(self.LanguageDef)
#print self.OrderedStringList
for Item in self.OrderedStringList:
- print Item
+ print(Item)
for Member in self.OrderedStringList[Item]:
- print str(Member)
+ print(str(Member))
#
# Read content from '!include' UNI file
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 436dc90e6dd3..074aa311f31d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -15,6 +15,7 @@
'''
DecPomAlignment
'''
+from __future__ import print_function
##
# Import Modules
@@ -902,47 +903,47 @@ class DecPomAlignment(PackageObject):
# Print all members and their values of Package class
#
def ShowPackage(self):
- print '\nName =', self.GetName()
- print '\nBaseName =', self.GetBaseName()
- print '\nVersion =', self.GetVersion()
- print '\nGuid =', self.GetGuid()
+ print('\nName =', self.GetName())
+ print('\nBaseName =', self.GetBaseName())
+ print('\nVersion =', self.GetVersion())
+ print('\nGuid =', self.GetGuid())
- print '\nStandardIncludes = %d ' \
- % len(self.GetStandardIncludeFileList()),
+ print('\nStandardIncludes = %d ' \
+ % len(self.GetStandardIncludeFileList()), end=' ')
for Item in self.GetStandardIncludeFileList():
- print Item.GetFilePath(), ' ', Item.GetSupArchList()
- print '\nPackageIncludes = %d \n' \
- % len(self.GetPackageIncludeFileList()),
+ print(Item.GetFilePath(), ' ', Item.GetSupArchList())
+ print('\nPackageIncludes = %d \n' \
+ % len(self.GetPackageIncludeFileList()), end=' ')
for Item in self.GetPackageIncludeFileList():
- print Item.GetFilePath(), ' ', Item.GetSupArchList()
+ print(Item.GetFilePath(), ' ', Item.GetSupArchList())
- print '\nGuids =', self.GetGuidList()
+ print('\nGuids =', self.GetGuidList())
for Item in self.GetGuidList():
- print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
- print '\nProtocols =', self.GetProtocolList()
+ print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+ print('\nProtocols =', self.GetProtocolList())
for Item in self.GetProtocolList():
- print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
- print '\nPpis =', self.GetPpiList()
+ print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+ print('\nPpis =', self.GetPpiList())
for Item in self.GetPpiList():
- print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
- print '\nLibraryClasses =', self.GetLibraryClassList()
+ print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+ print('\nLibraryClasses =', self.GetLibraryClassList())
for Item in self.GetLibraryClassList():
- print Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
- Item.GetSupArchList()
- print '\nPcds =', self.GetPcdList()
+ print(Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
+ Item.GetSupArchList())
+ print('\nPcds =', self.GetPcdList())
for Item in self.GetPcdList():
- print 'CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
+ print('CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
Item.GetTokenSpaceGuidCName(), \
'DefaultValue=', Item.GetDefaultValue(), \
'ValidUsage=', Item.GetValidUsage(), \
'SupArchList', Item.GetSupArchList(), \
- 'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType()
+ 'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType())
for Item in self.GetMiscFileList():
- print Item.GetName()
+ print(Item.GetName())
for FileObjectItem in Item.GetFileList():
- print FileObjectItem.GetURI()
- print '****************\n'
+ print(FileObjectItem.GetURI())
+ print('****************\n')
## GenPcdDeclaration
#
diff --git a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
index 8b4ece2617a1..5f0abcafef27 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
@@ -11,6 +11,7 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
+from __future__ import print_function
import os
import unittest
@@ -66,7 +67,7 @@ def TestTemplate(TestString, TestFunc):
# Close file
f.close()
except:
- print 'Can not create temporary file [%s]!' % Path
+ print('Can not create temporary file [%s]!' % Path)
exit(-1)
# Call test function to test
@@ -279,6 +280,6 @@ if __name__ == '__main__':
unittest.FunctionTestCase(TestDecPcd).runTest()
unittest.FunctionTestCase(TestDecUserExtension).runTest()
- print 'All tests passed...'
+ print('All tests passed...')
diff --git a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
index f3b43ee0bc27..626f17426de7 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
@@ -11,6 +11,7 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
+from __future__ import print_function
import os
#import Object.Parser.InfObject as InfObject
from Object.Parser.InfCommonObject import CurrentLine
@@ -271,7 +272,7 @@ def PrepareTest(String):
TempFile = open (FileName, "w")
TempFile.close()
except:
- print "File Create Error"
+ print("File Create Error")
CurrentLine = CurrentLine()
CurrentLine.SetFileName("Test")
CurrentLine.SetLineString(Item[0])
@@ -376,11 +377,11 @@ if __name__ == '__main__':
try:
InfBinariesInstance.SetBinary(Ver = Ver, ArchList = ArchList)
except:
- print "Test Failed!"
+ print("Test Failed!")
AllPassedFlag = False
if AllPassedFlag :
- print 'All tests passed...'
+ print('All tests passed...')
else:
- print 'Some unit test failed!'
+ print('Some unit test failed!')
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 480ec3e6cfce..4d0a7a30ccce 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -17,6 +17,7 @@
# This class is used to retrieve information stored in database and convert them
# into PlatformBuildClassObject form for easier use for AutoGen.
#
+from __future__ import print_function
from Common.String import *
from Common.DataType import *
from Common.Misc import *
@@ -1071,9 +1072,9 @@ class DscBuildData(PlatformBuildClassObject):
for skuid in pcdobj.SkuInfoList:
if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]):
for storename in pcdobj.SkuInfoList[skuid].DefaultStoreDict:
- print "PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename]))
+ print("PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename])))
else:
- print "PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuInfoList[skuid].DefaultValue))
+ print("PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuInfoList[skuid].DefaultValue)))
## Retrieve [BuildOptions]
def _GetBuildOptions(self):
if self._BuildOptions == None:
@@ -1280,7 +1281,7 @@ class DscBuildData(PlatformBuildClassObject):
for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Values:
str_pcd_obj = S_pcd_set.get((PcdName, PcdGuid))
if str_pcd_obj is None:
- print PcdName, PcdGuid
+ print(PcdName, PcdGuid)
raise
if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
@@ -1457,10 +1458,10 @@ class DscBuildData(PlatformBuildClassObject):
if Value[0] == '{' and Value[-1] == '}':
return True
if Value.startswith("L'") and Value.endswith("'") and len(list(Value[2:-1])) > 1:
- print 'foo = ', list(Value[2:-1])
+ print('foo = ', list(Value[2:-1]))
return True
if Value[0] == "'" and Value[-1] == "'" and len(list(Value[1:-1])) > 1:
- print 'bar = ', list(Value[1:-1])
+ print('bar = ', list(Value[1:-1]))
return True
return False
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 17b7e7e1bd62..9bcb017c0c45 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import time
@@ -1607,7 +1608,7 @@ class DscParser(MetaFileParser):
try:
self._ValueList[2] = '|'.join(ValList)
except Exception:
- print ValList
+ print(ValList)
def __ProcessComponent(self):
self._ValueList[0] = ReplaceMacro(self._ValueList[0], self._Macros)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index b8dc20b1fd22..68dca8e21524 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -16,6 +16,7 @@
##
# Import Modules
#
+from __future__ import print_function
import Common.LongFilePathOs as os
import re
import StringIO
@@ -2190,7 +2191,7 @@ class Build():
toolsFile = os.path.join(FvDir, 'GuidedSectionTools.txt')
toolsFile = open(toolsFile, 'wt')
for guidedSectionTool in guidAttribs:
- print >> toolsFile, ' '.join(guidedSectionTool)
+ print(' '.join(guidedSectionTool), file=toolsFile)
toolsFile.close()
## Returns the full path of the tool.
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 27afd79f2094..c52b8bd94234 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import base64
import os
import os.path
@@ -91,9 +92,9 @@ class BaseToolsTest(unittest.TestCase):
os.remove(path)
def DisplayBinaryData(self, description, data):
- print description, '(base64 encoded):'
+ print(description, '(base64 encoded):')
b64data = base64.b64encode(data)
- print b64data
+ print(b64data)
def DisplayFile(self, fileName):
sys.stdout.write(self.ReadTmpFile(fileName))
diff --git a/BaseTools/Tests/TianoCompress.py b/BaseTools/Tests/TianoCompress.py
index e14136416211..f6a4a6ae9c5d 100644
--- a/BaseTools/Tests/TianoCompress.py
+++ b/BaseTools/Tests/TianoCompress.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from __future__ import print_function
import os
import random
import sys
@@ -52,8 +53,8 @@ class Tests(TestTools.BaseToolsTest):
finish = self.ReadTmpFile('output2')
startEqualsFinish = start == finish
if not startEqualsFinish:
- print
- print 'Original data did not match decompress(compress(data))'
+ print()
+ print('Original data did not match decompress(compress(data))')
self.DisplayBinaryData('original data', start)
self.DisplayBinaryData('after compression', self.ReadTmpFile('output1'))
self.DisplayBinaryData('after decomression', finish)
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 858b4020ef9f..643fec58a457 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -17,6 +17,7 @@
#
+from __future__ import print_function
from optparse import OptionParser
import os
import shutil
@@ -34,7 +35,7 @@ if sys.version_info < (2, 5):
#
# This script (and edk2 BaseTools) require Python 2.5 or newer
#
- print 'Python version 2.5 or later is required.'
+ print('Python version 2.5 or later is required.')
sys.exit(-1)
#
@@ -146,37 +147,37 @@ class Config:
if not self.options.skip_gcc:
building.append('gcc')
if len(building) == 0:
- print "Nothing will be built!"
- print
- print "Please try using --help and then change the configuration."
+ print("Nothing will be built!")
+ print()
+ print("Please try using --help and then change the configuration.")
return False
- print "Current directory:"
- print " ", self.base_dir
- print "Sources download/extraction:", self.Relative(self.src_dir)
- print "Build directory :", self.Relative(self.build_dir)
- print "Prefix (install) directory :", self.Relative(self.prefix)
- print "Create symlinks directory :", self.Relative(self.symlinks)
- print "Building :", ', '.join(building)
- print
+ print("Current directory:")
+ print(" ", self.base_dir)
+ print("Sources download/extraction:", self.Relative(self.src_dir))
+ print("Build directory :", self.Relative(self.build_dir))
+ print("Prefix (install) directory :", self.Relative(self.prefix))
+ print("Create symlinks directory :", self.Relative(self.symlinks))
+ print("Building :", ', '.join(building))
+ print()
answer = raw_input("Is this configuration ok? (default = no): ")
if (answer.lower() not in ('y', 'yes')):
- print
- print "Please try using --help and then change the configuration."
+ print()
+ print("Please try using --help and then change the configuration.")
return False
if self.arch.lower() == 'ipf':
- print
- print 'Please note that the IPF compiler built by this script has'
- print 'not yet been validated!'
- print
+ print()
+ print('Please note that the IPF compiler built by this script has')
+ print('not yet been validated!')
+ print()
answer = raw_input("Are you sure you want to build it? (default = no): ")
if (answer.lower() not in ('y', 'yes')):
- print
- print "Please try using --help and then change the configuration."
+ print()
+ print("Please try using --help and then change the configuration.")
return False
- print
+ print()
return True
def Relative(self, path):
@@ -275,7 +276,7 @@ class SourceFiles:
wDots = (100 * received * blockSize) / fileSize / 10
if wDots > self.dots:
for i in range(wDots - self.dots):
- print '.',
+ print('.', end=' ')
sys.stdout.flush()
self.dots += 1
@@ -286,18 +287,18 @@ class SourceFiles:
self.dots = 0
local_file = os.path.join(self.config.src_dir, fdata['filename'])
url = fdata['url']
- print 'Downloading %s:' % fname, url
+ print('Downloading %s:' % fname, url)
if retries > 0:
- print '(retry)',
+ print('(retry)', end=' ')
sys.stdout.flush()
completed = False
if os.path.exists(local_file):
md5_pass = self.checkHash(fdata)
if md5_pass:
- print '[md5 match]',
+ print('[md5 match]', end=' ')
else:
- print '[md5 mismatch]',
+ print('[md5 mismatch]', end=' ')
sys.stdout.flush()
completed = md5_pass
@@ -313,32 +314,32 @@ class SourceFiles:
if not completed and os.path.exists(local_file):
md5_pass = self.checkHash(fdata)
if md5_pass:
- print '[md5 match]',
+ print('[md5 match]', end=' ')
else:
- print '[md5 mismatch]',
+ print('[md5 mismatch]', end=' ')
sys.stdout.flush()
completed = md5_pass
if completed:
- print '[done]'
+ print('[done]')
break
else:
- print '[failed]'
- print ' Tried to retrieve', url
- print ' to', local_file
- print 'Possible fixes:'
- print '* If you are behind a web-proxy, try setting the',
- print 'http_proxy environment variable'
- print '* You can try to download this file separately',
- print 'and rerun this script'
+ print('[failed]')
+ print(' Tried to retrieve', url)
+ print(' to', local_file)
+ print('Possible fixes:')
+ print('* If you are behind a web-proxy, try setting the', end=' ')
+ print('http_proxy environment variable')
+ print('* You can try to download this file separately', end=' ')
+ print('and rerun this script')
raise Exception()
except KeyboardInterrupt:
- print '[KeyboardInterrupt]'
+ print('[KeyboardInterrupt]')
return False
except Exception as e:
- print e
+ print(e)
if not completed: return False
@@ -396,7 +397,7 @@ class Extracter:
extractedMd5 = open(extracted).read()
if extractedMd5 != moduleMd5:
- print 'Extracting %s:' % self.config.Relative(local_file)
+ print('Extracting %s:' % self.config.Relative(local_file))
tar = tarfile.open(local_file)
tar.extractall(extractDst)
open(extracted, 'w').write(moduleMd5)
@@ -480,7 +481,7 @@ class Builder:
os.chdir(base_dir)
- print '%s module is now built and installed' % module
+ print('%s module is now built and installed' % module)
def RunCommand(self, cmd, module, stage, skipable=False):
if skipable:
@@ -495,13 +496,13 @@ class Builder:
stderr=subprocess.STDOUT
)
- print '%s [%s] ...' % (module, stage),
+ print('%s [%s] ...' % (module, stage), end=' ')
sys.stdout.flush()
p = popen(cmd)
output = p.stdout.read()
p.wait()
if p.returncode != 0:
- print '[failed!]'
+ print('[failed!]')
logFile = os.path.join(self.config.build_dir, 'log.txt')
f = open(logFile, "w")
f.write(output)
@@ -509,7 +510,7 @@ class Builder:
raise Exception, 'Failed to %s %s\n' % (stage, module) + \
'See output log at %s' % self.config.Relative(logFile)
else:
- print '[done]'
+ print('[done]')
if skipable:
self.MarkBuildStepComplete('%s.%s' % (module, stage))
@@ -526,13 +527,13 @@ class Builder:
linkdst = os.path.join(links_dir, link)
if not os.path.lexists(linkdst):
if not startPrinted:
- print 'Making symlinks in %s:' % self.config.Relative(links_dir),
+ print('Making symlinks in %s:' % self.config.Relative(links_dir), end=' ')
startPrinted = True
- print link,
+ print(link, end=' ')
os.symlink(src, linkdst)
if startPrinted:
- print '[done]'
+ print('[done]')
class App:
"""class App
@@ -551,9 +552,9 @@ class App:
sources = SourceFiles(config)
result = sources.GetAll()
if result:
- print 'All files have been downloaded & verified'
+ print('All files have been downloaded & verified')
else:
- print 'An error occured while downloading a file'
+ print('An error occured while downloading a file')
return
Extracter(sources, config).ExtractAll()
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 03/20] BaseTools: Remove the old python "not-equal"
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
2018-02-01 8:35 ` [PATCH v2 01/20] BaseTools: Refactor python except statements Gary Lin
2018-02-01 8:35 ` [PATCH v2 02/20] BaseTools: Refactor python print statements Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 04/20] BaseTools: Use the python3-range functions Gary Lin
` (17 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Replace "<>" with "!=" to be compatible with python3.
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Scripts/BinToPcd.py | 4 ++--
BaseTools/Source/Python/AutoGen/AutoGen.py | 4 ++--
BaseTools/Source/Python/AutoGen/BuildEngine.py | 4 ++--
BaseTools/Source/Python/AutoGen/GenC.py | 4 ++--
BaseTools/Source/Python/AutoGen/GenMake.py | 2 +-
BaseTools/Source/Python/Common/Misc.py | 2 +-
BaseTools/Source/Python/GenFds/Fv.py | 2 +-
BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 6 +++---
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 12 ++++++------
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 12 ++++++------
BaseTools/Source/Python/Workspace/DscBuildData.py | 4 ++--
11 files changed, 28 insertions(+), 28 deletions(-)
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index c4e7b8a5c2e2..1867f35e148e 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -41,13 +41,13 @@ if __name__ == '__main__':
return Value
def ValidatePcdName (Argument):
- if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) <> ['','']:
+ if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
Message = '%s is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>' % (Argument)
raise argparse.ArgumentTypeError(Message)
return Argument
def ValidateGuidName (Argument):
- if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) <> ['','']:
+ if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
Message = '%s is not a valid GUID C name' % (Argument)
raise argparse.ArgumentTypeError(Message)
return Argument
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 816dd9e86bd3..acf6dfd3487f 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -3953,7 +3953,7 @@ class ModuleAutoGen(AutoGen):
return
# Skip the following code for modules without any binary files
- if self.BinaryFileList <> None and self.BinaryFileList <> []:
+ if self.BinaryFileList != None and self.BinaryFileList != []:
return
### TODO: How to handles mixed source and binary modules
@@ -4421,7 +4421,7 @@ class ModuleAutoGen(AutoGen):
Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
- if len(Dpx.PostfixNotation) <> 0:
+ if len(Dpx.PostfixNotation) != 0:
self.DepexGenerated = True
if Dpx.Generate(path.join(self.OutputDir, DpxFile)):
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 46685967d1ee..f0a973c9f197 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -388,8 +388,8 @@ class BuildRule:
self.RuleContent[Index] = Line
# find the build_rule_version
- if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) <> -1:
- if Line.find("=") <> -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
+ if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) != -1:
+ if Line.find("=") != -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
self._FileVersion = (Line[(Line.find("=") + 1):]).split()[0]
# skip empty or comment line
if Line == "" or Line[0] == "#":
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 3e98506cc807..b8ba687bcda0 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1521,7 +1521,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
}
if Info.ModuleType in ['PEI_CORE', 'DXE_CORE', 'SMM_CORE', 'MM_CORE_STANDALONE']:
- if Info.SourceFileList <> None and Info.SourceFileList <> []:
+ if Info.SourceFileList != None and Info.SourceFileList != []:
if NumEntryPoints != 1:
EdkLogger.error(
"build",
@@ -1683,7 +1683,7 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
AutoGenH.Append("\n// Definition of SkuId Array\n")
AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n")
# Add extern declarations to AutoGen.h if one or more Token Space GUIDs were found
- if TokenSpaceList <> []:
+ if TokenSpaceList != []:
AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used in this module\n\n")
if Info.ModuleType in ["USER_DEFINED", "BASE"]:
GuidType = "GUID"
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 3f98a34d81ec..8891b1b97d23 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -562,7 +562,7 @@ cleanlib:
# convert source files and binary files to build targets
self.ResultFileList = [str(T.Target) for T in self._AutoGenObject.CodaTargetList]
- if len(self.ResultFileList) == 0 and len(self._AutoGenObject.SourceFileList) <> 0:
+ if len(self.ResultFileList) == 0 and len(self._AutoGenObject.SourceFileList) != 0:
EdkLogger.error("build", AUTOGEN_ERROR, "Nothing to build",
ExtraData="[%s]" % str(self._AutoGenObject))
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index eed86ec98e14..97d76b66936e 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1509,7 +1509,7 @@ def ParseDevPathValue (Value):
def ParseFieldValue (Value):
if type(Value) == type(0):
return Value, (Value.bit_length() + 7) / 8
- if type(Value) <> type(''):
+ if type(Value) != type(''):
raise BadExpression('Type %s is %s' %(Value, type(Value)))
Value = Value.strip()
if Value.startswith('UINT8') and Value.endswith(')'):
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index c0b869d250f1..be8b885d069e 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -341,7 +341,7 @@ class FV (FvClassObject):
if len(self.FvExtEntryType) > 0 or self.UsedSizeEnable:
GenFdsGlobalVariable.ErrorLogger("FV Extension Header Entries declared for %s with no FvNameGuid declaration." % (self.UiFvName))
- if self.FvNameGuid <> None and self.FvNameGuid <> '':
+ if self.FvNameGuid != None and self.FvNameGuid != '':
TotalSize = 16 + 4
Buffer = ''
if self.UsedSizeEnable:
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 4f79d0f82967..11d11700ed99 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -118,7 +118,7 @@ if __name__ == '__main__':
sys.exit(1)
Version = Process.communicate()
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(Process.returncode)
print(Version[0])
@@ -208,7 +208,7 @@ if __name__ == '__main__':
#
Process = subprocess.Popen('%s smime -sign -binary -signer "%s" -outform DER -md sha256 -certfile "%s"' % (OpenSslCommand, args.SignerPrivateCertFileName, args.OtherPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Signature = Process.communicate(input=FullInputFileBuffer)[0]
- if Process.returncode <> 0:
+ if Process.returncode != 0:
sys.exit(Process.returncode)
#
@@ -277,7 +277,7 @@ if __name__ == '__main__':
#
Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate(input=args.SignatureBuffer)[0]
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Verification failed')
os.remove (args.OutputFileName)
sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 06ed2610271f..2aa6877c92be 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -80,7 +80,7 @@ if __name__ == '__main__':
sys.exit(1)
Version = Process.communicate()
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(Process.returncode)
print(Version[0])
@@ -90,7 +90,7 @@ if __name__ == '__main__':
#
# Check for output file argument
#
- if args.OutputFile <> None:
+ if args.OutputFile != None:
for Item in args.OutputFile:
#
# Save PEM filename and close output file
@@ -103,14 +103,14 @@ if __name__ == '__main__':
#
Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate()
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: RSA 2048 key generation failed')
sys.exit(Process.returncode)
#
# Check for input file argument
#
- if args.InputFile <> None:
+ if args.InputFile != None:
for Item in args.InputFile:
#
# Save PEM filename and close input file
@@ -125,7 +125,7 @@ if __name__ == '__main__':
#
Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Unable to extract public key from private key')
sys.exit(Process.returncode)
PublicKey = ''
@@ -138,7 +138,7 @@ if __name__ == '__main__':
Process = subprocess.Popen('%s dgst -sha256 -binary' % (OpenSslCommand), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.stdin.write (PublicKey)
PublicKeyHash = PublicKeyHash + Process.communicate()[0]
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Unable to extract SHA 256 hash of public key')
sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 99a5d8aa5a01..8c235ae51e7e 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -101,7 +101,7 @@ if __name__ == '__main__':
sys.exit(1)
Version = Process.communicate()
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Open SSL command not available. Please verify PATH or set OPENSSL_PATH')
sys.exit(Process.returncode)
print(Version[0])
@@ -157,7 +157,7 @@ if __name__ == '__main__':
while len(PublicKeyHexString) > 0:
PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2],16))
PublicKeyHexString=PublicKeyHexString[2:]
- if Process.returncode <> 0:
+ if Process.returncode != 0:
sys.exit(Process.returncode)
if args.MonotonicCountStr:
@@ -179,7 +179,7 @@ if __name__ == '__main__':
#
Process = subprocess.Popen('%s sha256 -sign "%s"' % (OpenSslCommand, args.PrivateKeyFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Signature = Process.communicate(input=FullInputFileBuffer)[0]
- if Process.returncode <> 0:
+ if Process.returncode != 0:
sys.exit(Process.returncode)
#
@@ -202,14 +202,14 @@ if __name__ == '__main__':
#
# Verify that the Hash Type matches the expected SHA256 type
#
- if uuid.UUID(bytes_le = Header.HashType) <> EFI_HASH_ALGORITHM_SHA256_GUID:
+ if uuid.UUID(bytes_le = Header.HashType) != EFI_HASH_ALGORITHM_SHA256_GUID:
print('ERROR: unsupport hash GUID')
sys.exit(1)
#
# Verify the public key
#
- if Header.PublicKey <> PublicKey:
+ if Header.PublicKey != PublicKey:
print('ERROR: Public key in input file does not match public key from private key file')
sys.exit(1)
@@ -228,7 +228,7 @@ if __name__ == '__main__':
#
Process = subprocess.Popen('%s sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
Process.communicate(input=FullInputFileBuffer)
- if Process.returncode <> 0:
+ if Process.returncode != 0:
print('ERROR: Verification failed')
os.remove (args.OutputFileName)
sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 4d0a7a30ccce..a482c94b6fdc 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1815,7 +1815,7 @@ class DscBuildData(PlatformBuildClassObject):
Messages = StdErr
Messages = Messages.split('\n')
MessageGroup = []
- if returncode <>0:
+ if returncode !=0:
CAppBaseFileName = os.path.join(self.OutputPath, PcdValueInitName)
File = open (CAppBaseFileName + '.c', 'r')
FileData = File.readlines()
@@ -1861,7 +1861,7 @@ class DscBuildData(PlatformBuildClassObject):
Command = PcdValueInitExe + ' -i %s -o %s' % (InputValueFile, OutputValueFile)
returncode, StdOut, StdErr = self.ExecuteCommand (Command)
- if returncode <> 0:
+ if returncode != 0:
EdkLogger.warn('Build', COMMAND_FAILURE, 'Can not collect output from command: %s' % Command)
FileBuffer = []
else:
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 04/20] BaseTools: Use the python3-range functions
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (2 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 03/20] BaseTools: Remove the old python "not-equal" Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 05/20] BaseTools: Remove tuple parameter in python scripts Gary Lin
` (16 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Replace xrange() and range() with the newer range() function
Based on "futurize -f libfuturize.fixes.fix_xrange_with_import"
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Scripts/BinToPcd.py | 3 ++-
BaseTools/Scripts/ConvertMasmToNasm.py | 1 +
BaseTools/Scripts/PatchCheck.py | 5 +++--
BaseTools/Source/Python/AutoGen/AutoGen.py | 1 +
BaseTools/Source/Python/AutoGen/BuildEngine.py | 1 +
BaseTools/Source/Python/AutoGen/GenC.py | 1 +
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 23 ++++++++++----------
BaseTools/Source/Python/AutoGen/GenVar.py | 1 +
BaseTools/Source/Python/AutoGen/InfSectionParser.py | 1 +
BaseTools/Source/Python/AutoGen/StrGather.py | 1 +
BaseTools/Source/Python/AutoGen/UniClassObject.py | 1 +
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 1 +
BaseTools/Source/Python/BPDG/GenVpd.py | 7 +++---
BaseTools/Source/Python/Common/DscClassObject.py | 1 +
BaseTools/Source/Python/Common/Expression.py | 1 +
BaseTools/Source/Python/Common/FdfClassObject.py | 1 +
BaseTools/Source/Python/Common/MigrationUtilities.py | 1 +
BaseTools/Source/Python/Common/Misc.py | 3 ++-
BaseTools/Source/Python/Common/Parsing.py | 1 +
BaseTools/Source/Python/Common/RangeExpression.py | 1 +
BaseTools/Source/Python/Common/String.py | 1 +
BaseTools/Source/Python/Common/ToolDefClassObject.py | 1 +
BaseTools/Source/Python/Ecc/Check.py | 1 +
BaseTools/Source/Python/Ecc/MetaDataParser.py | 3 ++-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 1 +
BaseTools/Source/Python/Eot/FvImage.py | 1 +
BaseTools/Source/Python/Eot/InfParserLite.py | 1 +
BaseTools/Source/Python/GenFds/AprioriSection.py | 1 +
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 1 +
BaseTools/Source/Python/GenFds/Fv.py | 1 +
BaseTools/Source/Python/GenFds/GenFds.py | 1 +
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 1 +
BaseTools/Source/Python/GenFds/Region.py | 3 ++-
BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py | 1 +
BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 3 ++-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 3 ++-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 3 ++-
BaseTools/Source/Python/Trim/Trim.py | 1 +
BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py | 5 +++--
| 3 ++-
BaseTools/Source/Python/UPT/Library/Misc.py | 5 +++--
BaseTools/Source/Python/UPT/Library/Parsing.py | 3 ++-
BaseTools/Source/Python/UPT/Library/String.py | 1 +
BaseTools/Source/Python/UPT/Library/UniClassObject.py | 3 ++-
BaseTools/Source/Python/UPT/Parser/DecParserMisc.py | 1 +
BaseTools/Source/Python/UPT/Parser/InfSectionParser.py | 3 ++-
BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 1 +
BaseTools/Source/Python/UPT/UPT.py | 1 +
BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py | 1 +
BaseTools/Source/Python/UPT/Xml/IniToXml.py | 1 +
BaseTools/Source/Python/UPT/Xml/XmlParser.py | 1 +
BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py | 3 ++-
BaseTools/Source/Python/Workspace/DscBuildData.py | 1 +
BaseTools/Source/Python/Workspace/InfBuildData.py | 1 +
BaseTools/Source/Python/Workspace/MetaFileParser.py | 1 +
BaseTools/Tests/TestTools.py | 3 ++-
BaseTools/Tests/TianoCompress.py | 1 +
BaseTools/gcc/mingw-gcc-build.py | 1 +
58 files changed, 91 insertions(+), 33 deletions(-)
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 1867f35e148e..7d8cd0a5cc25 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -16,6 +16,7 @@ BinToPcd
'''
from __future__ import print_function
+from builtins import range
import sys
import argparse
import re
@@ -84,7 +85,7 @@ if __name__ == '__main__':
help = "Increase output messages")
parser.add_argument("-q", "--quiet", dest = 'Quiet', action = "store_true",
help = "Reduce output messages")
- parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = range(0,10), default = 0,
+ parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0,10)), default = 0,
help = "Set debug level")
#
diff --git a/BaseTools/Scripts/ConvertMasmToNasm.py b/BaseTools/Scripts/ConvertMasmToNasm.py
index 5b83724b3124..e7b5b096fccc 100755
--- a/BaseTools/Scripts/ConvertMasmToNasm.py
+++ b/BaseTools/Scripts/ConvertMasmToNasm.py
@@ -17,6 +17,7 @@ from __future__ import print_function
#
# Import Modules
#
+from builtins import range
import argparse
import io
import os.path
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 43bfc2495c6b..51d4adf08b60 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -15,6 +15,7 @@
from __future__ import print_function
+from builtins import range
VersionNumber = '0.1'
__copyright__ = "Copyright (c) 2015 - 2016, Intel Corporation All rights reserved."
@@ -26,7 +27,7 @@ import subprocess
import sys
class Verbose:
- SILENT, ONELINE, NORMAL = range(3)
+ SILENT, ONELINE, NORMAL = list(range(3))
level = NORMAL
class CommitMessageCheck:
@@ -234,7 +235,7 @@ class CommitMessageCheck:
break
last_sig_line = line.strip()
-(START, PRE_PATCH, PATCH) = range(3)
+(START, PRE_PATCH, PATCH) = list(range(3))
class GitDiffCheck:
"""Checks the contents of a git diff."""
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index acf6dfd3487f..73ffff8a70d8 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -14,6 +14,7 @@
## Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import re
import os.path as path
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index f0a973c9f197..e8f6788cdc40 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -15,6 +15,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import re
import copy
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index b8ba687bcda0..d68160deb4a1 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -13,6 +13,7 @@
## Import Modules
#
+from builtins import range
import string
import collections
import struct
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 82360ae57deb..b4955ea7ebab 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -10,6 +10,7 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from builtins import range
from StringIO import StringIO
from Common.Misc import *
from Common.String import StringToArray
@@ -297,7 +298,7 @@ class DbItemList:
# Variable length, need to calculate one by one
#
assert(Index < len(self.RawDataList))
- for ItemIndex in xrange(Index):
+ for ItemIndex in range(Index):
Offset += len(self.RawDataList[ItemIndex])
else:
for Datas in self.RawDataList:
@@ -394,7 +395,7 @@ class DbComItemList (DbItemList):
assert(False)
else:
assert(Index < len(self.RawDataList))
- for ItemIndex in xrange(Index):
+ for ItemIndex in range(Index):
Offset += len(self.RawDataList[ItemIndex]) * self.ItemSize
return Offset
@@ -478,7 +479,7 @@ class DbStringHeadTableItemList(DbItemList):
# Variable length, need to calculate one by one
#
assert(Index < len(self.RawDataList))
- for ItemIndex in xrange(Index):
+ for ItemIndex in range(Index):
Offset += len(self.RawDataList[ItemIndex])
else:
for innerIndex in range(Index):
@@ -568,14 +569,14 @@ class DbStringItemList (DbComItemList):
assert(len(RawDataList) == len(LenList))
DataList = []
# adjust DataList according to the LenList
- for Index in xrange(len(RawDataList)):
+ for Index in range(len(RawDataList)):
Len = LenList[Index]
RawDatas = RawDataList[Index]
assert(Len >= len(RawDatas))
ActualDatas = []
- for i in xrange(len(RawDatas)):
+ for i in range(len(RawDatas)):
ActualDatas.append(RawDatas[i])
- for i in xrange(len(RawDatas), Len):
+ for i in range(len(RawDatas), Len):
ActualDatas.append(0)
DataList.append(ActualDatas)
self.LenList = LenList
@@ -584,7 +585,7 @@ class DbStringItemList (DbComItemList):
Offset = 0
assert(Index < len(self.LenList))
- for ItemIndex in xrange(Index):
+ for ItemIndex in range(Index):
Offset += self.LenList[ItemIndex]
return Offset
@@ -772,7 +773,7 @@ def BuildExDataBase(Dict):
# Get offset of SkuId table in the database
SkuIdTableOffset = FixedHeaderLen
- for DbIndex in xrange(len(DbTotal)):
+ for DbIndex in range(len(DbTotal)):
if DbTotal[DbIndex] is SkuidValue:
break
SkuIdTableOffset += DbItemTotal[DbIndex].GetListSize()
@@ -784,7 +785,7 @@ def BuildExDataBase(Dict):
for (LocalTokenNumberTableIndex, (Offset, Table)) in enumerate(LocalTokenNumberTable):
DbIndex = 0
DbOffset = FixedHeaderLen
- for DbIndex in xrange(len(DbTotal)):
+ for DbIndex in range(len(DbTotal)):
if DbTotal[DbIndex] is Table:
DbOffset += DbItemTotal[DbIndex].GetInterOffset(Offset)
break
@@ -810,7 +811,7 @@ def BuildExDataBase(Dict):
(VariableHeadGuidIndex, VariableHeadStringIndex, SKUVariableOffset, VariableOffset, VariableRefTable, VariableAttribute) = VariableEntryPerSku[:]
DbIndex = 0
DbOffset = FixedHeaderLen
- for DbIndex in xrange(len(DbTotal)):
+ for DbIndex in range(len(DbTotal)):
if DbTotal[DbIndex] is VariableRefTable:
DbOffset += DbItemTotal[DbIndex].GetInterOffset(VariableOffset)
break
@@ -830,7 +831,7 @@ def BuildExDataBase(Dict):
# calculate various table offset now
DbTotalLength = FixedHeaderLen
- for DbIndex in xrange(len(DbItemTotal)):
+ for DbIndex in range(len(DbItemTotal)):
if DbItemTotal[DbIndex] is DbLocalTokenNumberTable:
LocalTokenNumberTableOffset = DbTotalLength
elif DbItemTotal[DbIndex] is DbExMapTable:
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 1389d7ff6225..8e800c8bc914 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -14,6 +14,7 @@
# #
# Import Modules
#
+from builtins import range
from struct import pack,unpack
import collections
import copy
diff --git a/BaseTools/Source/Python/AutoGen/InfSectionParser.py b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
index cdc9e5e8a849..ee2aae3b70e0 100644
--- a/BaseTools/Source/Python/AutoGen/InfSectionParser.py
+++ b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
@@ -14,6 +14,7 @@
## Import Modules
#
+from builtins import range
import Common.EdkLogger as EdkLogger
from Common.BuildToolError import *
from Common.DataType import *
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index ed33554cd7d2..718cd60514b4 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import re
import Common.EdkLogger as EdkLogger
from Common.BuildToolError import *
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 264cf1546566..cab7623bc4e6 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -17,6 +17,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os, codecs, re
import distutils.util
import Common.EdkLogger as EdkLogger
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 53da9b881f25..ff355d05d79f 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -15,6 +15,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import os
from Common.RangeExpression import RangeExpression
from Common.Misc import *
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index cdfc420c66f7..daf11612d83b 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -13,6 +13,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from builtins import range
import Common.LongFilePathOs as os
import StringIO
import StringTable as st
@@ -229,7 +230,7 @@ class PcdEntry:
ReturnArray = array.array('B')
- for Index in xrange(len(ValueList)):
+ for Index in range(len(ValueList)):
Value = None
if ValueList[Index].lower().startswith('0x'):
# translate hex value
@@ -255,7 +256,7 @@ class PcdEntry:
ReturnArray.append(Value)
- for Index in xrange(len(ValueList), Size):
+ for Index in range(len(ValueList), Size):
ReturnArray.append(0)
self.PcdValue = ReturnArray.tolist()
@@ -290,7 +291,7 @@ class PcdEntry:
"Invalid unicode character %s in unicode string %s(File: %s Line: %s)" % \
(Value, UnicodeString, self.FileName, self.Lineno))
- for Index in xrange(len(UnicodeString) * 2, Size):
+ for Index in range(len(UnicodeString) * 2, Size):
ReturnArray.append(0)
self.PcdValue = ReturnArray.tolist()
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index 3a27fbffc934..f42d247cad33 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -15,6 +15,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import EdkLogger as EdkLogger
import Database
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 145acfc072e7..f7dbb29ee882 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -13,6 +13,7 @@
## Import Modules
#
from __future__ import print_function
+from builtins import range
from Common.GlobalData import *
from CommonDataClass.Exceptions import BadExpression
from CommonDataClass.Exceptions import WrnExpression
diff --git a/BaseTools/Source/Python/Common/FdfClassObject.py b/BaseTools/Source/Python/Common/FdfClassObject.py
index 3e7d44954c88..7ec0235967b2 100644
--- a/BaseTools/Source/Python/Common/FdfClassObject.py
+++ b/BaseTools/Source/Python/Common/FdfClassObject.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
from FdfParserLite import FdfParser
from Table.TableFdf import TableFdf
from CommonDataClass.DataClass import MODEL_FILE_FDF, MODEL_PCD, MODEL_META_DATA_COMPONENT
diff --git a/BaseTools/Source/Python/Common/MigrationUtilities.py b/BaseTools/Source/Python/Common/MigrationUtilities.py
index e9f1cabcb794..2385988247d4 100644
--- a/BaseTools/Source/Python/Common/MigrationUtilities.py
+++ b/BaseTools/Source/Python/Common/MigrationUtilities.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
import re
import EdkLogger
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 97d76b66936e..6878522d59d5 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
import sys
import string
@@ -1883,7 +1884,7 @@ def SplitOption(OptionString):
def CommonPath(PathList):
P1 = min(PathList).split(os.path.sep)
P2 = max(PathList).split(os.path.sep)
- for Index in xrange(min(len(P1), len(P2))):
+ for Index in range(min(len(P1), len(P2))):
if P1[Index] != P2[Index]:
return os.path.sep.join(P1[:Index])
return os.path.sep.join(P1)
diff --git a/BaseTools/Source/Python/Common/Parsing.py b/BaseTools/Source/Python/Common/Parsing.py
index 584fc7f3c3a0..9caa9424d8ed 100644
--- a/BaseTools/Source/Python/Common/Parsing.py
+++ b/BaseTools/Source/Python/Common/Parsing.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
from String import *
from CommonDataClass.DataClass import *
from DataType import *
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index ee33ae3d3266..4357f240f423 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -13,6 +13,7 @@
# # Import Modules
#
from __future__ import print_function
+from builtins import range
from Common.GlobalData import *
from CommonDataClass.Exceptions import BadExpression
from CommonDataClass.Exceptions import WrnExpression
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index 4a8c03e88e28..e6c7a3b74ee1 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import re
import DataType
import Common.LongFilePathOs as os
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index dc90b4783f2f..6dab179efc01 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
import re
import EdkLogger
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index 5864758950ce..92259999853c 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -10,6 +10,7 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from builtins import range
import Common.LongFilePathOs as os
import re
from CommonDataClass.DataClass import *
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index 82ede3eb330c..9b8b96aa4b43 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -11,6 +11,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from builtins import range
import Common.LongFilePathOs as os
from CommonDataClass.DataClass import *
from EccToolError import *
@@ -112,7 +113,7 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
#
Last = 0
HeaderCommentStage = HEADER_COMMENT_NOT_STARTED
- for Index in xrange(len(CommentList)-1, 0, -1):
+ for Index in range(len(CommentList)-1, 0, -1):
Line = CommentList[Index][0]
if _IsCopyrightLine(Line):
Last = Index
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 2fef87c4180a..e04b67732141 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
import re
import time
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 9d8f0864dc41..64a27217e4a8 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -14,6 +14,7 @@
## Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import re
import sys
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index f624837f2587..4bdd60a6f71c 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -15,6 +15,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import Common.EdkLogger as EdkLogger
from Common.DataType import *
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 70e2e5a3baf2..27fe2619a35f 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from builtins import range
from struct import *
import Common.LongFilePathOs as os
import StringIO
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 12ec95b56501..cbfea730ef18 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from builtins import range
import Ffs
import Rule
import Common.LongFilePathOs as os
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index be8b885d069e..615d9e39faf1 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
import subprocess
import StringIO
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 4415b44ef77c..0aadbbd080b3 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -16,6 +16,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
from optparse import OptionParser
import sys
import Common.LongFilePathOs as os
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 393820651a11..94b8fedb233b 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -16,6 +16,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import sys
import subprocess
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index c946758cf549..5b9b203cf475 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -15,6 +15,7 @@
##
# Import Modules
#
+from builtins import range
from struct import *
from GenFdsGlobalVariable import GenFdsGlobalVariable
import StringIO
@@ -56,7 +57,7 @@ class Region(RegionClassObject):
PadByte = pack('B', 0xFF)
else:
PadByte = pack('B', 0)
- PadData = ''.join(PadByte for i in xrange(0, Size))
+ PadData = ''.join(PadByte for i in range(0, Size))
Buffer.write(PadData)
## AddToBuffer()
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index 882da81930da..9bb4d43a969f 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
from Common.LongFilePathSupport import OpenLongFilePath as open
import sys
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 11d11700ed99..becf3e8eb9e8 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -21,6 +21,7 @@ Pkcs7Sign
'''
from __future__ import print_function
+from builtins import range
import os
import sys
import argparse
@@ -88,7 +89,7 @@ if __name__ == '__main__':
parser.add_argument("--signature-size", dest='SignatureSizeStr', type=str, help="specify the signature size for decode process.")
parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
- parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+ parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
#
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 2aa6877c92be..1641968ace0e 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -24,6 +24,7 @@ Rsa2048Sha256GenerateKeys
'''
from __future__ import print_function
+from builtins import range
import os
import sys
import argparse
@@ -51,7 +52,7 @@ if __name__ == '__main__':
parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
- parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+ parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
#
# Parse command line arguments
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 8c235ae51e7e..2a19ad973b91 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -19,6 +19,7 @@ Rsa2048Sha256Sign
'''
from __future__ import print_function
+from builtins import range
import os
import sys
import argparse
@@ -71,7 +72,7 @@ if __name__ == '__main__':
parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'), help="specify the private key filename. If not specified, a test signing key is used.")
parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
- parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+ parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
#
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 05ba86262133..94f6b1bc707a 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from builtins import range
import Common.LongFilePathOs as os
import sys
import re
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index d7eaf3ea1d12..517f2a6cdecd 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -15,6 +15,7 @@
'''
GenInf
'''
+from builtins import range
import os
import stat
import codecs
@@ -409,7 +410,7 @@ def GenLibraryClasses(ModuleObject):
Statement += '|' + FFE
ModuleList = LibraryClass.GetSupModuleList()
ArchList = LibraryClass.GetSupArchList()
- for Index in xrange(0, len(ArchList)):
+ for Index in range(0, len(ArchList)):
ArchList[Index] = ConvertArchForInstall(ArchList[Index])
ArchList.sort()
SortedArch = ' '.join(ArchList)
@@ -574,7 +575,7 @@ def GenUserExtensions(ModuleObject):
# if not Statement:
# continue
ArchList = UserExtension.GetSupArchList()
- for Index in xrange(0, len(ArchList)):
+ for Index in range(0, len(ArchList)):
ArchList[Index] = ConvertArchForInstall(ArchList[Index])
ArchList.sort()
KeyList = []
diff --git a/BaseTools/Source/Python/UPT/Library/CommentParsing.py b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
index 9cd7b60e16ab..b97a051137e1 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentParsing.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
@@ -19,6 +19,7 @@ CommentParsing
##
# Import Modules
#
+from builtins import range
import re
from Library.String import GetSplitValueList
@@ -74,7 +75,7 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
# first find the last copyright line
#
Last = 0
- for Index in xrange(len(CommentList)-1, 0, -1):
+ for Index in range(len(CommentList)-1, 0, -1):
Line = CommentList[Index][0]
if _IsCopyrightLine(Line):
Last = Index
diff --git a/BaseTools/Source/Python/UPT/Library/Misc.py b/BaseTools/Source/Python/UPT/Library/Misc.py
index 0d92cb3767c6..24e0a20daf87 100644
--- a/BaseTools/Source/Python/UPT/Library/Misc.py
+++ b/BaseTools/Source/Python/UPT/Library/Misc.py
@@ -19,6 +19,7 @@ Misc
##
# Import Modules
#
+from builtins import range
import os.path
from os import access
from os import F_OK
@@ -437,7 +438,7 @@ class Sdict(IterableUserDict):
def CommonPath(PathList):
Path1 = min(PathList).split(os.path.sep)
Path2 = max(PathList).split(os.path.sep)
- for Index in xrange(min(len(Path1), len(Path2))):
+ for Index in range(min(len(Path1), len(Path2))):
if Path1[Index] != Path2[Index]:
return os.path.sep.join(Path1[:Index])
return os.path.sep.join(Path1)
@@ -890,7 +891,7 @@ def ProcessEdkComment(LineList):
if FindEdkBlockComment:
if FirstPos == -1:
FirstPos = StartPos
- for Index in xrange(StartPos, EndPos+1):
+ for Index in range(StartPos, EndPos+1):
LineList[Index] = ''
FindEdkBlockComment = False
elif Line.find("//") != -1 and not Line.startswith("#"):
diff --git a/BaseTools/Source/Python/UPT/Library/Parsing.py b/BaseTools/Source/Python/UPT/Library/Parsing.py
index c34e7751442a..bac664506f4d 100644
--- a/BaseTools/Source/Python/UPT/Library/Parsing.py
+++ b/BaseTools/Source/Python/UPT/Library/Parsing.py
@@ -20,6 +20,7 @@ Parsing
##
# Import Modules
#
+from builtins import range
import os.path
import re
@@ -973,7 +974,7 @@ def GenSection(SectionName, SectionDict, SplitArch=True, NeedBlankLine=False):
ArchList = GetSplitValueList(SectionAttrs, DataType.TAB_COMMENT_SPLIT)
else:
ArchList = [SectionAttrs]
- for Index in xrange(0, len(ArchList)):
+ for Index in range(0, len(ArchList)):
ArchList[Index] = ConvertArchForInstall(ArchList[Index])
Section = '[' + SectionName + '.' + (', ' + SectionName + '.').join(ArchList) + ']'
else:
diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Source/Python/UPT/Library/String.py
index 278073e4a379..2f916324bd13 100644
--- a/BaseTools/Source/Python/UPT/Library/String.py
+++ b/BaseTools/Source/Python/UPT/Library/String.py
@@ -18,6 +18,7 @@ String
##
# Import Modules
#
+from builtins import range
import re
import os.path
from string import strip
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 84958ae38cef..d07c26abd9c2 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -19,6 +19,7 @@ from __future__ import print_function
##
# Import Modules
#
+from builtins import range
import os, codecs, re
import distutils.util
from Logger import ToolError
@@ -515,7 +516,7 @@ class UniFileClassObject(object):
FileIn[LineCount-1] = Line
FileIn[LineCount] = '\r\n'
LineCount -= 1
- for Index in xrange (LineCount + 1, len (FileIn) - 1):
+ for Index in range (LineCount + 1, len (FileIn) - 1):
if (Index == len(FileIn) -1):
FileIn[Index] = '\r\n'
else:
diff --git a/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py b/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
index 22a50680fb8f..14539b0bd6c1 100644
--- a/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
@@ -17,6 +17,7 @@ DecParserMisc
## Import modules
#
+from builtins import range
import os
import Logger.Log as Logger
from Logger.ToolError import FILE_PARSE_FAILURE
diff --git a/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
index 727164c2c244..ac821deded0a 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
@@ -18,6 +18,7 @@ InfSectionParser
##
# Import Modules
#
+from builtins import range
from copy import deepcopy
import re
@@ -455,7 +456,7 @@ class InfSectionParser(InfDefinSectionParser,
Arch = Match.groups(1)[0].upper()
ArchList.append(Arch)
CommentSoFar = ''
- for Index in xrange(1, len(List)):
+ for Index in range(1, len(List)):
Result = ParseComment(List[Index], DT.ALL_USAGE_TOKENS, TokenDict, [], False)
Usage = Result[0]
Type = Result[1]
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 074aa311f31d..4c28b7f5d22a 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -20,6 +20,7 @@ from __future__ import print_function
##
# Import Modules
#
+from builtins import range
import os.path
from os import sep
import platform
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 0bfcc44e3f19..3296ee3d3d8f 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -19,6 +19,7 @@ UPT
## import modules
#
+from builtins import range
import locale
import sys
encoding = locale.getdefaultlocale()[1]
diff --git a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
index 626f17426de7..2c21823194e2 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
@@ -12,6 +12,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
from __future__ import print_function
+from builtins import range
import os
#import Object.Parser.InfObject as InfObject
from Object.Parser.InfCommonObject import CurrentLine
diff --git a/BaseTools/Source/Python/UPT/Xml/IniToXml.py b/BaseTools/Source/Python/UPT/Xml/IniToXml.py
index 037471056d81..79db9a31a28b 100644
--- a/BaseTools/Source/Python/UPT/Xml/IniToXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/IniToXml.py
@@ -16,6 +16,7 @@
IniToXml
'''
+from builtins import range
import os.path
import re
from time import strftime
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
index 58959081d0ab..b4d52f7bdc1f 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
@@ -19,6 +19,7 @@ XmlParser
##
# Import Modules
#
+from builtins import range
import re
from Library.Xml.XmlRoutines import XmlNode
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py b/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
index 7e3dc94edf64..28b146ff9183 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
@@ -15,6 +15,7 @@
'''
XmlParserMisc
'''
+from builtins import range
from Object.POM.CommonObject import TextObject
from Logger.StringTable import ERR_XML_PARSER_REQUIRED_ITEM_MISSING
from Logger.ToolError import PARSER_ERROR
@@ -53,7 +54,7 @@ def ConvertVariableName(VariableName):
if SecondByte != 0:
return None
- if FirstByte not in xrange(0x20, 0x7F):
+ if FirstByte not in range(0x20, 0x7F):
return None
TransferedStr += ('%c')%FirstByte
Index = Index + 2
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index a482c94b6fdc..e9fe533b3975 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -18,6 +18,7 @@
# into PlatformBuildClassObject form for easier use for AutoGen.
#
from __future__ import print_function
+from builtins import range
from Common.String import *
from Common.DataType import *
from Common.Misc import *
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 67c08ee47841..9fc2e681b73d 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -12,6 +12,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from builtins import range
from Common.String import *
from Common.DataType import *
from Common.Misc import *
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 9bcb017c0c45..4ad60498488b 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -16,6 +16,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import Common.LongFilePathOs as os
import re
import time
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index c52b8bd94234..1202289616ee 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -16,6 +16,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import base64
import os
import os.path
@@ -162,7 +163,7 @@ class BaseToolsTest(unittest.TestCase):
if maxlen is None: maxlen = minlen
return ''.join(
[chr(random.randint(0,255))
- for x in xrange(random.randint(minlen, maxlen))
+ for x in range(random.randint(minlen, maxlen))
])
def setUp(self):
diff --git a/BaseTools/Tests/TianoCompress.py b/BaseTools/Tests/TianoCompress.py
index f6a4a6ae9c5d..65f783d1be9e 100644
--- a/BaseTools/Tests/TianoCompress.py
+++ b/BaseTools/Tests/TianoCompress.py
@@ -16,6 +16,7 @@
# Import Modules
#
from __future__ import print_function
+from builtins import range
import os
import random
import sys
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 643fec58a457..f7d0308bd9fa 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -18,6 +18,7 @@
from __future__ import print_function
+from builtins import range
from optparse import OptionParser
import os
import shutil
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 05/20] BaseTools: Remove tuple parameter in python scripts
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (3 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 04/20] BaseTools: Use the python3-range functions Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 06/20] BaseTools: Remove the deprecated hash_key() Gary Lin
` (15 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
According to PEP 3113, tuple parameters are removed in python 3.
(PEP3113: https://www.python.org/dev/peps/pep-3113/)
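For illustration (not part of this patch), a minimal, self-contained sketch of
the same conversion; the function and variable names below are made up:

    # Python 2 accepted a tuple parameter in the signature:
    #   def distance((x, y)):
    #       return (x * x + y * y) ** 0.5
    # Python 3 rejects that syntax, so unpack the tuple inside the body:
    def distance(point):
        (x, y) = point
        return (x * x + y * y) ** 0.5

    print(distance((3, 4)))   # 5.0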
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/Common/VpdInfoFile.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index a6c1fb70bd7d..280cdfb536a6 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -219,7 +219,8 @@ class VpdInfoFile:
return None
return self._VpdArray[vpd]
- def GetVpdInfo(self,(PcdTokenName,TokenSpaceName)):
+ def GetVpdInfo(self, arg):
+ (PcdTokenName, TokenSpaceName) = arg
return self._VpdInfo.get((TokenSpaceName, PcdTokenName))
## Call external BPDG tool to process VPD file
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 06/20] BaseTools: Remove the deprecated hash_key()
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (4 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 05/20] BaseTools: Remove tuple parameter in python scripts Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 07/20] BaseTools: Import reduce() from functools Gary Lin
` (14 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Replace "has_key()" with "in" to be compatible with python3.
Based on "futurize -f lib2to3.fixes.fix_has_key"
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 4 ++--
BaseTools/Source/Python/Common/VpdInfoFile.py | 2 +-
BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py | 16 ++++++++--------
BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py | 6 +++---
BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py | 4 ++--
BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py | 4 ++--
BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py | 4 ++--
BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py | 4 ++--
BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py | 4 ++--
BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py | 2 +-
BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py | 3 +--
BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py | 4 ++--
BaseTools/Source/Python/build/build.py | 2 +-
15 files changed, 31 insertions(+), 32 deletions(-)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 73ffff8a70d8..18da411f83a0 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1828,8 +1828,8 @@ class PlatformAutoGen(AutoGen):
# retrieve BPDG tool's path from tool_def.txt according to VPD_TOOL_GUID defined in DSC file.
BPDGToolName = None
for ToolDef in self.ToolDefinition.values():
- if ToolDef.has_key("GUID") and ToolDef["GUID"] == self.Platform.VpdToolGuid:
- if not ToolDef.has_key("PATH"):
+ if "GUID" in ToolDef and ToolDef["GUID"] == self.Platform.VpdToolGuid:
+ if "PATH" not in ToolDef:
EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
BPDGToolName = ToolDef["PATH"]
break
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 280cdfb536a6..84dd7ac563dd 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -212,7 +212,7 @@ class VpdInfoFile:
#
# @param vpd A given VPD PCD
def GetOffset(self, vpd):
- if not self._VpdArray.has_key(vpd):
+ if vpd not in self._VpdArray:
return None
if len(self._VpdArray[vpd]) == 0:
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index 517f2a6cdecd..4a9528b500f2 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -439,14 +439,14 @@ def GenLibraryClasses(ModuleObject):
Statement = '# Guid: ' + LibraryItem.Guid + ' Version: ' + LibraryItem.Version
if len(BinaryFile.SupArchList) == 0:
- if LibraryClassDict.has_key('COMMON') and Statement not in LibraryClassDict['COMMON']:
+ if 'COMMON' in LibraryClassDict and Statement not in LibraryClassDict['COMMON']:
LibraryClassDict['COMMON'].append(Statement)
else:
LibraryClassDict['COMMON'] = ['## @LIB_INSTANCES']
LibraryClassDict['COMMON'].append(Statement)
else:
for Arch in BinaryFile.SupArchList:
- if LibraryClassDict.has_key(Arch):
+ if Arch in LibraryClassDict:
if Statement not in LibraryClassDict[Arch]:
LibraryClassDict[Arch].append(Statement)
else:
@@ -918,14 +918,14 @@ def GenAsBuiltPacthPcdSections(ModuleObject):
if FileNameObjList:
ArchList = FileNameObjList[0].GetSupArchList()
if len(ArchList) == 0:
- if PatchPcdDict.has_key(DT.TAB_ARCH_COMMON):
+ if DT.TAB_ARCH_COMMON in PatchPcdDict:
if Statement not in PatchPcdDict[DT.TAB_ARCH_COMMON]:
PatchPcdDict[DT.TAB_ARCH_COMMON].append(Statement)
else:
PatchPcdDict[DT.TAB_ARCH_COMMON] = [Statement]
else:
for Arch in ArchList:
- if PatchPcdDict.has_key(Arch):
+ if Arch in PatchPcdDict:
if Statement not in PatchPcdDict[Arch]:
PatchPcdDict[Arch].append(Statement)
else:
@@ -968,13 +968,13 @@ def GenAsBuiltPcdExSections(ModuleObject):
ArchList = FileNameObjList[0].GetSupArchList()
if len(ArchList) == 0:
- if PcdExDict.has_key('COMMON'):
+ if 'COMMON' in PcdExDict:
PcdExDict['COMMON'].append(Statement)
else:
PcdExDict['COMMON'] = [Statement]
else:
for Arch in ArchList:
- if PcdExDict.has_key(Arch):
+ if Arch in PcdExDict:
if Statement not in PcdExDict[Arch]:
PcdExDict[Arch].append(Statement)
else:
@@ -1072,7 +1072,7 @@ def GenBuildOptions(ModuleObject):
for BuilOptionItem in BinaryFile.AsBuiltList[0].BinaryBuildFlagList:
Statement = '#' + BuilOptionItem.AsBuiltOptionFlags
if len(BinaryFile.SupArchList) == 0:
- if BuildOptionDict.has_key('COMMON'):
+ if 'COMMON' in BuildOptionDict:
if Statement not in BuildOptionDict['COMMON']:
BuildOptionDict['COMMON'].append(Statement)
else:
@@ -1080,7 +1080,7 @@ def GenBuildOptions(ModuleObject):
BuildOptionDict['COMMON'].append(Statement)
else:
for Arch in BinaryFile.SupArchList:
- if BuildOptionDict.has_key(Arch):
+ if Arch in BuildOptionDict:
if Statement not in BuildOptionDict[Arch]:
BuildOptionDict[Arch].append(Statement)
else:
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
index f968beee6081..a829c0cfe34c 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
@@ -272,7 +272,7 @@ class InfBinariesObject(InfSectionCommonDef):
pass
if InfBianryVerItemObj != None:
- if self.Binaries.has_key((InfBianryVerItemObj)):
+ if (InfBianryVerItemObj) in self.Binaries:
BinariesList = self.Binaries[InfBianryVerItemObj]
BinariesList.append((InfBianryVerItemObj, VerComment))
self.Binaries[InfBianryVerItemObj] = BinariesList
@@ -522,7 +522,7 @@ class InfBinariesObject(InfSectionCommonDef):
# pass
if InfBianryCommonItemObj != None:
- if self.Binaries.has_key((InfBianryCommonItemObj)):
+ if (InfBianryCommonItemObj) in self.Binaries:
BinariesList = self.Binaries[InfBianryCommonItemObj]
BinariesList.append((InfBianryCommonItemObj, ItemComment))
self.Binaries[InfBianryCommonItemObj] = BinariesList
@@ -673,7 +673,7 @@ class InfBinariesObject(InfSectionCommonDef):
# pass
if InfBianryUiItemObj != None:
- if self.Binaries.has_key((InfBianryUiItemObj)):
+ if (InfBianryUiItemObj) in self.Binaries:
BinariesList = self.Binaries[InfBianryUiItemObj]
BinariesList.append((InfBianryUiItemObj, UiComment))
self.Binaries[InfBianryUiItemObj] = BinariesList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
index 1d074ee638fd..2c9ea6ccd2cc 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
@@ -957,7 +957,7 @@ class InfDefObject(InfSectionCommonDef):
SpecValue = Name[Name.find("SPEC") + len("SPEC"):].strip()
Name = "SPEC"
Value = SpecValue + " = " + Value
- if self.Defines.has_key(ArchListString):
+ if ArchListString in self.Defines:
DefineList = self.Defines[ArchListString]
LineInfo[0] = InfDefMemberObj.CurrentLine.GetFileName()
LineInfo[1] = InfDefMemberObj.CurrentLine.GetLineNo()
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
index 23125552e06d..e546127bd3e6 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
@@ -338,7 +338,7 @@ class InfGuidObject():
#
pass
- if self.Guids.has_key((InfGuidItemObj)):
+ if (InfGuidItemObj) in self.Guids:
GuidList = self.Guids[InfGuidItemObj]
GuidList.append(InfGuidItemObj)
self.Guids[InfGuidItemObj] = GuidList
@@ -350,4 +350,4 @@ class InfGuidObject():
return True
def GetGuid(self):
- return self.Guids
\ No newline at end of file
+ return self.Guids
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
index b18c4c381bc0..4c3233b73552 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
@@ -238,7 +238,7 @@ class InfLibraryClassObject():
LibItemObj.SetVersion(LibItem[1])
LibItemObj.SetSupArchList(__SupArchList)
- if self.LibraryClasses.has_key((LibItemObj)):
+ if (LibItemObj) in self.LibraryClasses:
LibraryList = self.LibraryClasses[LibItemObj]
LibraryList.append(LibItemObj)
self.LibraryClasses[LibItemObj] = LibraryList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
index 74099e208860..081e69db5feb 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
@@ -114,7 +114,7 @@ class InfSpecialCommentObject(InfSectionCommonDef):
Type == DT.TYPE_EVENT_SECTION or \
Type == DT.TYPE_BOOTMODE_SECTION:
for Item in SepcialSectionList:
- if self.SpecialComments.has_key(Type):
+ if Type in self.SpecialComments:
ObjList = self.SpecialComments[Type]
ObjList.append(Item)
self.SpecialComments[Type] = ObjList
@@ -145,4 +145,4 @@ def ErrorInInf(Message=None, ErrorCode=None, LineInfo=None, RaiseError=True):
File=LineInfo[0],
Line=LineInfo[1],
ExtraData=LineInfo[2],
- RaiseError=RaiseError)
\ No newline at end of file
+ RaiseError=RaiseError)
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
index 37399134dbf3..164260ffbfef 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
@@ -171,7 +171,7 @@ class InfPackageObject():
#
pass
- if self.Packages.has_key((PackageItemObj)):
+ if (PackageItemObj) in self.Packages:
PackageList = self.Packages[PackageItemObj]
PackageList.append(PackageItemObj)
self.Packages[PackageItemObj] = PackageList
@@ -184,4 +184,4 @@ class InfPackageObject():
def GetPackages(self, Arch = None):
if Arch == None:
- return self.Packages
\ No newline at end of file
+ return self.Packages
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
index 7b07036f91c2..b5ca01f148d1 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
@@ -411,7 +411,7 @@ class InfPcdObject():
else:
PcdItemObj.SetSupportArchList(SupArchList)
- if self.Pcds.has_key((PcdTypeItem, PcdItemObj)):
+ if (PcdTypeItem, PcdItemObj) in self.Pcds:
PcdsList = self.Pcds[PcdTypeItem, PcdItemObj]
PcdsList.append(PcdItemObj)
self.Pcds[PcdTypeItem, PcdItemObj] = PcdsList
@@ -456,7 +456,7 @@ class InfPcdObject():
PackageInfo)
PcdTypeItem = KeysList[0][0]
- if self.Pcds.has_key((PcdTypeItem, PcdItemObj)):
+ if (PcdTypeItem, PcdItemObj) in self.Pcds:
PcdsList = self.Pcds[PcdTypeItem, PcdItemObj]
PcdsList.append(PcdItemObj)
self.Pcds[PcdTypeItem, PcdItemObj] = PcdsList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
index 4df62bb459ff..53e1f342cac5 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
@@ -327,7 +327,7 @@ class InfPpiObject():
#
pass
- if self.Ppis.has_key((InfPpiItemObj)):
+ if (InfPpiItemObj) in self.Ppis:
PpiList = self.Ppis[InfPpiItemObj]
PpiList.append(InfPpiItemObj)
self.Ppis[InfPpiItemObj] = PpiList
@@ -340,4 +340,4 @@ class InfPpiObject():
def GetPpi(self):
- return self.Ppis
\ No newline at end of file
+ return self.Ppis
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
index c94e53c98f87..e552cb627b5e 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
@@ -296,7 +296,7 @@ class InfProtocolObject():
#
pass
- if self.Protocols.has_key((InfProtocolItemObj)):
+ if (InfProtocolItemObj) in self.Protocols:
ProcotolList = self.Protocols[InfProtocolItemObj]
ProcotolList.append(InfProtocolItemObj)
self.Protocols[InfProtocolItemObj] = ProcotolList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
index 9988f8ecfeed..93ae21e16b76 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
@@ -224,7 +224,7 @@ class InfSourcesObject(InfSectionCommonDef):
ItemObj.SetSupArchList(__SupArchList)
- if self.Sources.has_key((ItemObj)):
+ if (ItemObj) in self.Sources:
SourceContent = self.Sources[ItemObj]
SourceContent.append(ItemObj)
self.Sources[ItemObj] = SourceContent
@@ -237,4 +237,3 @@ class InfSourcesObject(InfSectionCommonDef):
def GetSources(self):
return self.Sources
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
index 27a1c6ad25a0..f9db2944a495 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
@@ -103,7 +103,7 @@ class InfUserExtensionObject():
# Line=LineNo,
# ExtraData=None)
- if self.UserExtension.has_key(IdContentItem):
+ if IdContentItem in self.UserExtension:
#
# Each UserExtensions section header must have a unique set
# of UserId, IdString and Arch values.
@@ -130,4 +130,4 @@ class InfUserExtensionObject():
return True
def GetUserExtension(self):
- return self.UserExtension
\ No newline at end of file
+ return self.UserExtension
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 68dca8e21524..c6a37ab1d9a3 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -75,7 +75,7 @@ TmpTableDict = {}
# Otherwise, False is returned
#
def IsToolInPath(tool):
- if os.environ.has_key('PATHEXT'):
+ if 'PATHEXT' in os.environ:
extns = os.environ['PATHEXT'].split(os.path.pathsep)
else:
extns = ('',)
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 07/20] BaseTools: Import reduce() from functools
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (5 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 06/20] BaseTools: Remove the deprecated hash_key() Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 08/20] BaseTools: Replace StandardError with Expression Gary Lin
` (13 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
In python3, reduce() is not a built-in function anymore.
Import it from "functools" to be compatible with python 3.
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 1 +
BaseTools/Source/Python/AutoGen/GenVar.py | 1 +
2 files changed, 2 insertions(+)
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index b4955ea7ebab..a989cb34dff3 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -21,6 +21,7 @@ from ValidCheckingInfoObject import VAR_VALID_OBJECT_FACTORY
from Common.VariableAttributes import VariableAttributes
import copy
from struct import unpack
+from functools import reduce
DATABASE_VERSION = 7
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 8e800c8bc914..b82d7e4d2d37 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -21,6 +21,7 @@ import copy
from Common.VariableAttributes import VariableAttributes
from Common.Misc import *
import collections
+from functools import reduce
var_info = collections.namedtuple("uefi_var", "pcdindex,pcdname,defaultstoragename,skuname,var_name, var_guid, var_offset,var_attribute,pcd_default_value, default_value, data_type")
NvStorageHeaderSize = 28
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 08/20] BaseTools: Replace StandardError with Expression
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (6 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 07/20] BaseTools: Import reduce() from functools Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 09/20] BaseTools: Remove types.TypeType Gary Lin
` (12 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
StandardError has been removed from python 3.
Replace it with Exception.
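For illustration (not part of this patch), a minimal sketch with a made-up
error path:

    try:
        raise ValueError("rollback failed")
    # Python 2 only:
    #   except StandardError:
    # Works on python 2 and 3:
    except Exception:
        print("recovery failed")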
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/UPT/UPT.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 3296ee3d3d8f..84b3c353201a 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -310,7 +310,7 @@ def Main():
else:
GlobalData.gDB.Commit()
Mgr.commit()
- except StandardError:
+ except Exception:
Logger.Quiet(ST.MSG_RECOVER_FAIL)
GlobalData.gDB.CloseDb()
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 09/20] BaseTools: Remove types.TypeType
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (7 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 08/20] BaseTools: Replace StandardError with Expression Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 10/20] BaseTools: Refactor python raise statement Gary Lin
` (11 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
"types.TypeType" is now an alias of the built-in "type" and is not
compatible with python 3.
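For illustration (not part of this patch), a minimal sketch of the class check
with a made-up namespace:

    class SampleTestCase(object):
        pass

    for name, item in {"SampleTestCase": SampleTestCase, "x": 1}.items():
        # Python 2 only:  isinstance(item, types.TypeType)
        # Works on python 2 (new-style classes) and python 3:
        if isinstance(item, type):
            print("%s is a class" % name)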
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Tests/TestTools.py | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 1202289616ee..1cf2ce13be2b 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -24,7 +24,6 @@ import random
import shutil
import subprocess
import sys
-import types
import unittest
TestsDir = os.path.realpath(os.path.split(sys.argv[0])[0])
@@ -43,7 +42,7 @@ if PythonSourceDir not in sys.path:
def MakeTheTestSuite(localItems):
tests = []
for name, item in localItems.iteritems():
- if isinstance(item, types.TypeType):
+ if isinstance(item, type):
if issubclass(item, unittest.TestCase):
tests.append(unittest.TestLoader().loadTestsFromTestCase(item))
elif issubclass(item, unittest.TestSuite):
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 10/20] BaseTools: Refactor python raise statement
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (8 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 09/20] BaseTools: Remove types.TypeType Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:35 ` [PATCH v2 11/20] BaseTools: Adjust the spaces around commas and colons Gary Lin
` (10 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Make "raise" to be compatible with python3.
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/gcc/mingw-gcc-build.py | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index f7d0308bd9fa..49ff656c066f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -508,8 +508,8 @@ class Builder:
f = open(logFile, "w")
f.write(output)
f.close()
- raise Exception, 'Failed to %s %s\n' % (stage, module) + \
- 'See output log at %s' % self.config.Relative(logFile)
+ raise Exception('Failed to %s %s\n' % (stage, module) + \
+ 'See output log at %s' % self.config.Relative(logFile))
else:
print('[done]')
--
2.16.1
^ permalink raw reply related [flat|nested] 24+ messages in thread
* [PATCH v2 11/20] BaseTools: Adjust the spaces around commas and colons
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (9 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 10/20] BaseTools: Refactor python raise statement Gary Lin
@ 2018-02-01 8:35 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 12/20] BaseTools: Migrate to the new octal literal Gary Lin
` (9 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:35 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Based on "futurize -f lib2to3.fixes.fix_ws_comma"
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py | 2 +-
BaseTools/Scripts/BinToPcd.py | 8 +-
BaseTools/Scripts/MemoryProfileSymbolGen.py | 6 +-
BaseTools/Scripts/PatchCheck.py | 2 +-
BaseTools/Scripts/RunMakefile.py | 2 +-
BaseTools/Source/Python/AutoGen/AutoGen.py | 54 +++---
BaseTools/Source/Python/AutoGen/GenMake.py | 4 +-
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 114 ++++++-------
BaseTools/Source/Python/AutoGen/GenVar.py | 164 +++++++++---------
BaseTools/Source/Python/BPDG/GenVpd.py | 12 +-
BaseTools/Source/Python/Common/DataType.py | 4 +-
BaseTools/Source/Python/Common/DscClassObject.py | 2 +-
BaseTools/Source/Python/Common/EdkIIWorkspace.py | 2 +-
BaseTools/Source/Python/Common/Expression.py | 6 +-
BaseTools/Source/Python/Common/FdfParserLite.py | 12 +-
BaseTools/Source/Python/Common/Misc.py | 46 ++---
BaseTools/Source/Python/Common/RangeExpression.py | 4 +-
BaseTools/Source/Python/Common/String.py | 2 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 10 +-
BaseTools/Source/Python/Ecc/CParser.py | 28 +--
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 14 +-
BaseTools/Source/Python/Eot/CParser.py | 28 +--
BaseTools/Source/Python/Eot/c.py | 20 +--
BaseTools/Source/Python/GenFds/AprioriSection.py | 2 +-
BaseTools/Source/Python/GenFds/CapsuleData.py | 2 +-
BaseTools/Source/Python/GenFds/EfiSection.py | 6 +-
BaseTools/Source/Python/GenFds/Fd.py | 6 +-
BaseTools/Source/Python/GenFds/FdfParser.py | 26 +--
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 12 +-
BaseTools/Source/Python/GenFds/Fv.py | 4 +-
BaseTools/Source/Python/GenFds/FvImageSection.py | 4 +-
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 4 +-
BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 2 +-
BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 2 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 2 +-
BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 6 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 12 +-
BaseTools/Source/Python/Trim/Trim.py | 14 +-
BaseTools/Source/Python/UPT/Core/DependencyRules.py | 8 +-
BaseTools/Source/Python/UPT/Core/IpiDb.py | 4 +-
BaseTools/Source/Python/UPT/Library/String.py | 2 +-
BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 2 +-
BaseTools/Source/Python/UPT/UPT.py | 2 +-
BaseTools/Source/Python/UPT/Xml/CommonXml.py | 2 +-
BaseTools/Source/Python/UPT/Xml/XmlParser.py | 24 +--
BaseTools/Source/Python/Workspace/DecBuildData.py | 14 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 178 ++++++++++----------
BaseTools/Source/Python/Workspace/MetaFileParser.py | 36 ++--
BaseTools/Source/Python/Workspace/MetaFileTable.py | 6 +-
BaseTools/Source/Python/Workspace/WorkspaceCommon.py | 2 +-
BaseTools/Source/Python/build/BuildReport.py | 8 +-
BaseTools/Source/Python/build/build.py | 8 +-
BaseTools/Tests/TestTools.py | 2 +-
BaseTools/gcc/mingw-gcc-build.py | 2 +-
54 files changed, 475 insertions(+), 475 deletions(-)
diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index dd66c7111ac0..b226499e8450 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -48,7 +48,7 @@ def ConvertCygPathToDos(CygPath):
DosPath = CygPath
# pipes.quote will add the extra \\ for us.
- return DosPath.replace('/','\\')
+ return DosPath.replace('/', '\\')
# we receive our options as a list, but we will be passing them to the shell as a line
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 7d8cd0a5cc25..0997ee408c05 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -42,13 +42,13 @@ if __name__ == '__main__':
return Value
def ValidatePcdName (Argument):
- if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+ if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
Message = '%s is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>' % (Argument)
raise argparse.ArgumentTypeError(Message)
return Argument
def ValidateGuidName (Argument):
- if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+ if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
Message = '%s is not a valid GUID C name' % (Argument)
raise argparse.ArgumentTypeError(Message)
return Argument
@@ -71,7 +71,7 @@ if __name__ == '__main__':
help = "Output filename for PCD value or PCD statement")
parser.add_argument("-p", "--pcd", dest = 'PcdName', type = ValidatePcdName,
help = "Name of the PCD in the form <PcdTokenSpaceGuidCName>.<PcdCName>")
- parser.add_argument("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD','HII'],
+ parser.add_argument("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD', 'HII'],
help = "PCD statement type (HII or VPD). Default is standard.")
parser.add_argument("-m", "--max-size", dest = 'MaxSize', type = ValidateUnsignedInteger,
help = "Maximum size of the PCD. Ignored with --type HII.")
@@ -85,7 +85,7 @@ if __name__ == '__main__':
help = "Increase output messages")
parser.add_argument("-q", "--quiet", dest = 'Quiet', action = "store_true",
help = "Reduce output messages")
- parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0,10)), default = 0,
+ parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0, 10)), default = 0,
help = "Set debug level")
#
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 3bc6a8897bcc..c9158800668d 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -190,7 +190,7 @@ def processLine(newline):
driverPrefixLen = len("Driver - ")
# get driver name
- if cmp(newline[0:driverPrefixLen],"Driver - ") == 0 :
+ if cmp(newline[0:driverPrefixLen], "Driver - ") == 0 :
driverlineList = newline.split(" ")
driverName = driverlineList[2]
#print "Checking : ", driverName
@@ -213,7 +213,7 @@ def processLine(newline):
else :
symbolsFile.symbolsTable[driverName].parse_debug_file (driverName, pdbName)
- elif cmp(newline,"") == 0 :
+ elif cmp(newline, "") == 0 :
driverName = ""
# check entry line
@@ -226,7 +226,7 @@ def processLine(newline):
rvaName = ""
symbolName = ""
- if cmp(rvaName,"") == 0 :
+ if cmp(rvaName, "") == 0 :
return newline
else :
return newline + symbolName
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 51d4adf08b60..211db566cb25 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -286,7 +286,7 @@ class GitDiffCheck:
if self.state == START:
if line.startswith('diff --git'):
self.state = PRE_PATCH
- self.filename = line[13:].split(' ',1)[0]
+ self.filename = line[13:].split(' ', 1)[0]
self.is_newfile = False
self.force_crlf = not self.filename.endswith('.sh')
elif len(line.rstrip()) != 0:
diff --git a/BaseTools/Scripts/RunMakefile.py b/BaseTools/Scripts/RunMakefile.py
index 48bc198c7671..6d0c4553c9eb 100644
--- a/BaseTools/Scripts/RunMakefile.py
+++ b/BaseTools/Scripts/RunMakefile.py
@@ -149,7 +149,7 @@ if __name__ == '__main__':
for Item in gArgs.Define:
if '=' not in Item[0]:
continue
- Item = Item[0].split('=',1)
+ Item = Item[0].split('=', 1)
CommandLine.append('%s="%s"' % (Item[0], Item[1]))
CommandLine.append('EXTRA_FLAGS="%s"' % (gArgs.Remaining))
CommandLine.append(gArgs.BuildType)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 18da411f83a0..0017f66e5ec8 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -46,7 +46,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
import InfSectionParser
import datetime
import hashlib
-from GenVar import VariableMgr,var_info
+from GenVar import VariableMgr, var_info
## Regular expression for splitting Dependency Expression string into tokens
gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
@@ -1286,7 +1286,7 @@ class PlatformAutoGen(AutoGen):
ShareFixedAtBuildPcdsSameValue = {}
for Module in LibAuto._ReferenceModules:
for Pcd in Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds:
- key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
+ key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
if key not in FixedAtBuildPcds:
ShareFixedAtBuildPcdsSameValue[key] = True
FixedAtBuildPcds[key] = Pcd.DefaultValue
@@ -1294,11 +1294,11 @@ class PlatformAutoGen(AutoGen):
if FixedAtBuildPcds[key] != Pcd.DefaultValue:
ShareFixedAtBuildPcdsSameValue[key] = False
for Pcd in LibAuto.FixedAtBuildPcds:
- key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
- if (Pcd.TokenCName,Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
+ key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+ if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
continue
else:
- DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)]
+ DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
if DscPcd.Type != "FixedAtBuild":
continue
if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
@@ -1318,12 +1318,12 @@ class PlatformAutoGen(AutoGen):
break
- VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(),self.DscBuildDataObj._GetSkuIds())
+ VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj._GetSkuIds())
VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
VariableInfo.SetVpdRegionOffset(VpdRegionBase)
Index = 0
for Pcd in DynamicPcdSet:
- pcdname = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
+ pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
for SkuName in Pcd.SkuInfoList:
Sku = Pcd.SkuInfoList[SkuName]
SkuId = Sku.SkuId
@@ -1333,11 +1333,11 @@ class PlatformAutoGen(AutoGen):
VariableGuidStructure = Sku.VariableGuidValue
VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
for StorageName in Sku.DefaultStoreDict:
- VariableInfo.append_variable(var_info(Index,pcdname,StorageName,SkuName, StringToArray(Sku.VariableName),VariableGuid, Sku.VariableOffset, Sku.VariableAttribute , Sku.HiiDefaultValue,Sku.DefaultStoreDict[StorageName],Pcd.DatumType))
+ VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName], Pcd.DatumType))
Index += 1
return VariableInfo
- def UpdateNVStoreMaxSize(self,OrgVpdFile):
+ def UpdateNVStoreMaxSize(self, OrgVpdFile):
if self.VariableInfo:
VpdMapFilePath = os.path.join(self.BuildDir, "FV", "%s.map" % self.Platform.VpdToolGuid)
PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
@@ -1350,7 +1350,7 @@ class PlatformAutoGen(AutoGen):
else:
EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
- NvStoreOffset = int(NvStoreOffset,16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
+ NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get("DEFAULT")
maxsize = self.VariableInfo.VpdRegionSize - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
@@ -1598,7 +1598,7 @@ class PlatformAutoGen(AutoGen):
VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
#Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
- PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer","gEfiMdeModulePkgTokenSpaceGuid"))
+ PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
if PcdNvStoreDfBuffer:
self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
vardump = self.VariableInfo.dump()
@@ -1625,10 +1625,10 @@ class PlatformAutoGen(AutoGen):
PcdValue = DefaultSku.DefaultValue
if PcdValue not in SkuValueMap:
SkuValueMap[PcdValue] = []
- VpdFile.Add(Pcd, 'DEFAULT',DefaultSku.VpdOffset)
+ VpdFile.Add(Pcd, 'DEFAULT', DefaultSku.VpdOffset)
SkuValueMap[PcdValue].append(DefaultSku)
- for (SkuName,Sku) in Pcd.SkuInfoList.items():
+ for (SkuName, Sku) in Pcd.SkuInfoList.items():
Sku.VpdOffset = Sku.VpdOffset.strip()
PcdValue = Sku.DefaultValue
if PcdValue == "":
@@ -1654,7 +1654,7 @@ class PlatformAutoGen(AutoGen):
EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
if PcdValue not in SkuValueMap:
SkuValueMap[PcdValue] = []
- VpdFile.Add(Pcd, SkuName,Sku.VpdOffset)
+ VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
SkuValueMap[PcdValue].append(Sku)
# if the offset of a VPD is *, then it need to be fixed up by third party tool.
if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
@@ -1686,9 +1686,9 @@ class PlatformAutoGen(AutoGen):
SkuObjList = DscPcdEntry.SkuInfoList.items()
DefaultSku = DscPcdEntry.SkuInfoList.get('DEFAULT')
if DefaultSku:
- defaultindex = SkuObjList.index(('DEFAULT',DefaultSku))
- SkuObjList[0],SkuObjList[defaultindex] = SkuObjList[defaultindex],SkuObjList[0]
- for (SkuName,Sku) in SkuObjList:
+ defaultindex = SkuObjList.index(('DEFAULT', DefaultSku))
+ SkuObjList[0], SkuObjList[defaultindex] = SkuObjList[defaultindex], SkuObjList[0]
+ for (SkuName, Sku) in SkuObjList:
Sku.VpdOffset = Sku.VpdOffset.strip()
# Need to iterate DEC pcd information to get the value & datumtype
@@ -1738,7 +1738,7 @@ class PlatformAutoGen(AutoGen):
EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
if PcdValue not in SkuValueMap:
SkuValueMap[PcdValue] = []
- VpdFile.Add(DscPcdEntry, SkuName,Sku.VpdOffset)
+ VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
SkuValueMap[PcdValue].append(Sku)
if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
NeedProcessVpdMapFile = True
@@ -1804,17 +1804,17 @@ class PlatformAutoGen(AutoGen):
self._DynamicPcdList.extend(list(UnicodePcdArray))
self._DynamicPcdList.extend(list(HiiPcdArray))
self._DynamicPcdList.extend(list(OtherPcdArray))
- allskuset = [(SkuName,Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName,Sku) in pcd.SkuInfoList.items()]
+ allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
for pcd in self._DynamicPcdList:
if len(pcd.SkuInfoList) == 1:
- for (SkuName,SkuId) in allskuset:
- if type(SkuId) in (str,unicode) and eval(SkuId) == 0 or SkuId == 0:
+ for (SkuName, SkuId) in allskuset:
+ if type(SkuId) in (str, unicode) and eval(SkuId) == 0 or SkuId == 0:
continue
pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList['DEFAULT'])
pcd.SkuInfoList[SkuName].SkuId = SkuId
self.AllPcdList = self._NonDynamicPcdList + self._DynamicPcdList
- def FixVpdOffset(self,VpdFile ):
+ def FixVpdOffset(self, VpdFile):
FvPath = os.path.join(self.BuildDir, "FV")
if not os.path.exists(FvPath):
try:
@@ -2076,7 +2076,7 @@ class PlatformAutoGen(AutoGen):
if self._NonDynamicPcdDict:
return self._NonDynamicPcdDict
for Pcd in self.NonDynamicPcdList:
- self._NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)] = Pcd
+ self._NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
return self._NonDynamicPcdDict
## Get list of non-dynamic PCDs
@@ -3887,7 +3887,7 @@ class ModuleAutoGen(AutoGen):
try:
fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
except:
- EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
+ EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
# Use a instance of StringIO to cache data
fStringIO = StringIO('')
@@ -3923,7 +3923,7 @@ class ModuleAutoGen(AutoGen):
fInputfile.write (fStringIO.getvalue())
except:
EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
- "file been locked or using by other applications." %UniVfrOffsetFileName,None)
+ "file been locked or using by other applications." %UniVfrOffsetFileName, None)
fStringIO.close ()
fInputfile.close ()
@@ -4370,7 +4370,7 @@ class ModuleAutoGen(AutoGen):
def CopyBinaryFiles(self):
for File in self.Module.Binaries:
SrcPath = File.Path
- DstPath = os.path.join(self.OutputDir , os.path.basename(SrcPath))
+ DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
CopyLongFilePath(SrcPath, DstPath)
## Create autogen code for the module and its dependent libraries
#
@@ -4521,7 +4521,7 @@ class ModuleAutoGen(AutoGen):
if SrcTimeStamp > DstTimeStamp:
return False
- with open(self.GetTimeStampPath(),'r') as f:
+ with open(self.GetTimeStampPath(), 'r') as f:
for source in f:
source = source.rstrip('\n')
if not os.path.exists(source):
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 8891b1b97d23..eb56d0e7c5a3 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -746,7 +746,7 @@ cleanlib:
if CmdName == 'Trim':
SecDepsFileList.append(os.path.join('$(DEBUG_DIR)', os.path.basename(OutputFile).replace('offset', 'efi')))
if OutputFile.endswith('.ui') or OutputFile.endswith('.ver'):
- SecDepsFileList.append(os.path.join('$(MODULE_DIR)','$(MODULE_FILE)'))
+ SecDepsFileList.append(os.path.join('$(MODULE_DIR)', '$(MODULE_FILE)'))
self.FfsOutputFileList.append((OutputFile, ' '.join(SecDepsFileList), SecCmdStr))
if len(SecDepsFileList) > 0:
self.ParseSecCmd(SecDepsFileList, CmdTuple)
@@ -864,7 +864,7 @@ cleanlib:
for Target in BuildTargets:
for i, SingleCommand in enumerate(BuildTargets[Target].Commands):
if FlagDict[Flag]['Macro'] in SingleCommand:
- BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)','').replace(FlagDict[Flag]['Macro'], RespMacro)
+ BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)', '').replace(FlagDict[Flag]['Macro'], RespMacro)
return RespDict
def ProcessBuildTargetList(self):
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index a989cb34dff3..85e6f44502a2 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -270,7 +270,7 @@ def toHex(s):
hv = '0'+hv
lst.append(hv)
if lst:
- return reduce(lambda x,y:x+y, lst)
+ return reduce(lambda x, y:x+y, lst)
else:
return 'empty'
## DbItemList
@@ -650,22 +650,22 @@ def StringArrayToList(StringArray):
#
def GetTokenTypeValue(TokenType):
TokenTypeDict = {
- "PCD_TYPE_SHIFT":28,
- "PCD_TYPE_DATA":(0x0 << 28),
- "PCD_TYPE_HII":(0x8 << 28),
- "PCD_TYPE_VPD":(0x4 << 28),
+ "PCD_TYPE_SHIFT": 28,
+ "PCD_TYPE_DATA": (0x0 << 28),
+ "PCD_TYPE_HII": (0x8 << 28),
+ "PCD_TYPE_VPD": (0x4 << 28),
# "PCD_TYPE_SKU_ENABLED":(0x2 << 28),
- "PCD_TYPE_STRING":(0x1 << 28),
+ "PCD_TYPE_STRING": (0x1 << 28),
- "PCD_DATUM_TYPE_SHIFT":24,
- "PCD_DATUM_TYPE_POINTER":(0x0 << 24),
- "PCD_DATUM_TYPE_UINT8":(0x1 << 24),
- "PCD_DATUM_TYPE_UINT16":(0x2 << 24),
- "PCD_DATUM_TYPE_UINT32":(0x4 << 24),
- "PCD_DATUM_TYPE_UINT64":(0x8 << 24),
+ "PCD_DATUM_TYPE_SHIFT": 24,
+ "PCD_DATUM_TYPE_POINTER": (0x0 << 24),
+ "PCD_DATUM_TYPE_UINT8": (0x1 << 24),
+ "PCD_DATUM_TYPE_UINT16": (0x2 << 24),
+ "PCD_DATUM_TYPE_UINT32": (0x4 << 24),
+ "PCD_DATUM_TYPE_UINT64": (0x8 << 24),
- "PCD_DATUM_TYPE_SHIFT2":20,
- "PCD_DATUM_TYPE_UINT8_BOOLEAN":(0x1 << 20 | 0x1 << 24),
+ "PCD_DATUM_TYPE_SHIFT2": 20,
+ "PCD_DATUM_TYPE_UINT8_BOOLEAN": (0x1 << 20 | 0x1 << 24),
}
return eval(TokenType, TokenTypeDict)
@@ -719,7 +719,7 @@ def BuildExDataBase(Dict):
DbPcdCNameTable = DbStringItemList(0, RawDataList = PcdCNameTableValue, LenList = PcdCNameLen)
PcdNameOffsetTable = Dict['PCD_NAME_OFFSET']
- DbPcdNameOffsetTable = DbItemList(4,RawDataList = PcdNameOffsetTable)
+ DbPcdNameOffsetTable = DbItemList(4, RawDataList = PcdNameOffsetTable)
SizeTableValue = zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_TABLE_CURRENT_LENGTH'])
DbSizeTableValue = DbSizeTableItemList(2, RawDataList = SizeTableValue)
@@ -754,16 +754,16 @@ def BuildExDataBase(Dict):
PcdTokenNumberMap = Dict['PCD_ORDER_TOKEN_NUMBER_MAP']
DbNameTotle = ["SkuidValue", "InitValueUint64", "VardefValueUint64", "InitValueUint32", "VardefValueUint32", "VpdHeadValue", "ExMapTable",
- "LocalTokenNumberTable", "GuidTable", "StringHeadValue", "PcdNameOffsetTable","VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
+ "LocalTokenNumberTable", "GuidTable", "StringHeadValue", "PcdNameOffsetTable", "VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
"SizeTableValue", "InitValueUint16", "VardefValueUint16", "InitValueUint8", "VardefValueUint8", "InitValueBoolean",
"VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"]
DbTotal = [SkuidValue, InitValueUint64, VardefValueUint64, InitValueUint32, VardefValueUint32, VpdHeadValue, ExMapTable,
- LocalTokenNumberTable, GuidTable, StringHeadValue, PcdNameOffsetTable,VariableTable, StringTableLen, PcdTokenTable,PcdCNameTable,
+ LocalTokenNumberTable, GuidTable, StringHeadValue, PcdNameOffsetTable, VariableTable, StringTableLen, PcdTokenTable, PcdCNameTable,
SizeTableValue, InitValueUint16, VardefValueUint16, InitValueUint8, VardefValueUint8, InitValueBoolean,
VardefValueBoolean, UnInitValueUint64, UnInitValueUint32, UnInitValueUint16, UnInitValueUint8, UnInitValueBoolean]
DbItemTotal = [DbSkuidValue, DbInitValueUint64, DbVardefValueUint64, DbInitValueUint32, DbVardefValueUint32, DbVpdHeadValue, DbExMapTable,
- DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue, DbPcdNameOffsetTable,DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
+ DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue, DbPcdNameOffsetTable, DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, DbInitValueUint8, DbVardefValueUint8, DbInitValueBoolean,
DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean]
@@ -822,7 +822,7 @@ def BuildExDataBase(Dict):
DbOffset += (8 - DbOffset % 8)
else:
assert(False)
- if isinstance(VariableRefTable[0],list):
+ if isinstance(VariableRefTable[0], list):
DbOffset += skuindex * 4
skuindex += 1
if DbIndex >= InitTableNum:
@@ -984,46 +984,46 @@ def CreatePcdDataBase(PcdDBData):
basedata = {}
if not PcdDBData:
return ""
- for skuname,skuid in PcdDBData:
- if len(PcdDBData[(skuname,skuid)][1]) != len(PcdDBData[("DEFAULT","0")][1]):
+ for skuname, skuid in PcdDBData:
+ if len(PcdDBData[(skuname, skuid)][1]) != len(PcdDBData[("DEFAULT", "0")][1]):
EdkLogger.ERROR("The size of each sku in one pcd are not same")
- for skuname,skuid in PcdDBData:
+ for skuname, skuid in PcdDBData:
if skuname == "DEFAULT":
continue
- delta[(skuname,skuid)] = [(index,data,hex(data)) for index,data in enumerate(PcdDBData[(skuname,skuid)][1]) if PcdDBData[(skuname,skuid)][1][index] != PcdDBData[("DEFAULT","0")][1][index]]
- basedata[(skuname,skuid)] = [(index,PcdDBData[("DEFAULT","0")][1][index],hex(PcdDBData[("DEFAULT","0")][1][index])) for index,data in enumerate(PcdDBData[(skuname,skuid)][1]) if PcdDBData[(skuname,skuid)][1][index] != PcdDBData[("DEFAULT","0")][1][index]]
- databasebuff = PcdDBData[("DEFAULT","0")][0]
+ delta[(skuname, skuid)] = [(index, data, hex(data)) for index, data in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[("DEFAULT", "0")][1][index]]
+ basedata[(skuname, skuid)] = [(index, PcdDBData[("DEFAULT", "0")][1][index], hex(PcdDBData[("DEFAULT", "0")][1][index])) for index, data in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[("DEFAULT", "0")][1][index]]
+ databasebuff = PcdDBData[("DEFAULT", "0")][0]
- for skuname,skuid in delta:
+ for skuname, skuid in delta:
# 8 byte align
if len(databasebuff) % 8 > 0:
for i in range(8 - (len(databasebuff) % 8)):
- databasebuff += pack("=B",0)
+ databasebuff += pack("=B", 0)
databasebuff += pack('=Q', int(skuid))
databasebuff += pack('=Q', 0)
- databasebuff += pack('=L', 8+8+4+4*len(delta[(skuname,skuid)]))
- for item in delta[(skuname,skuid)]:
- databasebuff += pack("=L",item[0])
- databasebuff = databasebuff[:-1] + pack("=B",item[1])
+ databasebuff += pack('=L', 8+8+4+4*len(delta[(skuname, skuid)]))
+ for item in delta[(skuname, skuid)]:
+ databasebuff += pack("=L", item[0])
+ databasebuff = databasebuff[:-1] + pack("=B", item[1])
totallen = len(databasebuff)
- totallenbuff = pack("=L",totallen)
+ totallenbuff = pack("=L", totallen)
newbuffer = databasebuff[:32]
for i in range(4):
newbuffer += totallenbuff[i]
- for i in range(36,totallen):
+ for i in range(36, totallen):
newbuffer += databasebuff[i]
return newbuffer
def CreateVarCheckBin(VarCheckTab):
- return VarCheckTab[('DEFAULT',"0")]
+ return VarCheckTab[('DEFAULT', "0")]
def CreateAutoGen(PcdDriverAutoGenData):
autogenC = TemplateString()
- for skuname,skuid in PcdDriverAutoGenData:
+ for skuname, skuid in PcdDriverAutoGenData:
autogenC.Append("//SKUID: %s" % skuname)
- autogenC.Append(PcdDriverAutoGenData[(skuname,skuid)][1].String)
- return (PcdDriverAutoGenData[(skuname,skuid)][0],autogenC)
-def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform,Phase):
- def prune_sku(pcd,skuname):
+ autogenC.Append(PcdDriverAutoGenData[(skuname, skuid)][1].String)
+ return (PcdDriverAutoGenData[(skuname, skuid)][0], autogenC)
+def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
+ def prune_sku(pcd, skuname):
new_pcd = copy.deepcopy(pcd)
new_pcd.SkuInfoList = {skuname:pcd.SkuInfoList[skuname]}
new_pcd.isinit = 'INIT'
@@ -1041,28 +1041,28 @@ def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform,Phase):
new_pcd.isinit = "UNINIT"
return new_pcd
DynamicPcds = Platform.DynamicPcdList
- DynamicPcdSet_Sku = {(SkuName,skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName,skuobj) in pcd.SkuInfoList.items() }
- for skuname,skuid in DynamicPcdSet_Sku:
- DynamicPcdSet_Sku[(skuname,skuid)] = [prune_sku(pcd,skuname) for pcd in DynamicPcds]
+ DynamicPcdSet_Sku = {(SkuName, skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName, skuobj) in pcd.SkuInfoList.items() }
+ for skuname, skuid in DynamicPcdSet_Sku:
+ DynamicPcdSet_Sku[(skuname, skuid)] = [prune_sku(pcd, skuname) for pcd in DynamicPcds]
PcdDBData = {}
PcdDriverAutoGenData = {}
VarCheckTableData = {}
if DynamicPcdSet_Sku:
- for skuname,skuid in DynamicPcdSet_Sku:
- AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform,DynamicPcdSet_Sku[(skuname,skuid)], Phase)
+ for skuname, skuid in DynamicPcdSet_Sku:
+ AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdSet_Sku[(skuname, skuid)], Phase)
final_data = ()
for item in PcdDbBuffer:
- final_data += unpack("B",item)
- PcdDBData[(skuname,skuid)] = (PcdDbBuffer, final_data)
- PcdDriverAutoGenData[(skuname,skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
- VarCheckTableData[(skuname,skuid)] = VarCheckTab
+ final_data += unpack("B", item)
+ PcdDBData[(skuname, skuid)] = (PcdDbBuffer, final_data)
+ PcdDriverAutoGenData[(skuname, skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
+ VarCheckTableData[(skuname, skuid)] = VarCheckTab
if Platform.Platform.VarCheckFlag:
dest = os.path.join(Platform.BuildDir, 'FV')
VarCheckTable = CreateVarCheckBin(VarCheckTableData)
VarCheckTable.dump(dest, Phase)
AdditionalAutoGenH, AdditionalAutoGenC = CreateAutoGen(PcdDriverAutoGenData)
else:
- AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform,{}, Phase)
+ AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, {}, Phase)
PcdDbBuffer = CreatePcdDataBase(PcdDBData)
return AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer
@@ -1103,20 +1103,20 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['PCD_INFO_FLAG'] = Platform.Platform.PcdInfoFlag
- for DatumType in ['UINT64','UINT32','UINT16','UINT8','BOOLEAN', "VOID*"]:
+ for DatumType in ['UINT64', 'UINT32', 'UINT16', 'UINT8', 'BOOLEAN', "VOID*"]:
Dict['VARDEF_CNAME_' + DatumType] = []
Dict['VARDEF_GUID_' + DatumType] = []
Dict['VARDEF_SKUID_' + DatumType] = []
Dict['VARDEF_VALUE_' + DatumType] = []
Dict['VARDEF_DB_VALUE_' + DatumType] = []
- for Init in ['INIT','UNINIT']:
+ for Init in ['INIT', 'UNINIT']:
Dict[Init+'_CNAME_DECL_' + DatumType] = []
Dict[Init+'_GUID_DECL_' + DatumType] = []
Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
Dict[Init+'_VALUE_' + DatumType] = []
Dict[Init+'_DB_VALUE_'+DatumType] = []
- for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']:
+ for Type in ['STRING_HEAD', 'VPD_HEAD', 'VARIABLE_HEAD']:
Dict[Type + '_CNAME_DECL'] = []
Dict[Type + '_GUID_DECL'] = []
Dict[Type + '_NUMSKUS_DECL'] = []
@@ -1284,7 +1284,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['STRING_TABLE_INDEX'].append('')
else:
Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
- VarNameSize = len(VariableNameStructure.replace(',',' ').split())
+ VarNameSize = len(VariableNameStructure.replace(',', ' ').split())
Dict['STRING_TABLE_LENGTH'].append(VarNameSize )
Dict['STRING_TABLE_VALUE'].append(VariableNameStructure)
StringHeadOffsetList.append(str(StringTableSize) + 'U')
@@ -1292,7 +1292,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
VarStringDbOffsetList.append(StringTableSize)
Dict['STRING_DB_VALUE'].append(VarStringDbOffsetList)
StringTableIndex += 1
- StringTableSize += len(VariableNameStructure.replace(',',' ').split())
+ StringTableSize += len(VariableNameStructure.replace(',', ' ').split())
VariableHeadStringIndex = 0
for Index in range(Dict['STRING_TABLE_VALUE'].index(VariableNameStructure)):
VariableHeadStringIndex += Dict['STRING_TABLE_LENGTH'][Index]
@@ -1331,7 +1331,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
elif Pcd.DatumType in ("UINT32", "UINT16", "UINT8"):
Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
elif Pcd.DatumType == "BOOLEAN":
- if eval(Sku.HiiDefaultValue) in [1,0]:
+ if eval(Sku.HiiDefaultValue) in [1, 0]:
Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str(eval(Sku.HiiDefaultValue)) + "U")
else:
Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
@@ -1381,7 +1381,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
if Sku.DefaultValue[0] == 'L':
DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
- Size = len(DefaultValueBinStructure.replace(',',' ').split())
+ Size = len(DefaultValueBinStructure.replace(',', ' ').split())
Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
elif Sku.DefaultValue[0] == '"':
DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
@@ -1696,7 +1696,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
# print Phase
Buffer = BuildExDataBase(Dict)
- return AutoGenH, AutoGenC, Buffer,VarCheckTab
+ return AutoGenH, AutoGenC, Buffer, VarCheckTab
def GetOrderedDynamicPcdList(DynamicPcdList, PcdTokenNumberList):
ReorderedDyPcdList = [None for i in range(len(DynamicPcdList))]
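Note: the 'final_data += unpack("B", item)' loops above are another python2-only pattern.
Iterating a packed buffer yields 1-byte strings on python2 but ints on python3, where the
unpack() call would then raise. A version-neutral sketch (illustration only, not in this
patch):

    def BufferToByteTuple(Buffer):
        # bytearray() accepts str on python2 and bytes on python3, and iterating
        # it yields integers 0-255 on both, so no unpack() call is needed.
        return tuple(bytearray(Buffer))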
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index b82d7e4d2d37..a0b3497207b7 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -15,7 +15,7 @@
# Import Modules
#
from builtins import range
-from struct import pack,unpack
+from struct import pack, unpack
import collections
import copy
from Common.VariableAttributes import VariableAttributes
@@ -49,7 +49,7 @@ def PackGUID(Guid):
return GuidBuffer
class VariableMgr(object):
- def __init__(self, DefaultStoreMap,SkuIdMap):
+ def __init__(self, DefaultStoreMap, SkuIdMap):
self.VarInfo = []
self.DefaultStoreMap = DefaultStoreMap
self.SkuIdMap = SkuIdMap
@@ -59,19 +59,19 @@ class VariableMgr(object):
self.VarDefaultBuff = None
self.VarDeltaBuff = None
- def append_variable(self,uefi_var):
+ def append_variable(self, uefi_var):
self.VarInfo.append(uefi_var)
- def SetVpdRegionMaxSize(self,maxsize):
+ def SetVpdRegionMaxSize(self, maxsize):
self.VpdRegionSize = maxsize
- def SetVpdRegionOffset(self,vpdoffset):
+ def SetVpdRegionOffset(self, vpdoffset):
self.VpdRegionOffset = vpdoffset
- def PatchNVStoreDefaultMaxSize(self,maxsize):
+ def PatchNVStoreDefaultMaxSize(self, maxsize):
if not self.NVHeaderBuff:
return ""
- self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q",maxsize)
+ self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q", maxsize)
default_var_bin = self.format_data(self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
value_str = "{"
default_var_bin_strip = [ data.strip("""'""") for data in default_var_bin]
@@ -118,7 +118,7 @@ class VariableMgr(object):
for item in self.VarInfo:
if item.pcdindex not in indexedvarinfo:
indexedvarinfo[item.pcdindex] = dict()
- indexedvarinfo[item.pcdindex][(item.skuname,item.defaultstoragename)] = item
+ indexedvarinfo[item.pcdindex][(item.skuname, item.defaultstoragename)] = item
for index in indexedvarinfo:
sku_var_info = indexedvarinfo[index]
@@ -126,44 +126,44 @@ class VariableMgr(object):
default_data_buffer = ""
others_data_buffer = ""
tail = None
- default_sku_default = indexedvarinfo.get(index).get(("DEFAULT","STANDARD"))
+ default_sku_default = indexedvarinfo.get(index).get(("DEFAULT", "STANDARD"))
- if default_sku_default.data_type not in ["UINT8","UINT16","UINT32","UINT64","BOOLEAN"]:
+ if default_sku_default.data_type not in ["UINT8", "UINT16", "UINT32", "UINT64", "BOOLEAN"]:
var_max_len = max([len(var_item.default_value.split(",")) for var_item in sku_var_info.values()])
if len(default_sku_default.default_value.split(",")) < var_max_len:
tail = ",".join([ "0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(",")))])
- default_data_buffer = self.PACK_VARIABLES_DATA(default_sku_default.default_value,default_sku_default.data_type,tail)
+ default_data_buffer = self.PACK_VARIABLES_DATA(default_sku_default.default_value, default_sku_default.data_type, tail)
default_data_array = ()
for item in default_data_buffer:
- default_data_array += unpack("B",item)
+ default_data_array += unpack("B", item)
- if ("DEFAULT","STANDARD") not in var_data:
- var_data[("DEFAULT","STANDARD")] = collections.OrderedDict()
- var_data[("DEFAULT","STANDARD")][index] = (default_data_buffer,sku_var_info[("DEFAULT","STANDARD")])
+ if ("DEFAULT", "STANDARD") not in var_data:
+ var_data[("DEFAULT", "STANDARD")] = collections.OrderedDict()
+ var_data[("DEFAULT", "STANDARD")][index] = (default_data_buffer, sku_var_info[("DEFAULT", "STANDARD")])
- for (skuid,defaultstoragename) in indexedvarinfo.get(index):
+ for (skuid, defaultstoragename) in indexedvarinfo.get(index):
tail = None
- if (skuid,defaultstoragename) == ("DEFAULT","STANDARD"):
+ if (skuid, defaultstoragename) == ("DEFAULT", "STANDARD"):
continue
- other_sku_other = indexedvarinfo.get(index).get((skuid,defaultstoragename))
+ other_sku_other = indexedvarinfo.get(index).get((skuid, defaultstoragename))
- if default_sku_default.data_type not in ["UINT8","UINT16","UINT32","UINT64","BOOLEAN"]:
+ if default_sku_default.data_type not in ["UINT8", "UINT16", "UINT32", "UINT64", "BOOLEAN"]:
if len(other_sku_other.default_value.split(",")) < var_max_len:
tail = ",".join([ "0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(",")))])
- others_data_buffer = self.PACK_VARIABLES_DATA(other_sku_other.default_value,other_sku_other.data_type,tail)
+ others_data_buffer = self.PACK_VARIABLES_DATA(other_sku_other.default_value, other_sku_other.data_type, tail)
others_data_array = ()
for item in others_data_buffer:
- others_data_array += unpack("B",item)
+ others_data_array += unpack("B", item)
data_delta = self.calculate_delta(default_data_array, others_data_array)
- if (skuid,defaultstoragename) not in var_data:
- var_data[(skuid,defaultstoragename)] = collections.OrderedDict()
- var_data[(skuid,defaultstoragename)][index] = (data_delta,sku_var_info[(skuid,defaultstoragename)])
+ if (skuid, defaultstoragename) not in var_data:
+ var_data[(skuid, defaultstoragename)] = collections.OrderedDict()
+ var_data[(skuid, defaultstoragename)][index] = (data_delta, sku_var_info[(skuid, defaultstoragename)])
return var_data
def new_process_varinfo(self):
@@ -174,17 +174,17 @@ class VariableMgr(object):
if not var_data:
return []
- pcds_default_data = var_data.get(("DEFAULT","STANDARD"),{})
+ pcds_default_data = var_data.get(("DEFAULT", "STANDARD"), {})
NvStoreDataBuffer = ""
var_data_offset = collections.OrderedDict()
offset = NvStorageHeaderSize
- for default_data,default_info in pcds_default_data.values():
+ for default_data, default_info in pcds_default_data.values():
var_name_buffer = self.PACK_VARIABLE_NAME(default_info.var_name)
vendorguid = default_info.var_guid.split('-')
if default_info.var_attribute:
- var_attr_value,_ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
+ var_attr_value, _ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
else:
var_attr_value = 0x07
@@ -203,22 +203,22 @@ class VariableMgr(object):
nv_default_part = self.AlignData(self.PACK_DEFAULT_DATA(0, 0, self.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
data_delta_structure_buffer = ""
- for skuname,defaultstore in var_data:
- if (skuname,defaultstore) == ("DEFAULT","STANDARD"):
+ for skuname, defaultstore in var_data:
+ if (skuname, defaultstore) == ("DEFAULT", "STANDARD"):
continue
- pcds_sku_data = var_data.get((skuname,defaultstore))
+ pcds_sku_data = var_data.get((skuname, defaultstore))
delta_data_set = []
for pcdindex in pcds_sku_data:
offset = var_data_offset[pcdindex]
- delta_data,_ = pcds_sku_data[pcdindex]
+ delta_data, _ = pcds_sku_data[pcdindex]
delta_data = [(item[0] + offset, item[1]) for item in delta_data]
delta_data_set.extend(delta_data)
- data_delta_structure_buffer += self.AlignData(self.PACK_DELTA_DATA(skuname,defaultstore,delta_data_set), 8)
+ data_delta_structure_buffer += self.AlignData(self.PACK_DELTA_DATA(skuname, defaultstore, delta_data_set), 8)
size = len(nv_default_part + data_delta_structure_buffer) + 16
maxsize = self.VpdRegionSize if self.VpdRegionSize else size
- NV_Store_Default_Header = self.PACK_NV_STORE_DEFAULT_HEADER(size,maxsize)
+ NV_Store_Default_Header = self.PACK_NV_STORE_DEFAULT_HEADER(size, maxsize)
self.NVHeaderBuff = NV_Store_Default_Header
self.VarDefaultBuff =nv_default_part
@@ -226,14 +226,14 @@ class VariableMgr(object):
return self.format_data(NV_Store_Default_Header + nv_default_part + data_delta_structure_buffer)
- def format_data(self,data):
+ def format_data(self, data):
return [hex(item) for item in self.unpack_data(data)]
- def unpack_data(self,data):
+ def unpack_data(self, data):
final_data = ()
for item in data:
- final_data += unpack("B",item)
+ final_data += unpack("B", item)
return final_data
def calculate_delta(self, default, theother):
@@ -242,7 +242,7 @@ class VariableMgr(object):
data_delta = []
for i in range(len(default)):
if default[i] != theother[i]:
- data_delta.append((i,theother[i]))
+ data_delta.append((i, theother[i]))
return data_delta
def dump(self):
@@ -256,40 +256,40 @@ class VariableMgr(object):
return value_str
return ""
- def PACK_VARIABLE_STORE_HEADER(self,size):
+ def PACK_VARIABLE_STORE_HEADER(self, size):
#Signature: gEfiVariableGuid
Guid = "{ 0xddcf3616, 0x3275, 0x4164, { 0x98, 0xb6, 0xfe, 0x85, 0x70, 0x7f, 0xfe, 0x7d }}"
Guid = GuidStructureStringToGuidString(Guid)
GuidBuffer = PackGUID(Guid.split('-'))
- SizeBuffer = pack('=L',size)
- FormatBuffer = pack('=B',0x5A)
- StateBuffer = pack('=B',0xFE)
- reservedBuffer = pack('=H',0)
- reservedBuffer += pack('=L',0)
+ SizeBuffer = pack('=L', size)
+ FormatBuffer = pack('=B', 0x5A)
+ StateBuffer = pack('=B', 0xFE)
+ reservedBuffer = pack('=H', 0)
+ reservedBuffer += pack('=L', 0)
return GuidBuffer + SizeBuffer + FormatBuffer + StateBuffer + reservedBuffer
- def PACK_NV_STORE_DEFAULT_HEADER(self,size,maxsize):
- Signature = pack('=B',ord('N'))
- Signature += pack("=B",ord('S'))
- Signature += pack("=B",ord('D'))
- Signature += pack("=B",ord('B'))
+ def PACK_NV_STORE_DEFAULT_HEADER(self, size, maxsize):
+ Signature = pack('=B', ord('N'))
+ Signature += pack("=B", ord('S'))
+ Signature += pack("=B", ord('D'))
+ Signature += pack("=B", ord('B'))
- SizeBuffer = pack("=L",size)
- MaxSizeBuffer = pack("=Q",maxsize)
+ SizeBuffer = pack("=L", size)
+ MaxSizeBuffer = pack("=Q", maxsize)
return Signature + SizeBuffer + MaxSizeBuffer
- def PACK_VARIABLE_HEADER(self,attribute,namesize,datasize,vendorguid):
+ def PACK_VARIABLE_HEADER(self, attribute, namesize, datasize, vendorguid):
- Buffer = pack('=H',0x55AA) # pack StartID
- Buffer += pack('=B',0x3F) # pack State
- Buffer += pack('=B',0) # pack reserved
+ Buffer = pack('=H', 0x55AA) # pack StartID
+ Buffer += pack('=B', 0x3F) # pack State
+ Buffer += pack('=B', 0) # pack reserved
- Buffer += pack('=L',attribute)
- Buffer += pack('=L',namesize)
- Buffer += pack('=L',datasize)
+ Buffer += pack('=L', attribute)
+ Buffer += pack('=L', namesize)
+ Buffer += pack('=L', datasize)
Buffer += PackGUID(vendorguid)
@@ -300,63 +300,63 @@ class VariableMgr(object):
data_len = 0
if data_type == "VOID*":
for value_char in var_value.strip("{").strip("}").split(","):
- Buffer += pack("=B",int(value_char,16))
+ Buffer += pack("=B", int(value_char, 16))
data_len += len(var_value.split(","))
if tail:
for value_char in tail.split(","):
- Buffer += pack("=B",int(value_char,16))
+ Buffer += pack("=B", int(value_char, 16))
data_len += len(tail.split(","))
elif data_type == "BOOLEAN":
- Buffer += pack("=B",True) if var_value.upper() == "TRUE" else pack("=B",False)
+ Buffer += pack("=B", True) if var_value.upper() == "TRUE" else pack("=B", False)
data_len += 1
elif data_type == "UINT8":
- Buffer += pack("=B",GetIntegerValue(var_value))
+ Buffer += pack("=B", GetIntegerValue(var_value))
data_len += 1
elif data_type == "UINT16":
- Buffer += pack("=H",GetIntegerValue(var_value))
+ Buffer += pack("=H", GetIntegerValue(var_value))
data_len += 2
elif data_type == "UINT32":
- Buffer += pack("=L",GetIntegerValue(var_value))
+ Buffer += pack("=L", GetIntegerValue(var_value))
data_len += 4
elif data_type == "UINT64":
- Buffer += pack("=Q",GetIntegerValue(var_value))
+ Buffer += pack("=Q", GetIntegerValue(var_value))
data_len += 8
return Buffer
- def PACK_DEFAULT_DATA(self, defaultstoragename,skuid,var_value):
+ def PACK_DEFAULT_DATA(self, defaultstoragename, skuid, var_value):
Buffer = ""
- Buffer += pack("=L",4+8+8)
- Buffer += pack("=Q",int(skuid))
- Buffer += pack("=Q",int(defaultstoragename))
+ Buffer += pack("=L", 4+8+8)
+ Buffer += pack("=Q", int(skuid))
+ Buffer += pack("=Q", int(defaultstoragename))
for item in var_value:
- Buffer += pack("=B",item)
+ Buffer += pack("=B", item)
- Buffer = pack("=L",len(Buffer)+4) + Buffer
+ Buffer = pack("=L", len(Buffer)+4) + Buffer
return Buffer
- def GetSkuId(self,skuname):
+ def GetSkuId(self, skuname):
if skuname not in self.SkuIdMap:
return None
return self.SkuIdMap.get(skuname)[0]
- def GetDefaultStoreId(self,dname):
+ def GetDefaultStoreId(self, dname):
if dname not in self.DefaultStoreMap:
return None
return self.DefaultStoreMap.get(dname)[0]
- def PACK_DELTA_DATA(self,skuname,defaultstoragename,delta_list):
+ def PACK_DELTA_DATA(self, skuname, defaultstoragename, delta_list):
skuid = self.GetSkuId(skuname)
defaultstorageid = self.GetDefaultStoreId(defaultstoragename)
Buffer = ""
- Buffer += pack("=L",4+8+8)
- Buffer += pack("=Q",int(skuid))
- Buffer += pack("=Q",int(defaultstorageid))
- for (delta_offset,value) in delta_list:
- Buffer += pack("=L",delta_offset)
- Buffer = Buffer[:-1] + pack("=B",value)
+ Buffer += pack("=L", 4+8+8)
+ Buffer += pack("=Q", int(skuid))
+ Buffer += pack("=Q", int(defaultstorageid))
+ for (delta_offset, value) in delta_list:
+ Buffer += pack("=L", delta_offset)
+ Buffer = Buffer[:-1] + pack("=B", value)
- Buffer = pack("=L",len(Buffer) + 4) + Buffer
+ Buffer = pack("=L", len(Buffer) + 4) + Buffer
return Buffer
@@ -364,13 +364,13 @@ class VariableMgr(object):
mybuffer = data
if (len(data) % align) > 0:
for i in range(align - (len(data) % align)):
- mybuffer += pack("=B",0)
+ mybuffer += pack("=B", 0)
return mybuffer
def PACK_VARIABLE_NAME(self, var_name):
Buffer = ""
for name_char in var_name.strip("{").strip("}").split(","):
- Buffer += pack("=B",int(name_char,16))
+ Buffer += pack("=B", int(name_char, 16))
return Buffer
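Note: VariableMgr assembles its buffers by concatenating pack() output onto an empty
str (""), which only works on python2 because pack() returns bytes. When this file is
eventually ported, those seeds will have to become b"" (or bytearray). As a sketch only,
PACK_NV_STORE_DEFAULT_HEADER could then reduce to:

    from struct import pack

    def PackNvStoreDefaultHeader(size, maxsize):
        # Same layout as PACK_NV_STORE_DEFAULT_HEADER: 'NSDB' signature,
        # 32-bit total size, 64-bit maximum size, native byte order.
        return b"NSDB" + pack("=L", size) + pack("=Q", maxsize)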
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index daf11612d83b..1bb37d744ec9 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -350,7 +350,7 @@ class GenVPD :
#
# Enhanced for support "|" character in the string.
#
- ValueList = ['', '', '', '','']
+ ValueList = ['', '', '', '', '']
ValueRe = re.compile(r'\s*L?\".*\|.*\"\s*$')
PtrValue = ValueRe.findall(line)
@@ -400,7 +400,7 @@ class GenVPD :
count = 0
for line in self.FileLinesList:
if line != None :
- PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4],line[5], self.InputFileName)
+ PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4], line[5], self.InputFileName)
# Strip the space char
PCD.PcdCName = PCD.PcdCName.strip(' ')
PCD.SkuId = PCD.SkuId.strip(' ')
@@ -514,10 +514,10 @@ class GenVPD :
index =0
for pcd in self.PcdUnknownOffsetList:
index += 1
- if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid","PcdNvStoreDefaultValueBuffer")):
+ if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid", "PcdNvStoreDefaultValueBuffer")):
if index != len(self.PcdUnknownOffsetList):
for i in range(len(self.PcdUnknownOffsetList) - index):
- self.PcdUnknownOffsetList[index+i -1 ] , self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i] , self.PcdUnknownOffsetList[index+i -1]
+ self.PcdUnknownOffsetList[index+i -1 ], self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i], self.PcdUnknownOffsetList[index+i -1]
#
# Process all Offset value are "*"
@@ -598,7 +598,7 @@ class GenVPD :
eachUnfixedPcd.PcdOffset = str(hex(LastOffset))
eachUnfixedPcd.PcdBinOffset = LastOffset
# Insert this pcd into fixed offset pcd list.
- self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount,eachUnfixedPcd)
+ self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount, eachUnfixedPcd)
# Delete the item's offset that has been fixed and added into fixed offset list
self.PcdUnknownOffsetList.pop(countOfUnfixedList)
@@ -686,7 +686,7 @@ class GenVPD :
for eachPcd in self.PcdFixedOffsetSizeList :
# write map file
try :
- fMapFile.write("%s | %s | %s | %s | %s \n" % (eachPcd.PcdCName, eachPcd.SkuId,eachPcd.PcdOffset, eachPcd.PcdSize,eachPcd.PcdUnpackValue))
+ fMapFile.write("%s | %s | %s | %s | %s \n" % (eachPcd.PcdCName, eachPcd.SkuId, eachPcd.PcdOffset, eachPcd.PcdSize, eachPcd.PcdUnpackValue))
except:
EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 0bc2306ea61a..d69908dabfec 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -497,8 +497,8 @@ PCDS_DYNAMICEX_DEFAULT = "PcdsDynamicExDefault"
PCDS_DYNAMICEX_VPD = "PcdsDynamicExVpd"
PCDS_DYNAMICEX_HII = "PcdsDynamicExHii"
-SECTIONS_HAVE_ITEM_PCD = [PCDS_DYNAMIC_DEFAULT.upper(),PCDS_DYNAMIC_VPD.upper(),PCDS_DYNAMIC_HII.upper(), \
- PCDS_DYNAMICEX_DEFAULT.upper(),PCDS_DYNAMICEX_VPD.upper(),PCDS_DYNAMICEX_HII.upper()]
+SECTIONS_HAVE_ITEM_PCD = [PCDS_DYNAMIC_DEFAULT.upper(), PCDS_DYNAMIC_VPD.upper(), PCDS_DYNAMIC_HII.upper(), \
+ PCDS_DYNAMICEX_DEFAULT.upper(), PCDS_DYNAMICEX_VPD.upper(), PCDS_DYNAMICEX_HII.upper()]
# Section allowed to have items after arch
SECTIONS_HAVE_ITEM_AFTER_ARCH = [TAB_LIBRARY_CLASSES.upper(), TAB_DEPEX.upper(), TAB_USER_EXTENSIONS.upper(),
PCDS_DYNAMIC_DEFAULT.upper(),
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index f42d247cad33..e6abc1f036ac 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -1307,7 +1307,7 @@ class Dsc(DscObject):
# Parse '!else'
#
if LineValue.upper().find(TAB_ELSE.upper()) > -1:
- Key = IfDefList[-1][0].split(' ' , 1)[0].strip()
+ Key = IfDefList[-1][0].split(' ', 1)[0].strip()
self.InsertConditionalStatement(Filename, FileID, Model, IfDefList, StartLine, Arch)
IfDefList.append((Key, StartLine, MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE))
continue
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspace.py b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
index ed85e4ee0b06..52f63ae53df8 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspace.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
@@ -114,7 +114,7 @@ class EdkIIWorkspace:
# @retval string The full path filename
#
def WorkspaceFile(self, FileName):
- return os.path.realpath(mws.join(self.WorkspaceDir,FileName))
+ return os.path.realpath(mws.join(self.WorkspaceDir, FileName))
## Convert to a real path filename
#
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index f7dbb29ee882..90ef92a14f41 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -164,7 +164,7 @@ class ValueExpression(object):
if Oprand1[0] in ['"', "'"] or Oprand1.startswith('L"') or Oprand1.startswith("L'")or Oprand1.startswith('UINT'):
Oprand1, Size = ParseFieldValue(Oprand1)
else:
- Oprand1,Size = ParseFieldValue('"' + Oprand1 + '"')
+ Oprand1, Size = ParseFieldValue('"' + Oprand1 + '"')
if type(Oprand2) == type(''):
if Oprand2[0] in ['"', "'"] or Oprand2.startswith('L"') or Oprand2.startswith("L'") or Oprand2.startswith('UINT'):
Oprand2, Size = ParseFieldValue(Oprand2)
@@ -493,7 +493,7 @@ class ValueExpression(object):
IsArray = IsGuid = False
if len(Token.split(',')) == 11 and len(Token.split(',{')) == 2 \
and len(Token.split('},')) == 1:
- HexLen = [11,6,6,5,4,4,4,4,4,4,6]
+ HexLen = [11, 6, 6, 5, 4, 4, 4, 4, 4, 4, 6]
HexList= Token.split(',')
if HexList[3].startswith('{') and \
not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
@@ -688,7 +688,7 @@ class ValueExpression(object):
# Parse operator
def _GetOperator(self):
self.__SkipWS()
- LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?',':']
+ LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?', ':']
self._Token = ''
Expr = self._Expr[self._Idx:]
diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/Source/Python/Common/FdfParserLite.py
index f2741616c46f..6b7612303730 100644
--- a/BaseTools/Source/Python/Common/FdfParserLite.py
+++ b/BaseTools/Source/Python/Common/FdfParserLite.py
@@ -2341,7 +2341,7 @@ class FdfParser(object):
AlignValue = None
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
AlignValue = self.__Token
@@ -2610,7 +2610,7 @@ class FdfParser(object):
AlignValue = None
if self.__GetAlignment():
- if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
AlignValue = self.__Token
@@ -2927,7 +2927,7 @@ class FdfParser(object):
AlignValue = ""
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment At Line ", self.FileName, self.CurrentLineNumber)
AlignValue = self.__Token
@@ -2992,7 +2992,7 @@ class FdfParser(object):
CheckSum = True
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment At Line ", self.FileName, self.CurrentLineNumber)
if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
@@ -3067,7 +3067,7 @@ class FdfParser(object):
FvImageSectionObj.FvFileType = self.__Token
if self.__GetAlignment():
- if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment At Line ", self.FileName, self.CurrentLineNumber)
FvImageSectionObj.Alignment = self.__Token
@@ -3135,7 +3135,7 @@ class FdfParser(object):
EfiSectionObj.BuildNum = self.__Token
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
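Note: the alignment whitelist whose spacing is fixed above is repeated in several
FdfParser methods, some with "Auto" and some without. A later cleanup could hoist it
into module-level constants so the list only has to be maintained in one place; a
possible shape (names are illustrative, not part of this patch):

    BASE_ALIGNMENTS = ("8", "16", "32", "64", "128", "512",
                       "1K", "4K", "32K", "64K", "128K",
                       "256K", "512K", "1M", "2M", "4M", "8M", "16M")
    SECTION_ALIGNMENTS = ("Auto",) + BASE_ALIGNMENTS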
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 6878522d59d5..10cb95559822 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -125,7 +125,7 @@ def _parseForGCC(lines, efifilepath, varnames):
if Str:
m = re.match('^([\da-fA-Fx]+) +([\da-fA-Fx]+)', Str.strip())
if m != None:
- varoffset.append((varname, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
+ varoffset.append((varname, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
if not varoffset:
return []
@@ -1475,15 +1475,15 @@ def AnalyzePcdExpression(Setting):
return FieldList
def ParseDevPathValue (Value):
- DevPathList = [ "Path","HardwarePath","Pci","PcCard","MemoryMapped","VenHw","Ctrl","BMC","AcpiPath","Acpi","PciRoot",
- "PcieRoot","Floppy","Keyboard","Serial","ParallelPort","AcpiEx","AcpiExp","AcpiAdr","Msg","Ata","Scsi",
- "Fibre","FibreEx","I1394","USB","I2O","Infiniband","VenMsg","VenPcAnsi","VenVt100","VenVt100Plus",
- "VenUtf8","UartFlowCtrl","SAS","SasEx","NVMe","UFS","SD","eMMC","DebugPort","MAC","IPv4","IPv6","Uart",
- "UsbClass","UsbAudio","UsbCDCControl","UsbHID","UsbImage","UsbPrinter","UsbMassStorage","UsbHub",
- "UsbCDCData","UsbSmartCard","UsbVideo","UsbDiagnostic","UsbWireless","UsbDeviceFirmwareUpdate",
- "UsbIrdaBridge","UsbTestAndMeasurement","UsbWwid","Unit","iSCSI","Vlan","Uri","Bluetooth","Wi-Fi",
- "MediaPath","HD","CDROM","VenMedia","Media","Fv","FvFile","Offset","RamDisk","VirtualDisk","VirtualCD",
- "PersistentVirtualDisk","PersistentVirtualCD","BbsPath","BBS","Sata" ]
+ DevPathList = [ "Path", "HardwarePath", "Pci", "PcCard", "MemoryMapped", "VenHw", "Ctrl", "BMC", "AcpiPath", "Acpi", "PciRoot",
+ "PcieRoot", "Floppy", "Keyboard", "Serial", "ParallelPort", "AcpiEx", "AcpiExp", "AcpiAdr", "Msg", "Ata", "Scsi",
+ "Fibre", "FibreEx", "I1394", "USB", "I2O", "Infiniband", "VenMsg", "VenPcAnsi", "VenVt100", "VenVt100Plus",
+ "VenUtf8", "UartFlowCtrl", "SAS", "SasEx", "NVMe", "UFS", "SD", "eMMC", "DebugPort", "MAC", "IPv4", "IPv6", "Uart",
+ "UsbClass", "UsbAudio", "UsbCDCControl", "UsbHID", "UsbImage", "UsbPrinter", "UsbMassStorage", "UsbHub",
+ "UsbCDCData", "UsbSmartCard", "UsbVideo", "UsbDiagnostic", "UsbWireless", "UsbDeviceFirmwareUpdate",
+ "UsbIrdaBridge", "UsbTestAndMeasurement", "UsbWwid", "Unit", "iSCSI", "Vlan", "Uri", "Bluetooth", "Wi-Fi",
+ "MediaPath", "HD", "CDROM", "VenMedia", "Media", "Fv", "FvFile", "Offset", "RamDisk", "VirtualDisk", "VirtualCD",
+ "PersistentVirtualDisk", "PersistentVirtualCD", "BbsPath", "BBS", "Sata" ]
if '\\' in Value:
Value.replace('\\', '/').replace(' ', '')
for Item in Value.split('/'):
@@ -1665,7 +1665,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
# Value, Size = ParseFieldValue(Value)
if Size:
try:
- int(Size,16) if Size.upper().startswith("0X") else int(Size)
+ int(Size, 16) if Size.upper().startswith("0X") else int(Size)
except:
IsValid = False
Size = -1
@@ -1694,7 +1694,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
if Size:
try:
- int(Size,16) if Size.upper().startswith("0X") else int(Size)
+ int(Size, 16) if Size.upper().startswith("0X") else int(Size)
except:
IsValid = False
Size = -1
@@ -1716,7 +1716,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
IsValid = (len(FieldList) <= 3)
if Size:
try:
- int(Size,16) if Size.upper().startswith("0X") else int(Size)
+ int(Size, 16) if Size.upper().startswith("0X") else int(Size)
except:
IsValid = False
Size = -1
@@ -1920,7 +1920,7 @@ def ConvertStringToByteArray(Value):
Value = eval(Value) # translate escape character
NewValue = '{'
- for Index in range(0,len(Value)):
+ for Index in range(0, len(Value)):
if Unicode:
NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ','
else:
@@ -2164,28 +2164,28 @@ class PeImageClass():
return Value
class DefaultStore():
- def __init__(self,DefaultStores ):
+ def __init__(self, DefaultStores):
self.DefaultStores = DefaultStores
- def DefaultStoreID(self,DefaultStoreName):
- for key,value in self.DefaultStores.items():
+ def DefaultStoreID(self, DefaultStoreName):
+ for key, value in self.DefaultStores.items():
if value == DefaultStoreName:
return key
return None
def GetDefaultDefault(self):
if not self.DefaultStores or "0" in self.DefaultStores:
- return "0",TAB_DEFAULT_STORES_DEFAULT
+ return "0", TAB_DEFAULT_STORES_DEFAULT
else:
minvalue = min([int(value_str) for value_str in self.DefaultStores.keys()])
return (str(minvalue), self.DefaultStores[str(minvalue)])
- def GetMin(self,DefaultSIdList):
+ def GetMin(self, DefaultSIdList):
if not DefaultSIdList:
return "STANDARD"
storeidset = {storeid for storeid, storename in self.DefaultStores.values() if storename in DefaultSIdList}
if not storeidset:
return ""
minid = min(storeidset )
- for sid,name in self.DefaultStores.values():
+ for sid, name in self.DefaultStores.values():
if sid == minid:
return name
class SkuClass():
@@ -2200,7 +2200,7 @@ class SkuClass():
for SkuName in SkuIds:
SkuId = SkuIds[SkuName][0]
- skuid_num = int(SkuId,16) if SkuId.upper().startswith("0X") else int(SkuId)
+ skuid_num = int(SkuId, 16) if SkuId.upper().startswith("0X") else int(SkuId)
if skuid_num > 0xFFFFFFFFFFFFFFFF:
EdkLogger.error("build", PARAMETER_INVALID,
ExtraData = "SKU-ID [%s] value %s exceeds the max value of UINT64"
@@ -2249,9 +2249,9 @@ class SkuClass():
self.__SkuInherit = {}
for item in self.SkuData.values():
self.__SkuInherit[item[1]]=item[2] if item[2] else "DEFAULT"
- return self.__SkuInherit.get(skuname,"DEFAULT")
+ return self.__SkuInherit.get(skuname, "DEFAULT")
- def GetSkuChain(self,sku):
+ def GetSkuChain(self, sku):
if sku == "DEFAULT":
return ["DEFAULT"]
skulist = [sku]
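Note: the 'int(x, 16) if x.upper().startswith("0X") else int(x)' idiom appears three
times in AnalyzeDscPcd and again in SkuClass. A small shared helper would keep the base
handling in one spot; sketch only:

    def ToInteger(Value):
        # Accept both hex strings ("0x20") and decimal strings ("32"),
        # matching the existing startswith("0X") checks.
        Value = Value.strip()
        return int(Value, 16) if Value.upper().startswith("0X") else int(Value)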
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 4357f240f423..496961554e87 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -176,7 +176,7 @@ class EQOperatorObject(object):
raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
rangeContainer = RangeContainer()
- rangeContainer.push(RangeObject(int(Operand) , int(Operand)))
+ rangeContainer.push(RangeObject(int(Operand), int(Operand)))
SymbolTable[rangeId1] = rangeContainer
return rangeId1
@@ -473,7 +473,7 @@ class RangeExpression(object):
# [!]*A
def _RelExpr(self):
- if self._IsOperator(["NOT" , "LE", "GE", "LT", "GT", "EQ", "XOR"]):
+ if self._IsOperator(["NOT", "LE", "GE", "LT", "GT", "EQ", "XOR"]):
Token = self._Token
Val = self._NeExpr()
try:
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index e6c7a3b74ee1..358e7b8d7c31 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -739,7 +739,7 @@ def SplitString(String):
# @param StringList: A list for strings to be converted
#
def ConvertToSqlString(StringList):
- return map(lambda s: s.replace("'", "''") , StringList)
+ return map(lambda s: s.replace("'", "''"), StringList)
## Convert To Sql String
#
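Note: ConvertToSqlString above still relies on map() returning a list, which holds on
python2 but not on python3, where map() is a lazy iterator. A list comprehension behaves
the same on both; sketch:

    def ConvertToSqlString(StringList):
        # Returns a real list on python2 and python3 alike.
        return [s.replace("'", "''") for s in StringList]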
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 84dd7ac563dd..d59697c64b68 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -89,7 +89,7 @@ class VpdInfoFile:
#
# @param offset integer value for VPD's offset in specific SKU.
#
- def Add(self, Vpd, skuname,Offset):
+ def Add(self, Vpd, skuname, Offset):
if (Vpd == None):
EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
@@ -141,7 +141,7 @@ class VpdInfoFile:
if PcdValue == "" :
PcdValue = Pcd.DefaultValue
- Content += "%s.%s|%s|%s|%s|%s \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname,str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(),PcdValue)
+ Content += "%s.%s|%s|%s|%s|%s \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname, str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(), PcdValue)
i += 1
return SaveFileOnChange(FilePath, Content, False)
@@ -170,8 +170,8 @@ class VpdInfoFile:
# the line must follow output format defined in BPDG spec.
#
try:
- PcdName, SkuId,Offset, Size, Value = Line.split("#")[0].split("|")
- PcdName, SkuId,Offset, Size, Value = PcdName.strip(), SkuId.strip(),Offset.strip(), Size.strip(), Value.strip()
+ PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
+ PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
TokenSpaceName, PcdTokenName = PcdName.split(".")
except:
EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Fail to parse VPD information file %s" % FilePath)
@@ -180,7 +180,7 @@ class VpdInfoFile:
if (TokenSpaceName, PcdTokenName) not in self._VpdInfo:
self._VpdInfo[(TokenSpaceName, PcdTokenName)] = []
- self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId,Offset, Value))
+ self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId, Offset, Value))
for VpdObject in self._VpdArray.keys():
VpdObjectTokenCName = VpdObject.TokenCName
for PcdItem in GlobalData.MixedPcd:
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index 2df8fc3e0c26..bd4f10e1edff 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -785,10 +785,10 @@ class CParser(Parser):
if self.backtracking == 0:
if d != None:
- self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
+ self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
else:
self.function_definition_stack[-1].ModifierText = ''
- self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start,declarator1.stop)
+ self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
self.function_definition_stack[-1].DeclLine = declarator1.start.line
self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
if a != None:
@@ -922,9 +922,9 @@ class CParser(Parser):
if self.backtracking == 0:
if b != None:
- self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
+ self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
else:
- self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
+ self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
@@ -959,7 +959,7 @@ class CParser(Parser):
if self.backtracking == 0:
if t != None:
- self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
+ self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
@@ -1403,7 +1403,7 @@ class CParser(Parser):
if self.backtracking == 0:
if s.stop != None:
- self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
+ self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
@@ -1418,7 +1418,7 @@ class CParser(Parser):
if self.backtracking == 0:
if e.stop != None:
- self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -5401,7 +5401,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
+ self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
# C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
while True: #loop65
@@ -5501,7 +5501,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
+ self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
@@ -8277,7 +8277,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -16384,7 +16384,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
self.following.append(self.FOLLOW_statement_in_selection_statement2284)
self.statement()
@@ -16503,7 +16503,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -16535,7 +16535,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -16582,7 +16582,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index e04b67732141..145c7435cd12 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -562,7 +562,7 @@ class InfParser(MetaFileParser):
NmakeLine = ''
# section content
- self._ValueList = ['','','']
+ self._ValueList = ['', '', '']
# parse current line, result will be put in self._ValueList
self._SectionParser[self._SectionType](self)
if self._ValueList == None or self._ItemType == MODEL_META_DATA_DEFINE:
@@ -921,7 +921,7 @@ class DscParser(MetaFileParser):
## Directive statement parser
def _DirectiveParser(self):
- self._ValueList = ['','','']
+ self._ValueList = ['', '', '']
TokenList = GetSplitValueList(self._CurrentLine, ' ', 1)
self._ValueList[0:len(TokenList)] = TokenList
@@ -1111,7 +1111,7 @@ class DscParser(MetaFileParser):
## Override parent's method since we'll do all macro replacements in parser
def _GetMacros(self):
- Macros = dict( [('ARCH','IA32'), ('FAMILY','MSFT'),('TOOL_CHAIN_TAG','VS2008x86'),('TARGET','DEBUG')])
+ Macros = dict( [('ARCH', 'IA32'), ('FAMILY', 'MSFT'), ('TOOL_CHAIN_TAG', 'VS2008x86'), ('TARGET', 'DEBUG')])
Macros.update(self._FileLocalMacros)
Macros.update(self._GetApplicableSectionMacro())
Macros.update(GlobalData.gEdkGlobal)
@@ -1226,7 +1226,7 @@ class DscParser(MetaFileParser):
self._RawTable.Drop()
self._Table.Drop()
for Record in RecordList:
- EccGlobalData.gDb.TblDsc.Insert(Record[1],Record[2],Record[3],Record[4],Record[5],Record[6],Record[7],Record[8],Record[9],Record[10],Record[11],Record[12],Record[13],Record[14])
+ EccGlobalData.gDb.TblDsc.Insert(Record[1], Record[2], Record[3], Record[4], Record[5], Record[6], Record[7], Record[8], Record[9], Record[10], Record[11], Record[12], Record[13], Record[14])
GlobalData.gPlatformDefines.update(self._FileLocalMacros)
self._PostProcessed = True
self._Content = None
@@ -1247,7 +1247,7 @@ class DscParser(MetaFileParser):
def __RetrievePcdValue(self):
Records = self._RawTable.Query(MODEL_PCD_FEATURE_FLAG, BelongsToItem=-1.0)
- for TokenSpaceGuid,PcdName,Value,Dummy2,Dummy3,ID,Line in Records:
+ for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
# Only use PCD whose value is straitforward (no macro and PCD)
if self.SymbolPattern.findall(Value):
@@ -1572,7 +1572,7 @@ class DecParser(MetaFileParser):
continue
# section content
- self._ValueList = ['','','']
+ self._ValueList = ['', '', '']
self._SectionParser[self._SectionType[0]](self)
if self._ValueList == None or self._ItemType == MODEL_META_DATA_DEFINE:
self._ItemType = -1
@@ -1718,7 +1718,7 @@ class DecParser(MetaFileParser):
GuidValue = GuidValue.lstrip(' {')
HexList.append('0x' + str(GuidValue[2:]))
Index += 1
- self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2],HexList[3],HexList[4],HexList[5],HexList[6],HexList[7],HexList[8],HexList[9],HexList[10])
+ self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2], HexList[3], HexList[4], HexList[5], HexList[6], HexList[7], HexList[8], HexList[9], HexList[10])
else:
EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
ExtraData=self._CurrentLine + \
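Note: the eleven-element GUID format string above is easy to get out of sync with
HexList. An equivalent, shorter form joins the last eight bytes instead (sketch, not
part of this patch):

    self._ValueList[1] = "{ %s, %s, %s, { %s }}" % (
        HexList[0], HexList[1], HexList[2], ", ".join(HexList[3:11]))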
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index 2df8fc3e0c26..bd4f10e1edff 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -785,10 +785,10 @@ class CParser(Parser):
if self.backtracking == 0:
if d != None:
- self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
+ self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
else:
self.function_definition_stack[-1].ModifierText = ''
- self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start,declarator1.stop)
+ self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
self.function_definition_stack[-1].DeclLine = declarator1.start.line
self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
if a != None:
@@ -922,9 +922,9 @@ class CParser(Parser):
if self.backtracking == 0:
if b != None:
- self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
+ self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
else:
- self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
+ self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
@@ -959,7 +959,7 @@ class CParser(Parser):
if self.backtracking == 0:
if t != None:
- self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
+ self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
@@ -1403,7 +1403,7 @@ class CParser(Parser):
if self.backtracking == 0:
if s.stop != None:
- self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
+ self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
@@ -1418,7 +1418,7 @@ class CParser(Parser):
if self.backtracking == 0:
if e.stop != None:
- self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -5401,7 +5401,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
+ self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
# C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
while True: #loop65
@@ -5501,7 +5501,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
+ self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
@@ -8277,7 +8277,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -16384,7 +16384,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
self.following.append(self.FOLLOW_statement_in_selection_statement2284)
self.statement()
@@ -16503,7 +16503,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -16535,7 +16535,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
@@ -16582,7 +16582,7 @@ class CParser(Parser):
if self.failed:
return
if self.backtracking == 0:
- self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+ self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index c70f62f393a9..ceefc952237f 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -128,11 +128,11 @@ def GetIdentifierList():
for pp in FileProfile.PPDirectiveList:
Type = GetIdType(pp.Content)
- IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0],pp.StartPos[1],pp.EndPos[0],pp.EndPos[1])
+ IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
IdList.append(IdPP)
for ae in FileProfile.AssignmentExpressionList:
- IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0],ae.StartPos[1],ae.EndPos[0],ae.EndPos[1])
+ IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0], ae.StartPos[1], ae.EndPos[0], ae.EndPos[1])
IdList.append(IdAE)
FuncDeclPattern = GetFuncDeclPattern()
@@ -154,7 +154,7 @@ def GetIdentifierList():
var.Modifier += ' ' + FuncNamePartList[Index]
var.Declarator = var.Declarator.lstrip().lstrip(FuncNamePartList[Index])
Index += 1
- IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+ IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
IdList.append(IdVar)
continue
@@ -167,7 +167,7 @@ def GetIdentifierList():
var.Modifier += ' ' + Name[LSBPos:]
Name = Name[0:LSBPos]
- IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+ IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
IdList.append(IdVar)
else:
DeclList = var.Declarator.split('=')
@@ -176,7 +176,7 @@ def GetIdentifierList():
LSBPos = var.Declarator.find('[')
var.Modifier += ' ' + Name[LSBPos:]
Name = Name[0:LSBPos]
- IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+ IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
IdList.append(IdVar)
for enum in FileProfile.EnumerationDefinitionList:
@@ -184,7 +184,7 @@ def GetIdentifierList():
RBPos = enum.Content.find('}')
Name = enum.Content[4:LBPos].strip()
Value = enum.Content[LBPos+1:RBPos]
- IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0],enum.StartPos[1],enum.EndPos[0],enum.EndPos[1])
+ IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
IdList.append(IdEnum)
for su in FileProfile.StructUnionDefinitionList:
@@ -201,7 +201,7 @@ def GetIdentifierList():
else:
Name = su.Content[SkipLen:LBPos].strip()
Value = su.Content[LBPos+1:RBPos]
- IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0],su.StartPos[1],su.EndPos[0],su.EndPos[1])
+ IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
IdList.append(IdPE)
TdFuncPointerPattern = GetTypedefFuncPointerPattern()
@@ -224,11 +224,11 @@ def GetIdentifierList():
Name = TmpStr[0:RBPos]
Value = 'FP' + TmpStr[RBPos + 1:]
- IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0],td.StartPos[1],td.EndPos[0],td.EndPos[1])
+ IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
IdList.append(IdTd)
for funcCall in FileProfile.FunctionCallingList:
- IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0],funcCall.StartPos[1],funcCall.EndPos[0],funcCall.EndPos[1])
+ IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
IdList.append(IdFC)
return IdList
@@ -330,7 +330,7 @@ def GetFunctionList():
FuncDef.Modifier += ' ' + FuncNamePartList[Index]
Index += 1
- FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0],FuncDef.StartPos[1],FuncDef.EndPos[0],FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
+ FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
FuncObjList.append(FuncObj)
return FuncObjList
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 27fe2619a35f..b678079b3785 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -23,7 +23,7 @@ import FfsFileStatement
from GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import AprioriSectionClassObject
from Common.String import *
-from Common.Misc import SaveFileOnChange,PathClass
+from Common.Misc import SaveFileOnChange, PathClass
from Common import EdkLogger
from Common.BuildToolError import *
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index 5b806d9e4482..1fa202149b25 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -207,7 +207,7 @@ class CapsulePayload(CapsuleData):
#
Guid = self.ImageTypeId.split('-')
Buffer = pack('=ILHHBBBBBBBBBBBBIIQ',
- int(self.Version,16),
+ int(self.Version, 16),
int(Guid[0], 16),
int(Guid[1], 16),
int(Guid[2], 16),
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index 5029ec7a1823..d24df30cb734 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -130,7 +130,7 @@ class EfiSection (EfiSectionClassObject):
elif FileList != []:
for File in FileList:
Index = Index + 1
- Num = '%s.%d' %(SecNum , Index)
+ Num = '%s.%d' %(SecNum, Index)
OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
f = open(File, 'r')
VerString = f.read()
@@ -187,7 +187,7 @@ class EfiSection (EfiSectionClassObject):
elif FileList != []:
for File in FileList:
Index = Index + 1
- Num = '%s.%d' %(SecNum , Index)
+ Num = '%s.%d' %(SecNum, Index)
OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
f = open(File, 'r')
UiString = f.read()
@@ -228,7 +228,7 @@ class EfiSection (EfiSectionClassObject):
for File in FileList:
""" Copy Map file to FFS output path """
Index = Index + 1
- Num = '%s.%d' %(SecNum , Index)
+ Num = '%s.%d' %(SecNum, Index)
OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
File = GenFdsGlobalVariable.MacroExtend(File, Dict)
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index f735d3b5b015..21060625217e 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -136,7 +136,7 @@ class FD(FDClassObject):
# Call each region's AddToBuffer function
#
GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
- RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict,Flag=Flag)
+ RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict, Flag=Flag)
#
# Write the buffer contents to Fd file
#
@@ -162,7 +162,7 @@ class FD(FDClassObject):
if len(RegionObj.RegionDataList) == 1:
RegionData = RegionObj.RegionDataList[0]
FvList.append(RegionData.upper())
- FvAddDict[RegionData.upper()] = (int(self.BaseAddress,16) + \
+ FvAddDict[RegionData.upper()] = (int(self.BaseAddress, 16) + \
RegionObj.Offset, RegionObj.Size)
else:
Offset = RegionObj.Offset
@@ -177,7 +177,7 @@ class FD(FDClassObject):
Size = 0
for blockStatement in FvObj.BlockSizeList:
Size = Size + blockStatement[0] * blockStatement[1]
- FvAddDict[RegionData.upper()] = (int(self.BaseAddress,16) + \
+ FvAddDict[RegionData.upper()] = (int(self.BaseAddress, 16) + \
Offset, Size)
Offset = Offset + Size
#
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index d4ba485bcdff..43f849b07172 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -1855,7 +1855,7 @@ class FdfParser:
return long(
ValueExpression(Expr,
self.__CollectMacroPcd()
- )(True),0)
+ )(True), 0)
except Exception:
self.SetFileBufferPos(StartPos)
return None
@@ -2768,7 +2768,7 @@ class FdfParser:
while True:
AlignValue = None
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
#For FFS, Auto is default option same to ""
@@ -2828,7 +2828,7 @@ class FdfParser:
FfsFileObj.CheckSum = True
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
#For FFS, Auto is default option same to ""
@@ -2900,7 +2900,7 @@ class FdfParser:
AlignValue = None
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
AlignValue = self.__Token
@@ -3190,7 +3190,7 @@ class FdfParser:
AlignValue = None
if self.__GetAlignment():
- if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
AlignValue = self.__Token
@@ -3583,7 +3583,7 @@ class FdfParser:
AfileName = self.__Token
AfileBaseName = os.path.basename(AfileName)
- if os.path.splitext(AfileBaseName)[1] not in [".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"]:
+ if os.path.splitext(AfileBaseName)[1] not in [".bin", ".BIN", ".Bin", ".dat", ".DAT", ".Dat", ".data", ".DATA", ".Data"]:
raise Warning('invalid binary file type, should be one of "bin","BIN","Bin","dat","DAT","Dat","data","DATA","Data"', \
self.FileName, self.CurrentLineNumber)
@@ -3782,7 +3782,7 @@ class FdfParser:
AlignValue = ""
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
#For FFS, Auto is default option same to ""
@@ -3832,7 +3832,7 @@ class FdfParser:
SectAlignment = ""
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
@@ -3912,7 +3912,7 @@ class FdfParser:
FvImageSectionObj.FvFileType = self.__Token
if self.__GetAlignment():
- if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
FvImageSectionObj.Alignment = self.__Token
@@ -3980,7 +3980,7 @@ class FdfParser:
EfiSectionObj.BuildNum = self.__Token
if self.__GetAlignment():
- if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+ if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
"256K", "512K", "1M", "2M", "4M", "8M", "16M"):
raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
@@ -4720,7 +4720,7 @@ class FdfParser:
FvInFdList = self.__GetFvInFd(RefFdName)
if FvInFdList != []:
for FvNameInFd in FvInFdList:
- LogStr += "FD %s contains FV %s\n" % (RefFdName,FvNameInFd)
+ LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
if FvNameInFd not in RefFvStack:
RefFvStack.append(FvNameInFd)
@@ -4776,7 +4776,7 @@ class FdfParser:
CapInFdList = self.__GetCapInFd(RefFdName)
if CapInFdList != []:
for CapNameInFd in CapInFdList:
- LogStr += "FD %s contains Capsule %s\n" % (RefFdName,CapNameInFd)
+ LogStr += "FD %s contains Capsule %s\n" % (RefFdName, CapNameInFd)
if CapNameInFd not in RefCapStack:
RefCapStack.append(CapNameInFd)
@@ -4787,7 +4787,7 @@ class FdfParser:
FvInFdList = self.__GetFvInFd(RefFdName)
if FvInFdList != []:
for FvNameInFd in FvInFdList:
- LogStr += "FD %s contains FV %s\n" % (RefFdName,FvNameInFd)
+ LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
if FvNameInFd not in RefFvList:
RefFvList.append(FvNameInFd)
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index b0b242be8d71..3a781d6d3a97 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -430,7 +430,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
self.__InfParse__(Dict)
Arch = self.GetCurrentArch()
- SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir , self.InfFileName);
+ SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName);
DestFile = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
SrcFileDir = "."
@@ -676,13 +676,13 @@ class FfsInfStatement(FfsInfStatementClassObject):
Arch = self.CurrentArch
OutputPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
- Arch ,
+ Arch,
ModulePath,
FileName,
'OUTPUT'
)
DebugPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
- Arch ,
+ Arch,
ModulePath,
FileName,
'DEBUG'
@@ -944,9 +944,9 @@ class FfsInfStatement(FfsInfStatementClassObject):
Sect.FvParentAddr = FvParentAddr
if Rule.KeyStringList != []:
- SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
+ SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
else :
- SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
+ SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
if not HasGeneratedFlag:
UniVfrOffsetFileSection = ""
@@ -1124,7 +1124,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
try :
SaveFileOnChange(UniVfrOffsetFileName, fStringIO.getvalue())
except:
- EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName,None)
+ EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName, None)
fStringIO.close ()
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 615d9e39faf1..c64c0c80e299 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -386,8 +386,8 @@ class FV (FvClassObject):
# check if the file path exists or not
if not os.path.isfile(FileFullPath):
GenFdsGlobalVariable.ErrorLogger("Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
- FvExtFile = open (FileFullPath,'rb')
- FvExtFile.seek(0,2)
+ FvExtFile = open (FileFullPath, 'rb')
+ FvExtFile.seek(0, 2)
Size = FvExtFile.tell()
if Size >= 0x10000:
GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 916ff919176c..ac5d5891df70 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -64,7 +64,7 @@ class FvImageSection(FvImageSectionClassObject):
for FvFileName in FileList:
FvAlignmentValue = 0
if os.path.isfile(FvFileName):
- FvFileObj = open (FvFileName,'rb')
+ FvFileObj = open (FvFileName, 'rb')
FvFileObj.seek(0)
# PI FvHeader is 0x48 byte
FvHeaderBuffer = FvFileObj.read(0x48)
@@ -112,7 +112,7 @@ class FvImageSection(FvImageSectionClassObject):
if self.FvFileName != None:
FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
if os.path.isfile(FvFileName):
- FvFileObj = open (FvFileName,'rb')
+ FvFileObj = open (FvFileName, 'rb')
FvFileObj.seek(0)
# PI FvHeader is 0x48 byte
FvHeaderBuffer = FvFileObj.read(0x48)
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 94b8fedb233b..d7fd58c7482f 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -342,7 +342,7 @@ class GenFdsGlobalVariable:
for Arch in ArchList:
GenFdsGlobalVariable.OutputDirDict[Arch] = os.path.normpath(
os.path.join(GlobalData.gWorkspace,
- WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,GlobalData.gGlobalDefines['TARGET'],
+ WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'],
GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory,
GlobalData.gGlobalDefines['TARGET'] +'_' + GlobalData.gGlobalDefines['TOOLCHAIN']))
GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = os.path.normpath(
@@ -549,7 +549,7 @@ class GenFdsGlobalVariable:
GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
if MakefilePath:
- if (tuple(Cmd),tuple(GenFdsGlobalVariable.SecCmdList),tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict.keys():
+ if (tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict.keys():
GenFdsGlobalVariable.FfsCmdDict[tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)] = MakefilePath
GenFdsGlobalVariable.SecCmdList = []
GenFdsGlobalVariable.CopyList = []
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index 127385228fcf..dbbb4312f47e 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -109,7 +109,7 @@ def _parseForGCC(lines, efifilepath):
PcdName = m.groups(0)[0]
m = re.match('^([\da-fA-Fx]+) +([\da-fA-Fx]+)', lines[index + 1].strip())
if m != None:
- bpcds.append((PcdName, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
+ bpcds.append((PcdName, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
# get section information from efi file
efisecs = PeImageClass(efifilepath).SectionHeaderList
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index becf3e8eb9e8..1e07e23baeee 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -89,7 +89,7 @@ if __name__ == '__main__':
parser.add_argument("--signature-size", dest='SignatureSizeStr', type=str, help="specify the signature size for decode process.")
parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
- parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
+ parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0, 10)), default=0, help="set debug level")
parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
#
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 1641968ace0e..7d11758a795f 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -52,7 +52,7 @@ if __name__ == '__main__':
parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
- parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
+ parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0, 10)), default=0, help="set debug level")
#
# Parse command line arguments
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 2a19ad973b91..e5f5a38bbc49 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -51,7 +51,7 @@ EFI_HASH_ALGORITHM_SHA256_GUID = uuid.UUID('{51aa59de-fdf2-4ea3-bc63-875fb7842ee
# UINT8 Signature[256];
# } EFI_CERT_BLOCK_RSA_2048_SHA256;
#
-EFI_CERT_BLOCK_RSA_2048_SHA256 = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType','PublicKey','Signature'])
+EFI_CERT_BLOCK_RSA_2048_SHA256 = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType', 'PublicKey', 'Signature'])
EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT = struct.Struct('16s256s256s')
#
@@ -72,7 +72,7 @@ if __name__ == '__main__':
parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'), help="specify the private key filename. If not specified, a test signing key is used.")
parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
- parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
+ parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0, 10)), default=0, help="set debug level")
parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
#
@@ -156,7 +156,7 @@ if __name__ == '__main__':
PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
PublicKey = ''
while len(PublicKeyHexString) > 0:
- PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2],16))
+ PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2], 16))
PublicKeyHexString=PublicKeyHexString[2:]
if Process.returncode != 0:
sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ebed7a0ea7b8..fe74abb28901 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -59,11 +59,11 @@ class TargetTool():
def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
"""Convert a text file to a dictionary of (name:value) pairs."""
try:
- f = open(FileName,'r')
+ f = open(FileName, 'r')
for Line in f:
if Line.startswith(CommentCharacter) or Line.strip() == '':
continue
- LineList = Line.split(KeySplitCharacter,1)
+ LineList = Line.split(KeySplitCharacter, 1)
if len(LineList) >= 2:
Key = LineList[0].strip()
if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary.keys():
@@ -104,7 +104,7 @@ class TargetTool():
if Line.startswith(CommentCharacter) or Line.strip() == '':
fw.write(Line)
else:
- LineList = Line.split(KeySplitCharacter,1)
+ LineList = Line.split(KeySplitCharacter, 1)
if len(LineList) >= 2:
Key = LineList[0].strip()
if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary.keys():
@@ -203,14 +203,14 @@ def RangeCheckCallback(option, opt_str, value, parser):
parser.error("Option %s only allows one instance in command line!" % option)
def MyOptionParser():
- parser = OptionParser(version=__version__,prog="TargetTool.exe",usage=__usage__,description=__copyright__)
- parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32','X64','IPF','EBC', 'ARM', 'AARCH64','0'], dest="TARGET_ARCH",
+ parser = OptionParser(version=__version__, prog="TargetTool.exe", usage=__usage__, description=__copyright__)
+ parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32', 'X64', 'IPF', 'EBC', 'ARM', 'AARCH64', '0'], dest="TARGET_ARCH",
help="ARCHS is one of list: IA32, X64, IPF, ARM, AARCH64 or EBC, which replaces target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
parser.add_option("-p", "--platform", action="callback", type="string", dest="DSCFILE", callback=SingleCheckCallback,
help="Specify a DSC file, which replace target.txt's ACTIVE_PLATFORM definition. 0 will clear this setting in target.txt and can't combine with other value.")
parser.add_option("-c", "--tooldef", action="callback", type="string", dest="TOOL_DEFINITION_FILE", callback=SingleCheckCallback,
help="Specify the WORKSPACE relative path of tool_def.txt file, which replace target.txt's TOOL_CHAIN_CONF definition. 0 will clear this setting in target.txt and can't combine with other value.")
- parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG','RELEASE','0'], dest="TARGET",
+ parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG', 'RELEASE', '0'], dest="TARGET",
help="TARGET is one of list: DEBUG, RELEASE, which replaces target.txt's TARGET definition. To specify more TARGET, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
parser.add_option("-n", "--tagname", action="callback", type="string", dest="TOOL_CHAIN_TAG", callback=SingleCheckCallback,
help="Specify the Tool Chain Tagname, which replaces target.txt's TOOL_CHAIN_TAG definition. 0 will clear this setting in target.txt and can't combine with other value.")
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 94f6b1bc707a..af1bf9de3e00 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -261,7 +261,7 @@ def TrimPreprocessedVfr(Source, Target):
CreateDirectory(os.path.dirname(Target))
try:
- f = open (Source,'r')
+ f = open (Source, 'r')
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
# read whole file
@@ -310,7 +310,7 @@ def TrimPreprocessedVfr(Source, Target):
# save all lines trimmed
try:
- f = open (Target,'w')
+ f = open (Target, 'w')
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
f.writelines(Lines)
@@ -407,7 +407,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
if IncludePathFile:
try:
LineNum = 0
- for Line in open(IncludePathFile,'r'):
+ for Line in open(IncludePathFile, 'r'):
LineNum += 1
if Line.startswith("/I") or Line.startswith ("-I"):
IncludePathList.append(Line[2:].strip())
@@ -425,7 +425,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
# save all lines trimmed
try:
- f = open (Target,'w')
+ f = open (Target, 'w')
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
@@ -560,7 +560,7 @@ def TrimEdkSourceCode(Source, Target):
CreateDirectory(os.path.dirname(Target))
try:
- f = open (Source,'rb')
+ f = open (Source, 'rb')
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
# read whole file
@@ -568,7 +568,7 @@ def TrimEdkSourceCode(Source, Target):
f.close()
NewLines = None
- for Re,Repl in gImportCodePatterns:
+ for Re, Repl in gImportCodePatterns:
if NewLines == None:
NewLines = Re.sub(Repl, Lines)
else:
@@ -579,7 +579,7 @@ def TrimEdkSourceCode(Source, Target):
return
try:
- f = open (Target,'wb')
+ f = open (Target, 'wb')
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
f.write(NewLines)
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 3a7c9809e31a..203f973669f3 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -285,8 +285,8 @@ class DependencyRules(object):
pass
DecPath = dirname(DecFile)
if DecPath.find(WorkSP) > -1:
- InstallPath = GetRelativePath(DecPath,WorkSP)
- DecFileRelaPath = GetRelativePath(DecFile,WorkSP)
+ InstallPath = GetRelativePath(DecPath, WorkSP)
+ DecFileRelaPath = GetRelativePath(DecFile, WorkSP)
else:
InstallPath = DecPath
DecFileRelaPath = DecFile
@@ -348,8 +348,8 @@ class DependencyRules(object):
pass
DecPath = dirname(DecFile)
if DecPath.find(WorkSP) > -1:
- InstallPath = GetRelativePath(DecPath,WorkSP)
- DecFileRelaPath = GetRelativePath(DecFile,WorkSP)
+ InstallPath = GetRelativePath(DecPath, WorkSP)
+ DecFileRelaPath = GetRelativePath(DecFile, WorkSP)
else:
InstallPath = DecPath
DecFileRelaPath = DecFile
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index baf687ef99ba..44187a1ee40f 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -459,7 +459,7 @@ class IpiDatabase(object):
(select InstallPath from ModInPkgInfo where
ModInPkgInfo.PackageGuid ='%s'
and ModInPkgInfo.PackageVersion = '%s')""" \
- % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1],Pkg[0], Pkg[1])
+ % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1])
self.Cur.execute(SqlCommand)
#
@@ -921,7 +921,7 @@ class IpiDatabase(object):
def __ConvertToSqlString(self, StringList):
if self.DpTable:
pass
- return map(lambda s: s.replace("'", "''") , StringList)
+ return map(lambda s: s.replace("'", "''"), StringList)
diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Source/Python/UPT/Library/String.py
index 2f916324bd13..de3035279f01 100644
--- a/BaseTools/Source/Python/UPT/Library/String.py
+++ b/BaseTools/Source/Python/UPT/Library/String.py
@@ -633,7 +633,7 @@ def SplitString(String):
# @param StringList: A list for strings to be converted
#
def ConvertToSqlString(StringList):
- return map(lambda s: s.replace("'", "''") , StringList)
+ return map(lambda s: s.replace("'", "''"), StringList)
## Convert To Sql String
#
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 4c28b7f5d22a..1e0c79d6677d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -649,7 +649,7 @@ class DecPomAlignment(PackageObject):
ContainerFile,
(Item.TokenSpaceGuidCName, Item.TokenCName,
Item.DefaultValue, Item.DatumType, Item.TokenValue,
- Type, Item.GetHeadComment(), Item.GetTailComment(),''),
+ Type, Item.GetHeadComment(), Item.GetTailComment(), ''),
Language,
self.DecParser.GetDefineSectionMacro()
)
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 84b3c353201a..12f091dd421b 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -315,7 +315,7 @@ def Main():
GlobalData.gDB.CloseDb()
if pf.system() == 'Windows':
- os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\',''))
+ os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\', ''))
return ReturnCode
diff --git a/BaseTools/Source/Python/UPT/Xml/CommonXml.py b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
index e28aec5b9b05..498fe938aeab 100644
--- a/BaseTools/Source/Python/UPT/Xml/CommonXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
@@ -355,7 +355,7 @@ class PackageHeaderXml(object):
def FromXml(self, Item, Key, PackageObject2):
if not Item:
XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea']
- CheckDict = {'PackageHeader':None, }
+ CheckDict = {'PackageHeader': None, }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
self.PackagePath = XmlElement(Item, '%s/PackagePath' % Key)
self.Header.FromXml(Item, Key)
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
index b4d52f7bdc1f..bd7be102057a 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
@@ -104,7 +104,7 @@ class DistributionPackageXml(object):
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
else:
XmlTreeLevel = ['DistributionPackage', 'DistributionHeader']
- CheckDict = CheckDict = {'DistributionHeader':'', }
+ CheckDict = CheckDict = {'DistributionHeader': '', }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
#
@@ -124,16 +124,16 @@ class DistributionPackageXml(object):
#
if self.DistP.Tools:
XmlTreeLevel = ['DistributionPackage', 'Tools', 'Header']
- CheckDict = {'Name':self.DistP.Tools.GetName(), }
+ CheckDict = {'Name': self.DistP.Tools.GetName(), }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
if not self.DistP.Tools.GetFileList():
XmlTreeLevel = ['DistributionPackage', 'Tools']
- CheckDict = {'FileName':None, }
+ CheckDict = {'FileName': None, }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
for Item in self.DistP.Tools.GetFileList():
XmlTreeLevel = ['DistributionPackage', 'Tools']
- CheckDict = {'FileName':Item.GetURI(), }
+ CheckDict = {'FileName': Item.GetURI(), }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
#
@@ -141,16 +141,16 @@ class DistributionPackageXml(object):
#
if self.DistP.MiscellaneousFiles:
XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles', 'Header']
- CheckDict = {'Name':self.DistP.MiscellaneousFiles.GetName(), }
+ CheckDict = {'Name': self.DistP.MiscellaneousFiles.GetName(), }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
if not self.DistP.MiscellaneousFiles.GetFileList():
XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
- CheckDict = {'FileName':None, }
+ CheckDict = {'FileName': None, }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
for Item in self.DistP.MiscellaneousFiles.GetFileList():
XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
- CheckDict = {'FileName':Item.GetURI(), }
+ CheckDict = {'FileName': Item.GetURI(), }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
#
@@ -158,7 +158,7 @@ class DistributionPackageXml(object):
#
for Item in self.DistP.UserExtensions:
XmlTreeLevel = ['DistributionPackage', 'UserExtensions']
- CheckDict = {'UserId':Item.GetUserID(), }
+ CheckDict = {'UserId': Item.GetUserID(), }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
@@ -450,10 +450,10 @@ def ValidateMS1(Module, TopXmlTreeLevel):
XmlTreeLevel = TopXmlTreeLevel + ['MiscellaneousFiles']
for Item in Module.GetMiscFileList():
if not Item.GetFileList():
- CheckDict = {'Filename':'', }
+ CheckDict = {'Filename': '', }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
for File in Item.GetFileList():
- CheckDict = {'Filename':File.GetURI(), }
+ CheckDict = {'Filename': File.GetURI(), }
## ValidateMS2
#
@@ -916,10 +916,10 @@ def ValidatePS2(Package):
XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'MiscellaneousFiles']
for Item in Package.GetMiscFileList():
if not Item.GetFileList():
- CheckDict = {'Filename':'', }
+ CheckDict = {'Filename': '', }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
for File in Item.GetFileList():
- CheckDict = {'Filename':File.GetURI(), }
+ CheckDict = {'Filename': File.GetURI(), }
IsRequiredItemListNull(CheckDict, XmlTreeLevel)
## ValidatePackageSurfaceArea
diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/Source/Python/Workspace/DecBuildData.py
index 2fd3820dcc86..629df18fcbff 100644
--- a/BaseTools/Source/Python/Workspace/DecBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DecBuildData.py
@@ -365,16 +365,16 @@ class DecBuildData(PackageBuildClassObject):
def ProcessStructurePcd(self, StructurePcdRawDataSet):
s_pcd_set = dict()
- for s_pcd,LineNo in StructurePcdRawDataSet:
+ for s_pcd, LineNo in StructurePcdRawDataSet:
if s_pcd.TokenSpaceGuidCName not in s_pcd_set:
s_pcd_set[s_pcd.TokenSpaceGuidCName] = []
- s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd,LineNo))
+ s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd, LineNo))
str_pcd_set = []
for pcdname in s_pcd_set:
dep_pkgs = []
struct_pcd = StructurePcd()
- for item,LineNo in s_pcd_set[pcdname]:
+ for item, LineNo in s_pcd_set[pcdname]:
if "<HeaderFiles>" in item.TokenCName:
struct_pcd.StructuredPcdIncludeFile.append(item.DefaultValue)
elif "<Packages>" in item.TokenCName:
@@ -386,7 +386,7 @@ class DecBuildData(PackageBuildClassObject):
struct_pcd.PcdDefineLineNo = LineNo
struct_pcd.PkgPath = self.MetaFile.File
else:
- struct_pcd.AddDefaultValue(item.TokenCName, item.DefaultValue,self.MetaFile.File,LineNo)
+ struct_pcd.AddDefaultValue(item.TokenCName, item.DefaultValue, self.MetaFile.File, LineNo)
struct_pcd.PackageDecs = dep_pkgs
@@ -409,7 +409,7 @@ class DecBuildData(PackageBuildClassObject):
StrPcdSet = []
RecordList = self._RawData[Type, self._Arch]
for TokenSpaceGuid, PcdCName, Setting, Arch, PrivateFlag, Dummy1, Dummy2 in RecordList:
- PcdDict[Arch, PcdCName, TokenSpaceGuid] = (Setting,Dummy2)
+ PcdDict[Arch, PcdCName, TokenSpaceGuid] = (Setting, Dummy2)
if not (PcdCName, TokenSpaceGuid) in PcdSet:
PcdSet.append((PcdCName, TokenSpaceGuid))
@@ -418,7 +418,7 @@ class DecBuildData(PackageBuildClassObject):
# limit the ARCH to self._Arch, if no self._Arch found, tdict
# will automatically turn to 'common' ARCH and try again
#
- Setting,LineNo = PcdDict[self._Arch, PcdCName, TokenSpaceGuid]
+ Setting, LineNo = PcdDict[self._Arch, PcdCName, TokenSpaceGuid]
if Setting == None:
continue
@@ -440,7 +440,7 @@ class DecBuildData(PackageBuildClassObject):
list(expressions)
)
if "." in TokenSpaceGuid:
- StrPcdSet.append((PcdObj,LineNo))
+ StrPcdSet.append((PcdObj, LineNo))
else:
Pcds[PcdCName, TokenSpaceGuid, self._PCD_TYPE_STRING_[Type]] = PcdObj
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index e9fe533b3975..b08bdfbc4f4e 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -591,12 +591,12 @@ class DscBuildData(PlatformBuildClassObject):
File=self.MetaFile, Line=Record[-1])
self._SkuIds[Record[1].upper()] = (str(self.ToInt(Record[0])), Record[1].upper(), Record[2].upper())
if 'DEFAULT' not in self._SkuIds:
- self._SkuIds['DEFAULT'] = ("0","DEFAULT","DEFAULT")
+ self._SkuIds['DEFAULT'] = ("0", "DEFAULT", "DEFAULT")
if 'COMMON' not in self._SkuIds:
- self._SkuIds['COMMON'] = ("0","DEFAULT","DEFAULT")
+ self._SkuIds['COMMON'] = ("0", "DEFAULT", "DEFAULT")
return self._SkuIds
- def ToInt(self,intstr):
- return int(intstr,16) if intstr.upper().startswith("0X") else int(intstr)
+ def ToInt(self, intstr):
+ return int(intstr, 16) if intstr.upper().startswith("0X") else int(intstr)
def _GetDefaultStores(self):
if self.DefaultStores == None:
self.DefaultStores = sdict()
@@ -616,9 +616,9 @@ class DscBuildData(PlatformBuildClassObject):
if not IsValidWord(Record[1]):
EdkLogger.error('build', FORMAT_INVALID, "The format of the DefaultStores ID name is invalid. The correct format is '(a-zA-Z0-9_)(a-zA-Z0-9_-.)*'",
File=self.MetaFile, Line=Record[-1])
- self.DefaultStores[Record[1].upper()] = (self.ToInt(Record[0]),Record[1].upper())
+ self.DefaultStores[Record[1].upper()] = (self.ToInt(Record[0]), Record[1].upper())
if TAB_DEFAULT_STORES_DEFAULT not in self.DefaultStores:
- self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0,TAB_DEFAULT_STORES_DEFAULT)
+ self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0, TAB_DEFAULT_STORES_DEFAULT)
GlobalData.gDefaultStores = self.DefaultStores.keys()
if GlobalData.gDefaultStores:
GlobalData.gDefaultStores.sort()
@@ -678,7 +678,7 @@ class DscBuildData(PlatformBuildClassObject):
for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, \
MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX]:
RecordList = self._RawData[Type, self._Arch, None, ModuleId]
- for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+ for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
TokenList = GetSplitValueList(Setting)
DefaultValue = TokenList[0]
if len(TokenList) > 1:
@@ -702,7 +702,7 @@ class DscBuildData(PlatformBuildClassObject):
# get module private build options
RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, None, ModuleId]
- for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+ for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
if (ToolChainFamily, ToolChain) not in Module.BuildOptions:
Module.BuildOptions[ToolChainFamily, ToolChain] = Option
else:
@@ -742,7 +742,7 @@ class DscBuildData(PlatformBuildClassObject):
RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch, None, -1]
Macros = self._Macros
for Record in RecordList:
- LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy,Dummy, LineNo = Record
+ LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy, Dummy, LineNo = Record
if LibraryClass == '' or LibraryClass == 'NULL':
self._NullLibraryNumber += 1
LibraryClass = 'NULL%d' % self._NullLibraryNumber
@@ -809,7 +809,7 @@ class DscBuildData(PlatformBuildClassObject):
ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
PkgSet.update(ModuleData.Packages)
- self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain,PkgSet)
+ self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
if (PcdCName, TokenSpaceGuid) not in self._DecPcds:
@@ -854,14 +854,14 @@ class DscBuildData(PlatformBuildClassObject):
ExtraData="%s.%s" % (TokenSpaceGuid, PcdCName))
if PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip() != ValueList[1].strip():
- EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtype used in DSC file is not the same as its declaration in DEC file." , File=self.MetaFile, Line=LineNo,
+ EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtype used in DSC file is not the same as its declaration in DEC file.", File=self.MetaFile, Line=LineNo,
ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
if (TokenSpaceGuid + '.' + PcdCName) in GlobalData.gPlatformPcds:
if GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] != ValueList[Index]:
GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] = ValueList[Index]
return ValueList
- def _FilterPcdBySkuUsage(self,Pcds):
+ def _FilterPcdBySkuUsage(self, Pcds):
available_sku = self.SkuIdMgr.AvailableSkuIdSet
sku_usage = self.SkuIdMgr.SkuUsageType
if sku_usage == SkuClass.SINGLE:
@@ -877,7 +877,7 @@ class DscBuildData(PlatformBuildClassObject):
if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
return Pcds
- def CompleteHiiPcdsDefaultStores(self,Pcds):
+ def CompleteHiiPcdsDefaultStores(self, Pcds):
HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
DefaultStoreMgr = DefaultStore(self.DefaultStores)
for pcd in HiiPcd:
@@ -894,15 +894,15 @@ class DscBuildData(PlatformBuildClassObject):
if GlobalData.BuildOptionPcd:
for pcd in GlobalData.BuildOptionPcd:
if pcd[2] == "":
- pcdset.append((pcd[0],pcd[1],pcd[3]))
+ pcdset.append((pcd[0], pcd[1], pcd[3]))
else:
- pcdobj = self._Pcds.get((pcd[1],pcd[0]))
+ pcdobj = self._Pcds.get((pcd[1], pcd[0]))
if pcdobj:
- pcdset.append((pcd[0],pcd[1], pcdobj.DefaultValue))
+ pcdset.append((pcd[0], pcd[1], pcdobj.DefaultValue))
else:
- pcdset.append((pcd[0],pcd[1],pcd[3]))
+ pcdset.append((pcd[0], pcd[1], pcd[3]))
GlobalData.BuildOptionPcd = pcdset
- def GetFieldValueFromComm(self,ValueStr,TokenSpaceGuidCName, TokenCName, FieldName):
+ def GetFieldValueFromComm(self, ValueStr, TokenSpaceGuidCName, TokenCName, FieldName):
PredictedFieldType = "VOID*"
if ValueStr.startswith('L'):
if not ValueStr[1]:
@@ -941,10 +941,10 @@ class DscBuildData(PlatformBuildClassObject):
if not pcdvalue:
EdkLogger.error('build', AUTOGEN_ERROR, "No Value specified for the PCD %s." % (pcdname))
if '.' in pcdname:
- (Name1, Name2) = pcdname.split('.',1)
+ (Name1, Name2) = pcdname.split('.', 1)
if "." in Name2:
- (Name3, FieldName) = Name2.split(".",1)
- if ((Name3,Name1)) in self.DecPcds:
+ (Name3, FieldName) = Name2.split(".", 1)
+ if ((Name3, Name1)) in self.DecPcds:
HasTokenSpace = True
TokenCName = Name3
TokenSpaceGuidCName = Name1
@@ -954,7 +954,7 @@ class DscBuildData(PlatformBuildClassObject):
TokenSpaceGuidCName = ''
HasTokenSpace = False
else:
- if ((Name2,Name1)) in self.DecPcds:
+ if ((Name2, Name1)) in self.DecPcds:
HasTokenSpace = True
TokenCName = Name2
TokenSpaceGuidCName = Name1
@@ -990,7 +990,7 @@ class DscBuildData(PlatformBuildClassObject):
FoundFlag = True
if FieldName:
NewValue = self.GetFieldValueFromComm(pcdvalue, TokenSpaceGuidCName, TokenCName, FieldName)
- GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName,NewValue,("build command options",1))
+ GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName, NewValue, ("build command options", 1))
else:
for key in self.DecPcds:
PcdItem = self.DecPcds[key]
@@ -1029,7 +1029,7 @@ class DscBuildData(PlatformBuildClassObject):
AUTOGEN_ERROR,
"The Pcd %s is found under multiple different TokenSpaceGuid: %s and %s." % (TokenCName, PcdItem.TokenSpaceGuidCName, TokenSpaceGuidCNameList[0])
)
- GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName,NewValue,("build command options",1))
+ GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName, NewValue, ("build command options", 1))
if not FoundFlag:
if HasTokenSpace:
EdkLogger.error('build', AUTOGEN_ERROR, "The Pcd %s.%s is not found in the DEC file." % (TokenSpaceGuidCName, TokenCName))
@@ -1065,17 +1065,17 @@ class DscBuildData(PlatformBuildClassObject):
self.RecoverCommandLinePcd()
return self._Pcds
- def _dumpPcdInfo(self,Pcds):
+ def _dumpPcdInfo(self, Pcds):
for pcd in Pcds:
pcdobj = Pcds[pcd]
if not pcdobj.TokenCName.startswith("Test"):
continue
for skuid in pcdobj.SkuInfoList:
- if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]):
+ if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]):
for storename in pcdobj.SkuInfoList[skuid].DefaultStoreDict:
- print("PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename])))
+ print("PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid, storename, str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename])))
else:
- print("PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuInfoList[skuid].DefaultValue)))
+ print("PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid, str(pcdobj.SkuInfoList[skuid].DefaultValue)))
## Retrieve [BuildOptions]
def _GetBuildOptions(self):
if self._BuildOptions == None:
@@ -1085,7 +1085,7 @@ class DscBuildData(PlatformBuildClassObject):
#
for CodeBase in (EDKII_NAME, EDK_NAME):
RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, CodeBase]
- for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+ for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
if Dummy3.upper() != 'COMMON':
continue
CurKey = (ToolChainFamily, ToolChain, CodeBase)
@@ -1108,7 +1108,7 @@ class DscBuildData(PlatformBuildClassObject):
DriverType = '%s.%s' % (Edk, ModuleType)
CommonDriverType = '%s.%s' % ('COMMON', ModuleType)
RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch]
- for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+ for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
Type = Dummy2 + '.' + Dummy3
if Type.upper() == DriverType.upper() or Type.upper() == CommonDriverType.upper():
Key = (ToolChainFamily, ToolChain, Edk)
@@ -1122,28 +1122,28 @@ class DscBuildData(PlatformBuildClassObject):
def GetStructurePcdInfo(self, PcdSet):
structure_pcd_data = {}
for item in PcdSet:
- if (item[0],item[1]) not in structure_pcd_data:
- structure_pcd_data[(item[0],item[1])] = []
- structure_pcd_data[(item[0],item[1])].append(item)
+ if (item[0], item[1]) not in structure_pcd_data:
+ structure_pcd_data[(item[0], item[1])] = []
+ structure_pcd_data[(item[0], item[1])].append(item)
return structure_pcd_data
- def OverrideByFdfComm(self,StruPcds):
- StructurePcdInCom = {(item[0],item[1],item[2] ):(item[3],item[4]) for item in GlobalData.BuildOptionPcd if len(item) == 5 and (item[1],item[0]) in StruPcds } if GlobalData.BuildOptionPcd else {}
- GlobalPcds = set([(item[0],item[1]) for item in StructurePcdInCom.keys()])
+ def OverrideByFdfComm(self, StruPcds):
+ StructurePcdInCom = {(item[0], item[1], item[2] ):(item[3], item[4]) for item in GlobalData.BuildOptionPcd if len(item) == 5 and (item[1], item[0]) in StruPcds } if GlobalData.BuildOptionPcd else {}
+ GlobalPcds = set([(item[0], item[1]) for item in StructurePcdInCom.keys()])
for Pcd in StruPcds.values():
- if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) not in GlobalPcds:
+ if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in GlobalPcds:
continue
- FieldValues = {item[2]:StructurePcdInCom[item] for item in StructurePcdInCom if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) == (item[0],item[1]) and item[2]}
+ FieldValues = {item[2]:StructurePcdInCom[item] for item in StructurePcdInCom if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) == (item[0], item[1]) and item[2]}
for sku in Pcd.SkuOverrideValues:
for defaultstore in Pcd.SkuOverrideValues[sku]:
for field in FieldValues:
if field not in Pcd.SkuOverrideValues[sku][defaultstore]:
- Pcd.SkuOverrideValues[sku][defaultstore][field] = ["","",""]
+ Pcd.SkuOverrideValues[sku][defaultstore][field] = ["", "", ""]
Pcd.SkuOverrideValues[sku][defaultstore][field][0] = FieldValues[field][0]
Pcd.SkuOverrideValues[sku][defaultstore][field][1] = FieldValues[field][1][0]
Pcd.SkuOverrideValues[sku][defaultstore][field][2] = FieldValues[field][1][1]
return StruPcds
- def OverrideByFdfCommOverAll(self,AllPcds):
+ def OverrideByFdfCommOverAll(self, AllPcds):
def CheckStructureInComm(commpcds):
if not commpcds:
return False
@@ -1152,29 +1152,29 @@ class DscBuildData(PlatformBuildClassObject):
return False
if CheckStructureInComm(GlobalData.BuildOptionPcd):
- StructurePcdInCom = {(item[0],item[1],item[2] ):(item[3],item[4]) for item in GlobalData.BuildOptionPcd } if GlobalData.BuildOptionPcd else {}
- NoFiledValues = {(item[0],item[1]):StructurePcdInCom[item] for item in StructurePcdInCom if not item[2]}
+ StructurePcdInCom = {(item[0], item[1], item[2] ):(item[3], item[4]) for item in GlobalData.BuildOptionPcd } if GlobalData.BuildOptionPcd else {}
+ NoFiledValues = {(item[0], item[1]):StructurePcdInCom[item] for item in StructurePcdInCom if not item[2]}
else:
- NoFiledValues = {(item[0],item[1]):[item[2]] for item in GlobalData.BuildOptionPcd}
- for Guid,Name in NoFiledValues:
- if (Name,Guid) in AllPcds:
- Pcd = AllPcds.get((Name,Guid))
- Pcd.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+ NoFiledValues = {(item[0], item[1]):[item[2]] for item in GlobalData.BuildOptionPcd}
+ for Guid, Name in NoFiledValues:
+ if (Name, Guid) in AllPcds:
+ Pcd = AllPcds.get((Name, Guid))
+ Pcd.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
for sku in Pcd.SkuInfoList:
SkuInfo = Pcd.SkuInfoList[sku]
if SkuInfo.DefaultValue:
- SkuInfo.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+ SkuInfo.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
else:
- SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+ SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
for defaultstore in SkuInfo.DefaultStoreDict:
- SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+ SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
else:
- PcdInDec = self.DecPcds.get((Name,Guid))
+ PcdInDec = self.DecPcds.get((Name, Guid))
if PcdInDec:
if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
- self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid,Name)][0]
+ self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
return AllPcds
def UpdateStructuredPcds(self, TypeList, AllPcds):
@@ -1198,7 +1198,7 @@ class DscBuildData(PlatformBuildClassObject):
for Type in TypeList:
RecordList.extend(self._RawData[Type, self._Arch])
- for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_store, Dummy4,Dummy5 in RecordList:
+ for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_store, Dummy4, Dummy5 in RecordList:
SkuName = SkuName.upper()
default_store = default_store.upper()
SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
@@ -1206,7 +1206,7 @@ class DscBuildData(PlatformBuildClassObject):
continue
if SkuName in SkuIds and "." in TokenSpaceGuid:
- S_PcdSet.append([ TokenSpaceGuid.split(".")[0],TokenSpaceGuid.split(".")[1], PcdCName,SkuName, default_store,Dummy5, AnalyzePcdExpression(Setting)[0]])
+ S_PcdSet.append([ TokenSpaceGuid.split(".")[0], TokenSpaceGuid.split(".")[1], PcdCName, SkuName, default_store, Dummy5, AnalyzePcdExpression(Setting)[0]])
# handle pcd value override
StrPcdSet = self.GetStructurePcdInfo(S_PcdSet)
@@ -1217,7 +1217,7 @@ class DscBuildData(PlatformBuildClassObject):
if not isinstance (str_pcd_dec, StructurePcd):
EdkLogger.error('build', PARSER_ERROR,
"Pcd (%s.%s) is not declared as Structure PCD in DEC files. Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch),
- File=self.MetaFile,Line = StrPcdSet[str_pcd][0][5])
+ File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
if str_pcd_dec:
str_pcd_obj_str = StructurePcd()
str_pcd_obj_str.copy(str_pcd_dec)
@@ -1226,12 +1226,12 @@ class DscBuildData(PlatformBuildClassObject):
str_pcd_obj_str.DefaultFromDSC = str_pcd_obj_str.DefaultValue
for str_pcd_data in StrPcdSet[str_pcd]:
if str_pcd_data[3] in SkuIds:
- str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], str(str_pcd_data[6]), 'DEFAULT' if str_pcd_data[3] == 'COMMON' else str_pcd_data[3],'STANDARD' if str_pcd_data[4] == 'COMMON' else str_pcd_data[4], self.MetaFile.File,LineNo=str_pcd_data[5])
+ str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], str(str_pcd_data[6]), 'DEFAULT' if str_pcd_data[3] == 'COMMON' else str_pcd_data[3], 'STANDARD' if str_pcd_data[4] == 'COMMON' else str_pcd_data[4], self.MetaFile.File, LineNo=str_pcd_data[5])
S_pcd_set[str_pcd[1], str_pcd[0]] = str_pcd_obj_str
else:
EdkLogger.error('build', PARSER_ERROR,
"Pcd (%s.%s) defined in DSC is not declared in DEC files. Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch),
- File=self.MetaFile,Line = StrPcdSet[str_pcd][0][5])
+ File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
# Add the Structure PCD that only defined in DEC, don't have override in DSC file
for Pcd in self.DecPcds:
if type (self._DecPcds[Pcd]) is StructurePcd:
@@ -1279,7 +1279,7 @@ class DscBuildData(PlatformBuildClassObject):
S_pcd_set = self.OverrideByFdfComm(S_pcd_set)
Str_Pcd_Values = self.GenerateByteArrayValue(S_pcd_set)
if Str_Pcd_Values:
- for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Values:
+ for (skuname, StoreName, PcdGuid, PcdName, PcdValue) in Str_Pcd_Values:
str_pcd_obj = S_pcd_set.get((PcdName, PcdGuid))
if str_pcd_obj is None:
print(PcdName, PcdGuid)
@@ -1331,7 +1331,7 @@ class DscBuildData(PlatformBuildClassObject):
elif 'DEFAULT' in pcd.SkuInfoList.keys() and 'COMMON' in pcd.SkuInfoList.keys():
del(pcd.SkuInfoList['COMMON'])
- map(self.FilterSkuSettings,[Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType])
+ map(self.FilterSkuSettings, [Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType])
return Pcds
## Retrieve non-dynamic PCD settings
@@ -1353,7 +1353,7 @@ class DscBuildData(PlatformBuildClassObject):
# Find out all possible PCD candidates for self._Arch
RecordList = self._RawData[Type, self._Arch]
PcdValueDict = sdict()
- for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+ for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
SkuName = SkuName.upper()
SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
if SkuName not in AvailableSkuIdSet:
@@ -1404,7 +1404,7 @@ class DscBuildData(PlatformBuildClassObject):
return Pcds
- def __UNICODE2OCTList(self,Value):
+ def __UNICODE2OCTList(self, Value):
Value = Value.strip()
Value = Value[2:-1]
List = []
@@ -1415,7 +1415,7 @@ class DscBuildData(PlatformBuildClassObject):
List.append('0x00')
List.append('0x00')
return List
- def __STRING2OCTList(self,Value):
+ def __STRING2OCTList(self, Value):
OCTList = []
Value = Value.strip('"')
for char in Value:
@@ -1502,7 +1502,7 @@ class DscBuildData(PlatformBuildClassObject):
CApp = CApp + '\n'
if SkuName in Pcd.SkuInfoList:
- DefaultValue = Pcd.SkuInfoList[SkuName].DefaultStoreDict.get(DefaultStoreName,Pcd.SkuInfoList[SkuName].HiiDefaultValue) if Pcd.SkuInfoList[SkuName].HiiDefaultValue else Pcd.SkuInfoList[SkuName].DefaultValue
+ DefaultValue = Pcd.SkuInfoList[SkuName].DefaultStoreDict.get(DefaultStoreName, Pcd.SkuInfoList[SkuName].HiiDefaultValue) if Pcd.SkuInfoList[SkuName].HiiDefaultValue else Pcd.SkuInfoList[SkuName].DefaultValue
else:
DefaultValue = Pcd.DefaultValue
PcdDefaultValue = StringToArray(DefaultValue.strip())
@@ -1593,7 +1593,7 @@ class DscBuildData(PlatformBuildClassObject):
try:
Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
except Exception:
- EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2]))
+ EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
if isinstance(Value, str):
CApp = CApp + ' Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
elif IsArray:
@@ -1610,7 +1610,7 @@ class DscBuildData(PlatformBuildClassObject):
CApp = CApp + ' Pcd->%s = %d; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
for skuname in self.SkuIdMgr.GetSkuChain(SkuName):
inherit_OverrideValues = Pcd.SkuOverrideValues[skuname]
- for FieldList in [Pcd.DefaultFromDSC,inherit_OverrideValues.get(DefaultStoreName)]:
+ for FieldList in [Pcd.DefaultFromDSC, inherit_OverrideValues.get(DefaultStoreName)]:
if not FieldList:
continue
if Pcd.DefaultFromDSC and FieldList == Pcd.DefaultFromDSC:
@@ -1631,7 +1631,7 @@ class DscBuildData(PlatformBuildClassObject):
try:
Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
except Exception:
- EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2]))
+ EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
if isinstance(Value, str):
CApp = CApp + ' Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
elif IsArray:
@@ -1834,7 +1834,7 @@ class DscBuildData(PlatformBuildClassObject):
if FileLine.isdigit():
error_line = FileData[int (FileLine) - 1]
if r"//" in error_line:
- c_line,dsc_line = error_line.split(r"//")
+ c_line, dsc_line = error_line.split(r"//")
else:
dsc_line = error_line
message_itmes = Message.split(":")
@@ -1874,7 +1874,7 @@ class DscBuildData(PlatformBuildClassObject):
for Pcd in FileBuffer:
PcdValue = Pcd.split ('|')
PcdInfo = PcdValue[0].split ('.')
- StructurePcdSet.append((PcdInfo[0],PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
+ StructurePcdSet.append((PcdInfo[0], PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
return StructurePcdSet
## Retrieve dynamic PCD settings
@@ -1898,7 +1898,7 @@ class DscBuildData(PlatformBuildClassObject):
AvailableSkuIdSet = copy.copy(self.SkuIds)
- for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+ for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
SkuName = SkuName.upper()
SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
if SkuName not in AvailableSkuIdSet:
@@ -1960,7 +1960,7 @@ class DscBuildData(PlatformBuildClassObject):
elif 'DEFAULT' in pcd.SkuInfoList.keys() and 'COMMON' in pcd.SkuInfoList.keys():
del(pcd.SkuInfoList['COMMON'])
- map(self.FilterSkuSettings,Pcds.values())
+ map(self.FilterSkuSettings, Pcds.values())
return Pcds
@@ -1990,10 +1990,10 @@ class DscBuildData(PlatformBuildClassObject):
return True
else:
return False
- def CompletePcdValues(self,PcdSet):
+ def CompletePcdValues(self, PcdSet):
Pcds = {}
DefaultStoreObj = DefaultStore(self._GetDefaultStores())
- SkuIds = {skuname:skuid for skuname,skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname !='COMMON'}
+ SkuIds = {skuname:skuid for skuname, skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname !='COMMON'}
DefaultStores = set([storename for pcdobj in PcdSet.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict.keys()])
for PcdCName, TokenSpaceGuid in PcdSet:
PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
@@ -2014,7 +2014,7 @@ class DscBuildData(PlatformBuildClassObject):
if defaultstorename not in skuobj.DefaultStoreDict:
skuobj.DefaultStoreDict[defaultstorename] = copy.deepcopy(skuobj.DefaultStoreDict[mindefaultstorename])
skuobj.HiiDefaultValue = skuobj.DefaultStoreDict[mindefaultstorename]
- for skuname,skuid in SkuIds.items():
+ for skuname, skuid in SkuIds.items():
if skuname not in PcdObj.SkuInfoList:
nextskuid = self.SkuIdMgr.GetNextSkuId(skuname)
while nextskuid not in PcdObj.SkuInfoList:
@@ -2048,7 +2048,7 @@ class DscBuildData(PlatformBuildClassObject):
AvailableSkuIdSet = copy.copy(self.SkuIds)
DefaultStoresDefine = self._GetDefaultStores()
- for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore, Dummy4,Dummy5 in RecordList:
+ for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore, Dummy4, Dummy5 in RecordList:
SkuName = SkuName.upper()
SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
DefaultStore = DefaultStore.upper()
@@ -2061,14 +2061,14 @@ class DscBuildData(PlatformBuildClassObject):
EdkLogger.error('build', PARAMETER_INVALID, 'DefaultStores %s is not defined in [DefaultStores] section' % DefaultStore,
File=self.MetaFile, Line=Dummy5)
if "." not in TokenSpaceGuid:
- PcdSet.add((PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy5))
- PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore] = Setting
+ PcdSet.add((PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy5))
+ PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore] = Setting
# Remove redundant PCD candidates, per the ARCH and SKU
- for PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy4 in PcdSet:
+ for PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4 in PcdSet:
- Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore]
+ Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore]
if Setting == None:
continue
VariableName, VariableGuid, VariableOffset, DefaultValue, VarAttribute = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
@@ -2112,10 +2112,10 @@ class DscBuildData(PlatformBuildClassObject):
Skuitem = pcdObject.SkuInfoList[SkuName]
Skuitem.DefaultStoreDict.update({DefaultStore:DefaultValue})
else:
- SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute,DefaultStore={DefaultStore:DefaultValue})
+ SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
pcdObject.SkuInfoList[SkuName] = SkuInfo
else:
- SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute,DefaultStore={DefaultStore:DefaultValue})
+ SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
Pcds[PcdCName, TokenSpaceGuid] = PcdClassObject(
PcdCName,
TokenSpaceGuid,
@@ -2142,7 +2142,7 @@ class DscBuildData(PlatformBuildClassObject):
sku.HiiDefaultValue = pcdDecObject.DefaultValue
if 'DEFAULT' not in pcd.SkuInfoList.keys() and 'COMMON' not in pcd.SkuInfoList.keys():
valuefromDec = pcdDecObject.DefaultValue
- SkuInfo = SkuInfoClass('DEFAULT', '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec,VariableAttribute=SkuInfoObj.VariableAttribute,DefaultStore={DefaultStore:valuefromDec})
+ SkuInfo = SkuInfoClass('DEFAULT', '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec, VariableAttribute=SkuInfoObj.VariableAttribute, DefaultStore={DefaultStore:valuefromDec})
pcd.SkuInfoList['DEFAULT'] = SkuInfo
elif 'DEFAULT' not in pcd.SkuInfoList.keys() and 'COMMON' in pcd.SkuInfoList.keys():
pcd.SkuInfoList['DEFAULT'] = pcd.SkuInfoList['COMMON']
@@ -2170,19 +2170,19 @@ class DscBuildData(PlatformBuildClassObject):
invalidpcd = ",".join(invalidhii)
EdkLogger.error('build', PCD_VARIABLE_INFO_ERROR, Message='The same HII PCD must map to the same EFI variable for all SKUs', File=self.MetaFile, ExtraData=invalidpcd)
- map(self.FilterSkuSettings,Pcds.values())
+ map(self.FilterSkuSettings, Pcds.values())
return Pcds
- def CheckVariableNameAssignment(self,Pcds):
+ def CheckVariableNameAssignment(self, Pcds):
invalidhii = []
for pcdname in Pcds:
pcd = Pcds[pcdname]
- varnameset = set([sku.VariableName for (skuid,sku) in pcd.SkuInfoList.items()])
+ varnameset = set([sku.VariableName for (skuid, sku) in pcd.SkuInfoList.items()])
if len(varnameset) > 1:
- invalidhii.append(".".join((pcdname[1],pcdname[0])))
+ invalidhii.append(".".join((pcdname[1], pcdname[0])))
if len(invalidhii):
- return False,invalidhii
+ return False, invalidhii
else:
return True, []
## Retrieve dynamic VPD PCD settings
@@ -2206,7 +2206,7 @@ class DscBuildData(PlatformBuildClassObject):
RecordList = self._RawData[Type, self._Arch]
AvailableSkuIdSet = copy.copy(self.SkuIds)
- for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+ for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
SkuName = SkuName.upper()
SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
if SkuName not in AvailableSkuIdSet:
@@ -2273,7 +2273,7 @@ class DscBuildData(PlatformBuildClassObject):
del(pcd.SkuInfoList['COMMON'])
- map(self.FilterSkuSettings,Pcds.values())
+ map(self.FilterSkuSettings, Pcds.values())
return Pcds
## Add external modules
@@ -2338,7 +2338,7 @@ class DscBuildData(PlatformBuildClassObject):
continue
ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
PkgSet.update(ModuleData.Packages)
- self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain,PkgSet)
+ self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
return self._DecPcds
_Macros = property(_GetMacros)
Arch = property(_GetArch, _SetArch)
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 4ad60498488b..8ceedf5aec78 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -299,7 +299,7 @@ class MetaFileParser(object):
for Item in GetSplitValueList(self._CurrentLine[1:-1], TAB_COMMA_SPLIT):
if Item == '':
continue
- ItemList = GetSplitValueList(Item, TAB_SPLIT,3)
+ ItemList = GetSplitValueList(Item, TAB_SPLIT, 3)
# different section should not mix in one section
if self._SectionName != '' and self._SectionName != ItemList[0].upper():
EdkLogger.error('Parser', FORMAT_INVALID, "Different section names in the same section",
@@ -417,7 +417,7 @@ class MetaFileParser(object):
## Construct section Macro dict
def _ConstructSectionMacroDict(self, Name, Value):
- ScopeKey = [(Scope[0], Scope[1],Scope[2]) for Scope in self._Scope]
+ ScopeKey = [(Scope[0], Scope[1], Scope[2]) for Scope in self._Scope]
ScopeKey = tuple(ScopeKey)
SectionDictKey = self._SectionType, ScopeKey
#
@@ -449,20 +449,20 @@ class MetaFileParser(object):
continue
for ActiveScope in self._Scope:
- Scope0, Scope1 ,Scope2= ActiveScope[0], ActiveScope[1],ActiveScope[2]
- if(Scope0, Scope1,Scope2) not in Scope:
+ Scope0, Scope1, Scope2= ActiveScope[0], ActiveScope[1], ActiveScope[2]
+ if(Scope0, Scope1, Scope2) not in Scope:
break
else:
SpeSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
for ActiveScope in self._Scope:
- Scope0, Scope1,Scope2 = ActiveScope[0], ActiveScope[1],ActiveScope[2]
- if(Scope0, Scope1,Scope2) not in Scope and (Scope0, "COMMON","COMMON") not in Scope and ("COMMON", Scope1,"COMMON") not in Scope:
+ Scope0, Scope1, Scope2 = ActiveScope[0], ActiveScope[1], ActiveScope[2]
+ if(Scope0, Scope1, Scope2) not in Scope and (Scope0, "COMMON", "COMMON") not in Scope and ("COMMON", Scope1, "COMMON") not in Scope:
break
else:
ComSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
- if ("COMMON", "COMMON","COMMON") in Scope:
+ if ("COMMON", "COMMON", "COMMON") in Scope:
ComComMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
Macros.update(ComComMacroDict)
@@ -634,7 +634,7 @@ class InfParser(MetaFileParser):
# Model, Value1, Value2, Value3, Arch, Platform, BelongsToItem=-1,
# LineBegin=-1, ColumnBegin=-1, LineEnd=-1, ColumnEnd=-1, Enabled=-1
#
- for Arch, Platform,_ in self._Scope:
+ for Arch, Platform, _ in self._Scope:
LastItem = self._Store(self._SectionType,
self._ValueList[0],
self._ValueList[1],
@@ -944,7 +944,7 @@ class DscParser(MetaFileParser):
self._DirectiveParser()
continue
if Line[0] == TAB_OPTION_START and not self._InSubsection:
- EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1),ExtraData=self.MetaFile)
+ EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1), ExtraData=self.MetaFile)
if self._InSubsection:
SectionType = self._SubsectionType
@@ -1024,7 +1024,7 @@ class DscParser(MetaFileParser):
ExtraData=self._CurrentLine)
ItemType = self.DataType[DirectiveName]
- Scope = [['COMMON', 'COMMON','COMMON']]
+ Scope = [['COMMON', 'COMMON', 'COMMON']]
if ItemType == MODEL_META_DATA_INCLUDE:
Scope = self._Scope
if ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF:
@@ -1099,7 +1099,7 @@ class DscParser(MetaFileParser):
@ParseMacro
def _SkuIdParser(self):
TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
- if len(TokenList) not in (2,3):
+ if len(TokenList) not in (2, 3):
EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Number>|<UiName>[|<UiName>]'",
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
self._ValueList[0:len(TokenList)] = TokenList
@@ -1159,7 +1159,7 @@ class DscParser(MetaFileParser):
# Validate the datum type of Dynamic Defaul PCD and DynamicEx Default PCD
ValueList = GetSplitValueList(self._ValueList[2])
- if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64] \
+ if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64] \
and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s' of PCD is wrong" % ValueList[1],
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
@@ -1167,7 +1167,7 @@ class DscParser(MetaFileParser):
# Validate the VariableName of DynamicHii and DynamicExHii for PCD Entry must not be an empty string
if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII]:
DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
- if len(DscPcdValueList[0].replace('L','').replace('"','').strip()) == 0:
+ if len(DscPcdValueList[0].replace('L', '').replace('"', '').strip()) == 0:
EdkLogger.error('Parser', FORMAT_INVALID, "The VariableName field in the HII format PCD entry must not be an empty string",
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
@@ -1296,7 +1296,7 @@ class DscParser(MetaFileParser):
self._ContentIndex = 0
self._InSubsection = False
while self._ContentIndex < len(self._Content) :
- Id, self._ItemType, V1, V2, V3, S1, S2, S3,Owner, self._From, \
+ Id, self._ItemType, V1, V2, V3, S1, S2, S3, Owner, self._From, \
LineStart, ColStart, LineEnd, ColEnd, Enabled = self._Content[self._ContentIndex]
if self._From < 0:
@@ -1314,8 +1314,8 @@ class DscParser(MetaFileParser):
break
Record = self._Content[self._ContentIndex]
if LineStart == Record[10] and LineEnd == Record[12]:
- if [Record[5], Record[6],Record[7]] not in self._Scope:
- self._Scope.append([Record[5], Record[6],Record[7]])
+ if [Record[5], Record[6], Record[7]] not in self._Scope:
+ self._Scope.append([Record[5], Record[6], Record[7]])
self._ContentIndex += 1
else:
break
@@ -1404,7 +1404,7 @@ class DscParser(MetaFileParser):
MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_EX_HII,
MODEL_PCD_DYNAMIC_EX_VPD):
Records = self._RawTable.Query(PcdType, BelongsToItem= -1.0)
- for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4,ID, Line in Records:
+ for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4, ID, Line in Records:
Name = TokenSpaceGuid + '.' + PcdName
if Name not in GlobalData.gPlatformOtherPcds:
PcdLine = Line
@@ -1778,7 +1778,7 @@ class DecParser(MetaFileParser):
if self._DefinesCount > 1:
EdkLogger.error('Parser', FORMAT_INVALID, 'Multiple [Defines] section is exist.', self.MetaFile )
if self._DefinesCount == 0:
- EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.',self.MetaFile)
+ EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.', self.MetaFile)
self._Done()
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index 92fcf6dd2b22..9416065b284f 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -258,8 +258,8 @@ class PackageTable(MetaFileTable):
ValidType = "@ValidList"
if oricomment.startswith("@Expression"):
ValidType = "@Expression"
- EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s of PCD %s.%s is incorrect" % (ValidType,TokenSpaceGuid, PcdCName),
- ExtraData=oricomment,File=self.MetaFile, Line=LineNum)
+ EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s of PCD %s.%s is incorrect" % (ValidType, TokenSpaceGuid, PcdCName),
+ ExtraData=oricomment, File=self.MetaFile, Line=LineNum)
return set(), set(), set()
return set(validateranges), set(validlists), set(expressions)
## Python class representation of table storing platform data
@@ -308,7 +308,7 @@ class PlatformTable(MetaFileTable):
#
def Insert(self, Model, Value1, Value2, Value3, Scope1='COMMON', Scope2='COMMON', Scope3=TAB_DEFAULT_STORES_DEFAULT,BelongsToItem=-1,
FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
- (Value1, Value2, Value3, Scope1, Scope2,Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2,Scope3))
+ (Value1, Value2, Value3, Scope1, Scope2, Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2, Scope3))
return Table.Insert(
self,
Model,
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index c760e57b8f64..6b5e0edb0a4d 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -45,7 +45,7 @@ def GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain):
# @retval: A dictionary contains instances of PcdClassObject with key (PcdCName, TokenSpaceGuid)
# @retval: A dictionary contains real GUIDs of TokenSpaceGuid
#
-def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain,additionalPkgs):
+def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalPkgs):
PkgList = GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain)
PkgList = set(PkgList)
PkgList |= additionalPkgs
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index e71c0abc25b9..aa357e4ed62b 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1213,16 +1213,16 @@ class PcdReport(object):
else:
if IsByteArray:
if self.SkuSingle:
- FileWrite(File, ' %-*s : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', "{"))
+ FileWrite(File, ' %-*s : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', "{"))
else:
- FileWrite(File, ' %-*s : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
+ FileWrite(File, ' %-*s : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
for Array in ArrayList:
FileWrite(File, '%s' % (Array))
else:
if self.SkuSingle:
- FileWrite(File, ' %-*s : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', Value))
+ FileWrite(File, ' %-*s : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', Value))
else:
- FileWrite(File, ' %-*s : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+ FileWrite(File, ' %-*s : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
if TypeName in ('DYNVPD', 'DEXVPD'):
FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.VpdOffset))
if IsStructure:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index c6a37ab1d9a3..6fbaad4c0fb6 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -838,7 +838,7 @@ class Build():
self.HashSkipModules = []
self.Db_Flag = False
self.LaunchPrebuildFlag = False
- self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory,'.cache', '.PlatformBuild')
+ self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory, '.cache', '.PlatformBuild')
if BuildOptions.CommandLength:
GlobalData.gCommandMaxLength = BuildOptions.CommandLength
@@ -1131,7 +1131,7 @@ class Build():
# and preserve them for the rest of the main build step, because the child process environment will
# evaporate as soon as it exits, we cannot get it in build step.
#
- PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory,'.cache','.PrebuildEnv')
+ PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory, '.cache', '.PrebuildEnv')
if os.path.isfile(PrebuildEnvFile):
os.remove(PrebuildEnvFile)
if os.path.isfile(self.PlatformBuildPath):
@@ -1171,7 +1171,7 @@ class Build():
f = open(PrebuildEnvFile)
envs = f.readlines()
f.close()
- envs = itertools.imap(lambda l: l.split('=',1), envs)
+ envs = itertools.imap(lambda l: l.split('=', 1), envs)
envs = itertools.ifilter(lambda l: len(l) == 2, envs)
envs = itertools.imap(lambda l: [i.strip() for i in l], envs)
os.environ.update(dict(envs))
@@ -2352,7 +2352,7 @@ def MyOptionParser():
Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
Parser.add_option("-y", "--report-file", action="store", dest="ReportFile", help="Create/overwrite the report to the specified filename.")
- Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD','LIBRARY','FLASH','DEPEX','BUILD_FLAGS','FIXED_ADDRESS','HASH','EXECUTION_ORDER'], dest="ReportType", default=[],
+ Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED_ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest="ReportType", default=[],
help="Flags that control the type of build report to generate. Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER]. "\
"To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
Parser.add_option("-F", "--flag", action="store", type="string", dest="Flag",
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 1cf2ce13be2b..1eafecefbacd 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -161,7 +161,7 @@ class BaseToolsTest(unittest.TestCase):
if minlen is None: minlen = 1024
if maxlen is None: maxlen = minlen
return ''.join(
- [chr(random.randint(0,255))
+ [chr(random.randint(0, 255))
for x in range(random.randint(minlen, maxlen))
])
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 49ff656c066f..3bf524123d0f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -187,7 +187,7 @@ class Config:
return path
def MakeDirs(self):
- for path in (self.src_dir, self.build_dir,self.prefix, self.symlinks):
+ for path in (self.src_dir, self.build_dir, self.prefix, self.symlinks):
if not os.path.exists(path):
os.makedirs(path)
--
2.16.1
* [PATCH v2 12/20] BaseTools: Migrate to the new octal literal
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (10 preceding siblings ...)
2018-02-01 8:35 ` [PATCH v2 11/20] BaseTools: Adjust the spaces around commas and colons Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 13/20] BaseTools: Unify long int and int in python scripts Gary Lin
` (8 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Change the octal literals to the new 0o-prefixed syntax defined in PEP 3127:
https://www.python.org/dev/peps/pep-3127/
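A minimal sketch of what the new syntax means in practice (a hypothetical helper, not code from BaseTools; the old 0777 spelling is a syntax error on python3, while 0o777 is accepted by python 2.6+ and python3 alike):

  import os

  def makedirs_compat(name, mode=0o777):
      # 0o777 is the same integer (511) as the old-style 0777 literal
      return os.makedirs(name, mode)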
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/Common/LongFilePathOs.py | 2 +-
BaseTools/Source/Python/UPT/Core/FileHook.py | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/BaseTools/Source/Python/Common/LongFilePathOs.py b/BaseTools/Source/Python/Common/LongFilePathOs.py
index 2e530f9dd774..47d63faeb995 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOs.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOs.py
@@ -33,7 +33,7 @@ def rmdir(path):
def mkdir(path):
return os.mkdir(LongFilePath(path))
-def makedirs(name, mode=0777):
+def makedirs(name, mode=0o777):
return os.makedirs(LongFilePath(name), mode)
def rename(old, new):
diff --git a/BaseTools/Source/Python/UPT/Core/FileHook.py b/BaseTools/Source/Python/UPT/Core/FileHook.py
index d8736a872366..67e86f4f7454 100644
--- a/BaseTools/Source/Python/UPT/Core/FileHook.py
+++ b/BaseTools/Source/Python/UPT/Core/FileHook.py
@@ -166,7 +166,7 @@ def _hookrm(path):
else:
__built_in_remove__(path)
-def _hookmkdir(path, mode=0777):
+def _hookmkdir(path, mode=0o777):
if GlobalData.gRECOVERMGR:
GlobalData.gRECOVERMGR.bkmkdir(path, mode)
else:
--
2.16.1
* [PATCH v2 13/20] BaseTools: Unify long int and int in python scripts
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (11 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 12/20] BaseTools: Migrate to the new octal literal Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 14/20] BaseTools: Adjust old python2 idioms Gary Lin
` (7 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
According to PEP 237, long int and plain int are unified into a single int type, so the long() constructor and the 'L' suffix are not available in python3.
https://www.python.org/dev/peps/pep-0237/
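A small illustrative snippet (not from the patch) showing why int() is now enough: python3's single int type already handles arbitrary precision:

  value = int("0x1FFFFFFFFFFFFFFFF", 16)  # wider than 64 bits
  print(type(value).__name__)             # "int" on python3 (was "long" on python2)
  print(value.bit_length())               # 65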
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/Common/Expression.py | 5 ++---
1 file changed, 2 insertions(+), 3 deletions(-)
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 90ef92a14f41..af5baeb2f5e1 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -179,7 +179,6 @@ class ValueExpression(object):
Oprand2 = IntToStr(Oprand2)
TypeDict = {
type(0) : 0,
- type(0L) : 0,
type('') : 1,
type(True) : 2
}
@@ -795,7 +794,7 @@ class ValueExpressionEx(ValueExpression):
raise BadExpression('Type %s PCD Value Size is Larger than 8 byte' % self.PcdType)
else:
try:
- TmpValue = long(PcdValue)
+ TmpValue = int(PcdValue)
TmpList = []
if TmpValue.bit_length() == 0:
PcdValue = '{0x00}'
@@ -825,7 +824,7 @@ class ValueExpressionEx(ValueExpression):
else:
ListItem = PcdValue.split(',')
- if type(ListItem) == type(0) or type(ListItem) == type(0L):
+ if type(ListItem) == type(0):
for Index in range(0, Size):
ValueStr += '0x%02X' % (int(ListItem) & 255)
ListItem >>= 8
--
2.16.1
* [PATCH v2 14/20] BaseTools: Adjust old python2 idioms
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (12 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 13/20] BaseTools: Unify long int and int in python scripts Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 15/20] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
` (6 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Based on "futurize -f lib2to3.fixes.fix_idioms"
* Change some type comparisons to isinstance() calls:
type(x) == T -> isinstance(x, T)
type(x) is T -> isinstance(x, T)
type(x) != T -> not isinstance(x, T)
type(x) is not T -> not isinstance(x, T)
* Change "while 1:" into "while True:".
* Change both
v = list(EXPR)
v.sort()
foo(v)
and the more general
v = EXPR
v.sort()
foo(v)
into
v = sorted(EXPR)
foo(v)
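A hedged before/after sketch of the three idioms; the function names below are made up for illustration and do not appear in the patch:

  # type(x) == T  ->  isinstance(x, T)
  def is_text(value):
      return isinstance(value, str)

  # "while 1:"  ->  "while True:"
  def first_blank_index(lines):
      index = 0
      while True:
          if index >= len(lines) or not lines[index].strip():
              return index
          index += 1

  # v = EXPR; v.sort(); foo(v)  ->  foo(sorted(EXPR))
  def print_sorted_keys(mapping):
      for key in sorted(mapping):
          print(key)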
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Scripts/MemoryProfileSymbolGen.py | 2 +-
BaseTools/Scripts/UpdateBuildVersions.py | 6 +--
BaseTools/Source/Python/AutoGen/AutoGen.py | 3 +-
BaseTools/Source/Python/AutoGen/BuildEngine.py | 2 +-
BaseTools/Source/Python/AutoGen/GenDepex.py | 2 +-
BaseTools/Source/Python/Common/Dictionary.py | 2 +-
BaseTools/Source/Python/Common/Expression.py | 46 ++++++++++----------
BaseTools/Source/Python/Common/Misc.py | 13 +++---
BaseTools/Source/Python/Common/RangeExpression.py | 16 +++----
BaseTools/Source/Python/Common/String.py | 4 +-
BaseTools/Source/Python/Common/TargetTxtClassObject.py | 2 +-
BaseTools/Source/Python/Common/ToolDefClassObject.py | 2 +-
BaseTools/Source/Python/Common/VpdInfoFile.py | 3 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 12 ++---
BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 2 +-
BaseTools/Source/Python/Eot/Parser.py | 2 +-
BaseTools/Source/Python/GenFds/GenFds.py | 4 +-
BaseTools/Source/Python/TargetTool/TargetTool.py | 2 +-
BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py | 15 +++----
BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py | 21 +++------
BaseTools/Source/Python/UPT/Library/Misc.py | 6 +--
BaseTools/Source/Python/UPT/Library/ParserValidate.py | 2 +-
BaseTools/Source/Python/UPT/Library/String.py | 2 +-
BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py | 2 +-
BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 3 +-
BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py | 3 +-
BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py | 3 +-
BaseTools/Source/Python/Workspace/BuildClassObject.py | 2 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 6 +--
BaseTools/Source/Python/Workspace/MetaFileParser.py | 18 ++++----
BaseTools/Source/Python/build/BuildReport.py | 3 +-
BaseTools/Source/Python/build/build.py | 4 +-
BaseTools/gcc/mingw-gcc-build.py | 4 +-
33 files changed, 100 insertions(+), 119 deletions(-)
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index c9158800668d..b98f6dccea08 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -263,7 +263,7 @@ def main():
return 1
try:
- while 1:
+ while True:
line = file.readline()
if not line:
break
diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index cff2e2263a8a..5725be57562f 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -253,7 +253,7 @@ def GetSvnRevision(opts):
StatusCmd = "svn st -v --depth infinity --non-interactive"
contents = ShellCommandResults(StatusCmd, opts)
os.chdir(Cwd)
- if type(contents) is ListType:
+ if isinstance(contents, ListType):
for line in contents:
if line.startswith("M "):
Modified = True
@@ -263,7 +263,7 @@ def GetSvnRevision(opts):
InfoCmd = "svn info %s" % SrcPath.replace("\\", "/").strip()
Revision = 0
contents = ShellCommandResults(InfoCmd, opts)
- if type(contents) is IntType:
+ if isinstance(contents, IntType):
return 0, Modified
for line in contents:
line = line.strip()
@@ -284,7 +284,7 @@ def CheckSvn(opts):
VerCmd = "svn --version"
contents = ShellCommandResults(VerCmd, opts)
opts.silent = OriginalSilent
- if type(contents) is IntType:
+ if isinstance(contents, IntType):
if opts.verbose:
sys.stdout.write("SVN does not appear to be available.\n")
sys.stdout.flush()
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 0017f66e5ec8..5af16e6d68d7 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1608,8 +1608,7 @@ class PlatformAutoGen(AutoGen):
PcdNvStoreDfBuffer.SkuInfoList[skuname].DefaultValue = vardump
PcdNvStoreDfBuffer.MaxDatumSize = str(len(vardump.split(",")))
- PlatformPcds = self._PlatformPcds.keys()
- PlatformPcds.sort()
+ PlatformPcds = sorted(self._PlatformPcds.keys())
#
# Add VPD type PCD into VpdFile and determine whether the VPD PCD need to be fixed up.
#
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index e8f6788cdc40..6daff7210a37 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -80,7 +80,7 @@ class TargetDescBlock(object):
return hash(self.Target.Path)
def __eq__(self, Other):
- if type(Other) == type(self):
+ if isinstance(Other, type(self)):
return Other.Target.Path == self.Target.Path
else:
return str(Other) == self.Target.Path
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 98a43db7a4e5..0f6a1700f541 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -143,7 +143,7 @@ class DependencyExpression:
def __init__(self, Expression, ModuleType, Optimize=False):
self.ModuleType = ModuleType
self.Phase = gType2Phase[ModuleType]
- if type(Expression) == type([]):
+ if isinstance(Expression, type([])):
self.ExpressionString = " ".join(Expression)
self.TokenList = Expression
else:
diff --git a/BaseTools/Source/Python/Common/Dictionary.py b/BaseTools/Source/Python/Common/Dictionary.py
index 5f2cc8f31ffa..c381995f97ff 100644
--- a/BaseTools/Source/Python/Common/Dictionary.py
+++ b/BaseTools/Source/Python/Common/Dictionary.py
@@ -69,7 +69,7 @@ def printDict(Dict):
# @param key: The key of the item to be printed
#
def printList(Key, List):
- if type(List) == type([]):
+ if isinstance(List, type([])):
if len(List) > 0:
if Key.find(TAB_SPLIT) != -1:
print("\n" + Key)
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index af5baeb2f5e1..3b5afdc9ab06 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -159,23 +159,23 @@ class ValueExpression(object):
def Eval(Operator, Oprand1, Oprand2 = None):
WrnExp = None
- if Operator not in ["in", "not in"] and (type(Oprand1) == type('') or type(Oprand2) == type('')):
- if type(Oprand1) == type(''):
+ if Operator not in ["in", "not in"] and (isinstance(Oprand1, type('')) or isinstance(Oprand2, type(''))):
+ if isinstance(Oprand1, type('')):
if Oprand1[0] in ['"', "'"] or Oprand1.startswith('L"') or Oprand1.startswith("L'")or Oprand1.startswith('UINT'):
Oprand1, Size = ParseFieldValue(Oprand1)
else:
Oprand1, Size = ParseFieldValue('"' + Oprand1 + '"')
- if type(Oprand2) == type(''):
+ if isinstance(Oprand2, type('')):
if Oprand2[0] in ['"', "'"] or Oprand2.startswith('L"') or Oprand2.startswith("L'") or Oprand2.startswith('UINT'):
Oprand2, Size = ParseFieldValue(Oprand2)
else:
Oprand2, Size = ParseFieldValue('"' + Oprand2 + '"')
- if type(Oprand1) == type('') or type(Oprand2) == type(''):
+ if isinstance(Oprand1, type('')) or isinstance(Oprand2, type('')):
raise BadExpression(ERR_STRING_EXPR % Operator)
if Operator in ['in', 'not in']:
- if type(Oprand1) != type(''):
+ if not isinstance(Oprand1, type('')):
Oprand1 = IntToStr(Oprand1)
- if type(Oprand2) != type(''):
+ if not isinstance(Oprand2, type('')):
Oprand2 = IntToStr(Oprand2)
TypeDict = {
type(0) : 0,
@@ -185,18 +185,18 @@ class ValueExpression(object):
EvalStr = ''
if Operator in ["!", "NOT", "not"]:
- if type(Oprand1) == type(''):
+ if isinstance(Oprand1, type('')):
raise BadExpression(ERR_STRING_EXPR % Operator)
EvalStr = 'not Oprand1'
elif Operator in ["~"]:
- if type(Oprand1) == type(''):
+ if isinstance(Oprand1, type('')):
raise BadExpression(ERR_STRING_EXPR % Operator)
EvalStr = '~ Oprand1'
else:
if Operator in ["+", "-"] and (type(True) in [type(Oprand1), type(Oprand2)]):
# Boolean in '+'/'-' will be evaluated but raise warning
WrnExp = WrnExpression(WRN_BOOL_EXPR)
- elif type('') in [type(Oprand1), type(Oprand2)] and type(Oprand1)!= type(Oprand2):
+ elif type('') in [type(Oprand1), type(Oprand2)] and not isinstance(Oprand1, type(Oprand2)):
# == between string and number/boolean will always return False, != return True
if Operator == "==":
WrnExp = WrnExpression(WRN_EQCMP_STR_OTHERS)
@@ -217,11 +217,11 @@ class ValueExpression(object):
pass
else:
raise BadExpression(ERR_EXPR_TYPE)
- if type(Oprand1) == type('') and type(Oprand2) == type(''):
+ if isinstance(Oprand1, type('')) and isinstance(Oprand2, type('')):
if (Oprand1.startswith('L"') and not Oprand2.startswith('L"')) or \
(not Oprand1.startswith('L"') and Oprand2.startswith('L"')):
raise BadExpression(ERR_STRING_CMP % (Oprand1, Operator, Oprand2))
- if 'in' in Operator and type(Oprand2) == type(''):
+ if 'in' in Operator and isinstance(Oprand2, type('')):
Oprand2 = Oprand2.split()
EvalStr = 'Oprand1 ' + Operator + ' Oprand2'
@@ -248,7 +248,7 @@ class ValueExpression(object):
def __init__(self, Expression, SymbolTable={}):
self._NoProcess = False
- if type(Expression) != type(''):
+ if not isinstance(Expression, type('')):
self._Expr = Expression
self._NoProcess = True
return
@@ -296,7 +296,7 @@ class ValueExpression(object):
Token = self._GetToken()
except BadExpression:
pass
- if type(Token) == type('') and Token.startswith('{') and Token.endswith('}') and self._Idx >= self._Len:
+ if isinstance(Token, type('')) and Token.startswith('{') and Token.endswith('}') and self._Idx >= self._Len:
if len(Token) != len(self._Expr.replace(' ', '')):
raise BadExpression
return self._Expr
@@ -306,7 +306,7 @@ class ValueExpression(object):
Val = self._ConExpr()
RealVal = Val
- if type(Val) == type(''):
+ if isinstance(Val, type('')):
if Val == 'L""':
Val = False
elif not Val:
@@ -554,7 +554,7 @@ class ValueExpression(object):
Ex.Pcd = self._Token
raise Ex
self._Token = ValueExpression(self._Symb[self._Token], self._Symb)(True, self._Depth+1)
- if type(self._Token) != type(''):
+ if not isinstance(self._Token, type('')):
self._LiteralToken = hex(self._Token)
return
@@ -657,7 +657,7 @@ class ValueExpression(object):
if Ch == ')':
TmpValue = self._Expr[Idx :self._Idx - 1]
TmpValue = ValueExpression(TmpValue)(True)
- TmpValue = '0x%x' % int(TmpValue) if type(TmpValue) != type('') else TmpValue
+ TmpValue = '0x%x' % int(TmpValue) if not isinstance(TmpValue, type('')) else TmpValue
break
self._Token, Size = ParseFieldValue(Prefix + '(' + TmpValue + ')')
return self._Token
@@ -750,9 +750,9 @@ class ValueExpressionEx(ValueExpression):
except BadExpression:
if self.PcdType in ['UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN']:
PcdValue = PcdValue.strip()
- if type(PcdValue) == type('') and PcdValue.startswith('{') and PcdValue.endswith('}'):
+ if isinstance(PcdValue, type('')) and PcdValue.startswith('{') and PcdValue.endswith('}'):
PcdValue = PcdValue[1:-1].split(',')
- if type(PcdValue) == type([]):
+ if isinstance(PcdValue, type([])):
TmpValue = 0
Size = 0
for Item in PcdValue:
@@ -771,14 +771,14 @@ class ValueExpressionEx(ValueExpression):
else:
ItemValue = ParseFieldValue(Item)[0]
- if type(ItemValue) == type(''):
+ if isinstance(ItemValue, type('')):
ItemValue = int(ItemValue, 16) if ItemValue.startswith('0x') else int(ItemValue)
TmpValue = (ItemValue << (Size * 8)) | TmpValue
Size = Size + ItemSize
else:
TmpValue, Size = ParseFieldValue(PcdValue)
- if type(TmpValue) == type(''):
+ if isinstance(TmpValue, type('')):
TmpValue = int(TmpValue)
else:
PcdValue = '0x%0{}X'.format(Size) % (TmpValue)
@@ -824,13 +824,13 @@ class ValueExpressionEx(ValueExpression):
else:
ListItem = PcdValue.split(',')
- if type(ListItem) == type(0):
+ if isinstance(ListItem, type(0)):
for Index in range(0, Size):
ValueStr += '0x%02X' % (int(ListItem) & 255)
ListItem >>= 8
ValueStr += ', '
PcdValue = '{' + ValueStr[:-2] + '}'
- elif type(ListItem) == type(''):
+ elif isinstance(ListItem, type('')):
if ListItem.startswith('{') and ListItem.endswith('}'):
PcdValue = ListItem
else:
@@ -875,7 +875,7 @@ class ValueExpressionEx(ValueExpression):
TmpValue = ValueExpressionEx(Item, ValueType, self._Symb)(True)
else:
TmpValue = ValueExpressionEx(Item, self.PcdType, self._Symb)(True)
- Item = '0x%x' % TmpValue if type(TmpValue) != type('') else TmpValue
+ Item = '0x%x' % TmpValue if not isinstance(TmpValue, type('')) else TmpValue
if ItemSize == 0:
ItemValue, ItemSize = ParseFieldValue(Item)
else:
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 10cb95559822..1a7418734cf8 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1508,9 +1508,9 @@ def ParseDevPathValue (Value):
return '{' + out + '}', Size
def ParseFieldValue (Value):
- if type(Value) == type(0):
+ if isinstance(Value, type(0)):
return Value, (Value.bit_length() + 7) / 8
- if type(Value) != type(''):
+ if not isinstance(Value, type('')):
raise BadExpression('Type %s is %s' %(Value, type(Value)))
Value = Value.strip()
if Value.startswith('UINT8') and Value.endswith(')'):
@@ -1834,8 +1834,7 @@ def CheckPcdDatum(Type, Value):
Printset.add(TAB_PRINTCHAR_BS)
Printset.add(TAB_PRINTCHAR_NUL)
if not set(Value).issubset(Printset):
- PrintList = list(Printset)
- PrintList.sort()
+ PrintList = sorted(Printset)
return False, "Invalid PCD string value of type [%s]; must be printable chars %s." % (Type, PrintList)
elif Type == 'BOOLEAN':
if Value not in ['TRUE', 'True', 'true', '0x1', '0x01', '1', 'FALSE', 'False', 'false', '0x0', '0x00', '0']:
@@ -1997,7 +1996,7 @@ class PathClass(object):
# @retval True The two PathClass are the same
#
def __eq__(self, Other):
- if type(Other) == type(self):
+ if isinstance(Other, type(self)):
return self.Path == Other.Path
else:
return self.Path == str(Other)
@@ -2010,7 +2009,7 @@ class PathClass(object):
# @retval -1 The first PathClass is less than the second PathClass
# @retval 1 The first PathClass is Bigger than the second PathClass
def __cmp__(self, Other):
- if type(Other) == type(self):
+ if isinstance(Other, type(self)):
OtherKey = Other.Path
else:
OtherKey = str(Other)
@@ -2256,7 +2255,7 @@ class SkuClass():
return ["DEFAULT"]
skulist = [sku]
nextsku = sku
- while 1:
+ while True:
nextsku = self.GetNextSkuId(nextsku)
skulist.append(nextsku)
if nextsku == "DEFAULT":
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 496961554e87..1bf3adab1e1d 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -106,7 +106,7 @@ class XOROperatorObject(object):
def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
- if type(Operand) == type('') and not Operand.isalnum():
+ if isinstance(Operand, type('')) and not Operand.isalnum():
Expr = "XOR ..."
raise BadExpression(ERR_SNYTAX % Expr)
rangeId = str(uuid.uuid1())
@@ -120,7 +120,7 @@ class LEOperatorObject(object):
def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
- if type(Operand) == type('') and not Operand.isalnum():
+ if isinstance(Operand, type('')) and not Operand.isalnum():
Expr = "LE ..."
raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
@@ -132,7 +132,7 @@ class LTOperatorObject(object):
def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
- if type(Operand) == type('') and not Operand.isalnum():
+ if isinstance(Operand, type('')) and not Operand.isalnum():
Expr = "LT ..."
raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
@@ -145,7 +145,7 @@ class GEOperatorObject(object):
def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
- if type(Operand) == type('') and not Operand.isalnum():
+ if isinstance(Operand, type('')) and not Operand.isalnum():
Expr = "GE ..."
raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
@@ -158,7 +158,7 @@ class GTOperatorObject(object):
def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
- if type(Operand) == type('') and not Operand.isalnum():
+ if isinstance(Operand, type('')) and not Operand.isalnum():
Expr = "GT ..."
raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
@@ -171,7 +171,7 @@ class EQOperatorObject(object):
def __init__(self):
pass
def Calculate(self, Operand, DataType, SymbolTable):
- if type(Operand) == type('') and not Operand.isalnum():
+ if isinstance(Operand, type('')) and not Operand.isalnum():
Expr = "EQ ..."
raise BadExpression(ERR_SNYTAX % Expr)
rangeId1 = str(uuid.uuid1())
@@ -370,7 +370,7 @@ class RangeExpression(object):
def __init__(self, Expression, PcdDataType, SymbolTable = {}):
self._NoProcess = False
- if type(Expression) != type(''):
+ if not isinstance(Expression, type('')):
self._Expr = Expression
self._NoProcess = True
return
@@ -591,7 +591,7 @@ class RangeExpression(object):
Ex.Pcd = self._Token
raise Ex
self._Token = RangeExpression(self._Symb[self._Token], self._Symb)(True, self._Depth + 1)
- if type(self._Token) != type(''):
+ if not isinstance(self._Token, type('')):
self._LiteralToken = hex(self._Token)
return
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index 358e7b8d7c31..d2ec46d84eb8 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -246,7 +246,7 @@ def SplitModuleType(Key):
def ReplaceMacros(StringList, MacroDefinitions={}, SelfReplacement=False):
NewList = []
for String in StringList:
- if type(String) == type(''):
+ if isinstance(String, type('')):
NewList.append(ReplaceMacro(String, MacroDefinitions, SelfReplacement))
else:
NewList.append(String)
@@ -782,7 +782,7 @@ def RemoveBlockComment(Lines):
# Get String of a List
#
def GetStringOfList(List, Split=' '):
- if type(List) != type([]):
+ if not isinstance(List, type([])):
return List
Str = ''
for Item in List:
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 3408cff8d75e..9c1e6b407356 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -159,7 +159,7 @@ class TargetTxtClassObject(object):
# @param key: The key of the item to be printed
#
def printList(Key, List):
- if type(List) == type([]):
+ if isinstance(List, type([])):
if len(List) > 0:
if Key.find(TAB_SPLIT) != -1:
print("\n" + Key)
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 6dab179efc01..d3587b171192 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -155,7 +155,7 @@ class ToolDefClassObject(object):
if ErrorCode != 0:
EdkLogger.error("tools_def.txt parser", FILE_NOT_FOUND, ExtraData=IncFile)
- if type(IncFileTmp) is PathClass:
+ if isinstance(IncFileTmp, PathClass):
IncFile = IncFileTmp.Path
else:
IncFile = IncFileTmp
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index d59697c64b68..96d906ae2b3a 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -128,8 +128,7 @@ class VpdInfoFile:
"Invalid parameter FilePath: %s." % FilePath)
Content = FILE_COMMENT_TEMPLATE
- Pcds = self._VpdArray.keys()
- Pcds.sort()
+ Pcds = sorted(self._VpdArray.keys())
for Pcd in Pcds:
i = 0
PcdTokenCName = Pcd.TokenCName
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 145c7435cd12..605a1d847c61 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -69,7 +69,7 @@ def ParseMacro(Parser):
self._ItemType = MODEL_META_DATA_DEFINE
# DEFINE defined macros
if Type == TAB_DSC_DEFINES_DEFINE:
- if type(self) == DecParser:
+ if isinstance(self, DecParser):
if MODEL_META_DATA_HEADER in self._SectionType:
self._FileLocalMacros[Name] = Value
else:
@@ -84,7 +84,7 @@ def ParseMacro(Parser):
SectionLocalMacros = self._SectionsMacroDict[SectionDictKey]
SectionLocalMacros[Name] = Value
# EDK_GLOBAL defined macros
- elif type(self) != DscParser:
+ elif not isinstance(self, DscParser):
EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL can only be used in .dsc file",
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
elif self._SectionType != MODEL_META_DATA_HEADER:
@@ -216,7 +216,7 @@ class MetaFileParser(object):
# DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
#
def __getitem__(self, DataInfo):
- if type(DataInfo) != type(()):
+ if not isinstance(DataInfo, type(())):
DataInfo = (DataInfo,)
# Parse the file first, if necessary
@@ -258,7 +258,7 @@ class MetaFileParser(object):
TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
self._ValueList[0:len(TokenList)] = TokenList
# Don't do macro replacement for dsc file at this point
- if type(self) != DscParser:
+ if not isinstance(self, DscParser):
Macros = self._Macros
self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
@@ -356,7 +356,7 @@ class MetaFileParser(object):
if os.path.exists(UniFile):
self._UniObj = UniParser(UniFile, IsExtraUni=False, IsModuleUni=False)
- if type(self) == InfParser and self._Version < 0x00010005:
+ if isinstance(self, InfParser) and self._Version < 0x00010005:
# EDK module allows using defines as macros
self._FileLocalMacros[Name] = Value
self._Defines[Name] = Value
@@ -371,7 +371,7 @@ class MetaFileParser(object):
self._ValueList[1] = TokenList2[1] # keys
else:
self._ValueList[1] = TokenList[0]
- if len(TokenList) == 2 and type(self) != DscParser: # value
+ if len(TokenList) == 2 and not isinstance(self, DscParser): # value
self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
if self._ValueList[1].count('_') != 4:
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index eb76f4e6d54a..313fad602841 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -35,7 +35,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
Element.appendChild(Doc.createTextNode(String))
for Item in NodeList:
- if type(Item) == type([]):
+ if isinstance(Item, type([])):
Key = Item[0]
Value = Item[1]
if Key != '' and Key != None and Value != '' and Value != None:
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index ab19e30b69aa..951fe7e3be2e 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -731,7 +731,7 @@ def GetParameter(Parameter, Index = 1):
# @return: The name of parameter
#
def GetParameterName(Parameter):
- if type(Parameter) == type('') and Parameter.startswith('&'):
+ if isinstance(Parameter, type('')) and Parameter.startswith('&'):
return Parameter[1:].replace('{', '').replace('}', '').replace('\r', '').replace('\n', '').strip()
else:
return Parameter.strip()
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 0aadbbd080b3..cb8dabbe038d 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -372,7 +372,7 @@ def CheckBuildOptionPcd():
for Arch in GenFdsGlobalVariable.ArchList:
PkgList = GenFdsGlobalVariable.WorkSpace.GetPackageList(GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag)
for i, pcd in enumerate(GlobalData.BuildOptionPcd):
- if type(pcd) is tuple:
+ if isinstance(pcd, tuple):
continue
(pcdname, pcdvalue) = pcd.split('=')
if not pcdvalue:
@@ -840,7 +840,7 @@ class GenFds :
if not Name:
continue
- Name = ' '.join(Name) if type(Name) == type([]) else Name
+ Name = ' '.join(Name) if isinstance(Name, type([])) else Name
GuidXRefFile.write("%s %s\n" %(FileStatementGuid, Name))
# Append GUIDs, Protocols, and PPIs to the Xref file
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index fe74abb28901..2b6124dd4579 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -84,7 +84,7 @@ class TargetTool():
KeyList = self.TargetTxtDictionary.keys()
errMsg = ''
for Key in KeyList:
- if type(self.TargetTxtDictionary[Key]) == type([]):
+ if isinstance(self.TargetTxtDictionary[Key], type([])):
print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
elif self.TargetTxtDictionary[Key] == None:
errMsg += " Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
index d39c1827ba26..53d7b2b19b52 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
@@ -123,8 +123,7 @@ def GenPcd(Package, Content):
if Pcd.GetSupModuleList():
Statement += GenDecTailComment(Pcd.GetSupModuleList())
- ArchList = Pcd.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Pcd.GetSupArchList())
SortedArch = ' '.join(ArchList)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = \
@@ -205,8 +204,7 @@ def GenGuidProtocolPpi(Package, Content):
#
if Guid.GetSupModuleList():
Statement += GenDecTailComment(Guid.GetSupModuleList())
- ArchList = Guid.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Guid.GetSupArchList())
SortedArch = ' '.join(ArchList)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = \
@@ -246,8 +244,7 @@ def GenGuidProtocolPpi(Package, Content):
#
if Protocol.GetSupModuleList():
Statement += GenDecTailComment(Protocol.GetSupModuleList())
- ArchList = Protocol.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Protocol.GetSupArchList())
SortedArch = ' '.join(ArchList)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = \
@@ -287,8 +284,7 @@ def GenGuidProtocolPpi(Package, Content):
#
if Ppi.GetSupModuleList():
Statement += GenDecTailComment(Ppi.GetSupModuleList())
- ArchList = Ppi.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Ppi.GetSupArchList())
SortedArch = ' '.join(ArchList)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = \
@@ -463,8 +459,7 @@ def PackageToDec(Package, DistHeader = None):
if LibraryClass.GetSupModuleList():
Statement += \
GenDecTailComment(LibraryClass.GetSupModuleList())
- ArchList = LibraryClass.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(LibraryClass.GetSupArchList())
SortedArch = ' '.join(ArchList)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = \
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index 4a9528b500f2..4dcdcff4f13a 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -494,8 +494,7 @@ def GenPackages(ModuleObject):
Statement += RelaPath.replace('\\', '/')
if FFE:
Statement += '|' + FFE
- ArchList = PackageDependency.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(PackageDependency.GetSupArchList())
SortedArch = ' '.join(ArchList)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [Statement]
@@ -514,8 +513,7 @@ def GenSources(ModuleObject):
SourceFile = Source.GetSourceFile()
Family = Source.GetFamily()
FeatureFlag = Source.GetFeatureFlag()
- SupArchList = Source.GetSupArchList()
- SupArchList.sort()
+ SupArchList = sorted(Source.GetSupArchList())
SortedArch = ' '.join(SupArchList)
Statement = GenSourceStatement(ConvertPath(SourceFile), Family, FeatureFlag)
if SortedArch in NewSectionDict:
@@ -723,8 +721,7 @@ def GenGuidSections(GuidObjList):
#
# merge duplicate items
#
- ArchList = Guid.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Guid.GetSupArchList())
SortedArch = ' '.join(ArchList)
if (Statement, SortedArch) in GuidDict:
PreviousComment = GuidDict[Statement, SortedArch]
@@ -783,8 +780,7 @@ def GenProtocolPPiSections(ObjList, IsProtocol):
#
# merge duplicate items
#
- ArchList = Object.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Object.GetSupArchList())
SortedArch = ' '.join(ArchList)
if (Statement, SortedArch) in Dict:
PreviousComment = Dict[Statement, SortedArch]
@@ -858,8 +854,7 @@ def GenPcdSections(ModuleObject):
#
# Merge duplicate entries
#
- ArchList = Pcd.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(Pcd.GetSupArchList())
SortedArch = ' '.join(ArchList)
if (Statement, SortedArch) in Dict:
PreviousComment = Dict[Statement, SortedArch]
@@ -1026,8 +1021,7 @@ def GenSpecialSections(ObjectList, SectionName, UserExtensionsContent=''):
if CommentStr and not CommentStr.endswith('\n#\n'):
CommentStr = CommentStr + '#\n'
NewStateMent = CommentStr + Statement
- SupArch = Obj.GetSupArchList()
- SupArch.sort()
+ SupArch = sorted(Obj.GetSupArchList())
SortedArch = ' '.join(SupArch)
if SortedArch in NewSectionDict:
NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [NewStateMent]
@@ -1105,8 +1099,7 @@ def GenBinaries(ModuleObject):
FileName = ConvertPath(FileNameObj.GetFilename())
FileType = FileNameObj.GetFileType()
FFE = FileNameObj.GetFeatureFlag()
- ArchList = FileNameObj.GetSupArchList()
- ArchList.sort()
+ ArchList = sorted(FileNameObj.GetSupArchList())
SortedArch = ' '.join(ArchList)
Key = (FileName, FileType, FFE, SortedArch)
if Key in BinariesDict:
diff --git a/BaseTools/Source/Python/UPT/Library/Misc.py b/BaseTools/Source/Python/UPT/Library/Misc.py
index 24e0a20daf87..936db991cdf5 100644
--- a/BaseTools/Source/Python/UPT/Library/Misc.py
+++ b/BaseTools/Source/Python/UPT/Library/Misc.py
@@ -515,7 +515,7 @@ class PathClass(object):
# Check whether PathClass are the same
#
def __eq__(self, Other):
- if type(Other) == type(self):
+ if isinstance(Other, type(self)):
return self.Path == Other.Path
else:
return self.Path == str(Other)
@@ -820,11 +820,11 @@ def ConvertArchList(ArchList):
if not ArchList:
return NewArchList
- if type(ArchList) == list:
+ if isinstance(ArchList, list):
for Arch in ArchList:
Arch = Arch.upper()
NewArchList.append(Arch)
- elif type(ArchList) == str:
+ elif isinstance(ArchList, str):
ArchList = ArchList.upper()
NewArchList.append(ArchList)
diff --git a/BaseTools/Source/Python/UPT/Library/ParserValidate.py b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
index 028cf9a54f84..5348073b56ba 100644
--- a/BaseTools/Source/Python/UPT/Library/ParserValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
@@ -341,7 +341,7 @@ def IsValidCFormatGuid(Guid):
#
# Index may out of bound
#
- if type(List[Index]) != type(1) or \
+ if not isinstance(List[Index], type(1)) or \
len(Value) > List[Index] or len(Value) < 3:
return False
diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Source/Python/UPT/Library/String.py
index de3035279f01..e6cab4650373 100644
--- a/BaseTools/Source/Python/UPT/Library/String.py
+++ b/BaseTools/Source/Python/UPT/Library/String.py
@@ -652,7 +652,7 @@ def ConvertToSqlString2(String):
# @param Split: split character
#
def GetStringOfList(List, Split=' '):
- if type(List) != type([]):
+ if not isinstance(List, type([])):
return List
Str = ''
for Item in List:
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index fd02efb6bf04..05fe3b547326 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -40,7 +40,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
Element.appendChild(Doc.createTextNode(String))
for Item in NodeList:
- if type(Item) == type([]):
+ if isinstance(Item, type([])):
Key = Item[0]
Value = Item[1]
if Key != '' and Key != None and Value != '' and Value != None:
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 1e0c79d6677d..bcc5d96f9153 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -410,8 +410,7 @@ class DecPomAlignment(PackageObject):
#
PackagePath = os.path.split(self.GetFullPath())[0]
IncludePathList = \
- [os.path.normpath(Path) + sep for Path in IncludesDict.keys()]
- IncludePathList.sort()
+ sorted([os.path.normpath(Path) + sep for Path in IncludesDict.keys()])
#
# get a non-overlap set of include path, IncludePathList should be
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
index a15173285345..c0e4805a3f15 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
@@ -614,8 +614,7 @@ class InfPomAlignment(ModuleObject):
SourceFile = Item.GetSourceFileName()
Family = Item.GetFamily()
FeatureFlag = Item.GetFeatureFlagExp()
- SupArchList = ConvertArchList(Item.GetSupArchList())
- SupArchList.sort()
+ SupArchList = sorted(ConvertArchList(Item.GetSupArchList()))
Source = SourceFileObject()
Source.SetSourceFile(SourceFile)
Source.SetFamily(Family)
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
index 042d4784c84c..9685799a0f0d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
@@ -194,8 +194,7 @@ def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObj
# can be used for the attribute.
# If both not have VALID_ARCHITECTURE comment and no architecturie specified, then keep it empty.
#
- SupArchList = ConvertArchList(ItemObj.GetSupArchList())
- SupArchList.sort()
+ SupArchList = sorted(ConvertArchList(ItemObj.GetSupArchList()))
if len(SupArchList) == 1 and SupArchList[0] == 'COMMON':
if not (len(OriSupArchList) == 1 or OriSupArchList[0] == 'COMMON'):
SupArchList = OriSupArchList
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 0e1161c96f64..088e22ba098b 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -168,7 +168,7 @@ class StructurePcd(PcdClassObject):
self.validateranges = PcdObject.validateranges if PcdObject.validateranges else self.validateranges
self.validlists = PcdObject.validlists if PcdObject.validlists else self.validlists
self.expressions = PcdObject.expressions if PcdObject.expressions else self.expressions
- if type(PcdObject) is StructurePcd:
+ if isinstance(PcdObject, StructurePcd):
self.StructuredPcdIncludeFile = PcdObject.StructuredPcdIncludeFile if PcdObject.StructuredPcdIncludeFile else self.StructuredPcdIncludeFile
self.PackageDecs = PcdObject.PackageDecs if PcdObject.PackageDecs else self.PackageDecs
self.DefaultValues = PcdObject.DefaultValues if PcdObject.DefaultValues else self.DefaultValues
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index b08bdfbc4f4e..8551a0d8b7e7 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -868,13 +868,13 @@ class DscBuildData(PlatformBuildClassObject):
for pcdname in Pcds:
pcd = Pcds[pcdname]
Pcds[pcdname].SkuInfoList = {"DEFAULT":pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
- if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
+ if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
Pcds[pcdname].SkuOverrideValues = {"DEFAULT":pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
else:
for pcdname in Pcds:
pcd = Pcds[pcdname]
Pcds[pcdname].SkuInfoList = {skuid:pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
- if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
+ if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
return Pcds
def CompleteHiiPcdsDefaultStores(self, Pcds):
@@ -1234,7 +1234,7 @@ class DscBuildData(PlatformBuildClassObject):
File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
# Add the Structure PCD that only defined in DEC, don't have override in DSC file
for Pcd in self.DecPcds:
- if type (self._DecPcds[Pcd]) is StructurePcd:
+ if isinstance(self._DecPcds[Pcd], StructurePcd):
if Pcd not in S_pcd_set:
str_pcd_obj_str = StructurePcd()
str_pcd_obj_str.copy(self._DecPcds[Pcd])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 8ceedf5aec78..b96d027cb19e 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -75,10 +75,10 @@ def ParseMacro(Parser):
#
# First judge whether this DEFINE is in conditional directive statements or not.
#
- if type(self) == DscParser and self._InDirective > -1:
+ if isinstance(self, DscParser) and self._InDirective > -1:
pass
else:
- if type(self) == DecParser:
+ if isinstance(self, DecParser):
if MODEL_META_DATA_HEADER in self._SectionType:
self._FileLocalMacros[Name] = Value
else:
@@ -89,7 +89,7 @@ def ParseMacro(Parser):
self._ConstructSectionMacroDict(Name, Value)
# EDK_GLOBAL defined macros
- elif type(self) != DscParser:
+ elif not isinstance(self, DscParser):
EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL can only be used in .dsc file",
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
elif self._SectionType != MODEL_META_DATA_HEADER:
@@ -230,7 +230,7 @@ class MetaFileParser(object):
# DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
#
def __getitem__(self, DataInfo):
- if type(DataInfo) != type(()):
+ if not isinstance(DataInfo, type(())):
DataInfo = (DataInfo,)
# Parse the file first, if necessary
@@ -272,7 +272,7 @@ class MetaFileParser(object):
TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
self._ValueList[0:len(TokenList)] = TokenList
# Don't do macro replacement for dsc file at this point
- if type(self) != DscParser:
+ if not isinstance(self, DscParser):
Macros = self._Macros
self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
@@ -379,7 +379,7 @@ class MetaFileParser(object):
EdkLogger.error('Parser', FORMAT_INVALID, "Invalid version number",
ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
- if type(self) == InfParser and self._Version < 0x00010005:
+ if isinstance(self, InfParser) and self._Version < 0x00010005:
# EDK module allows using defines as macros
self._FileLocalMacros[Name] = Value
self._Defines[Name] = Value
@@ -395,7 +395,7 @@ class MetaFileParser(object):
self._ValueList[1] = TokenList2[1] # keys
else:
self._ValueList[1] = TokenList[0]
- if len(TokenList) == 2 and type(self) != DscParser: # value
+ if len(TokenList) == 2 and not isinstance(self, DscParser): # value
self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
if self._ValueList[1].count('_') != 4:
@@ -424,7 +424,7 @@ class MetaFileParser(object):
# DecParser SectionType is a list, will contain more than one item only in Pcd Section
# As Pcd section macro usage is not alllowed, so here it is safe
#
- if type(self) == DecParser:
+ if isinstance(self, DecParser):
SectionDictKey = self._SectionType[0], ScopeKey
if SectionDictKey not in self._SectionsMacroDict:
self._SectionsMacroDict[SectionDictKey] = {}
@@ -441,7 +441,7 @@ class MetaFileParser(object):
SpeSpeMacroDict = {}
ActiveSectionType = self._SectionType
- if type(self) == DecParser:
+ if isinstance(self, DecParser):
ActiveSectionType = self._SectionType[0]
for (SectionType, Scope) in self._SectionsMacroDict:
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index aa357e4ed62b..10b480d619c5 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1758,8 +1758,7 @@ class FdRegionReport(object):
for Match in gOffsetGuidPattern.finditer(FvReport):
Guid = Match.group(2).upper()
OffsetInfo[Match.group(1)] = self._GuidsDb.get(Guid, Guid)
- OffsetList = OffsetInfo.keys()
- OffsetList.sort()
+ OffsetList = sorted(OffsetInfo.keys())
for Offset in OffsetList:
FileWrite (File, "%s %s" % (Offset, OffsetInfo[Offset]))
except IOError:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 6fbaad4c0fb6..03983c34beae 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -303,7 +303,7 @@ def LaunchCommand(Command, WorkingDir):
if EndOfProcedure != None:
EndOfProcedure.set()
if Proc == None:
- if type(Command) != type(""):
+ if not isinstance(Command, type("")):
Command = " ".join(Command)
EdkLogger.error("build", COMMAND_FAILURE, "Failed to start command", ExtraData="%s [%s]" % (Command, WorkingDir))
@@ -314,7 +314,7 @@ def LaunchCommand(Command, WorkingDir):
# check the return code of the program
if Proc.returncode != 0:
- if type(Command) != type(""):
+ if not isinstance(Command, type("")):
Command = " ".join(Command)
# print out the Response file and its content when make failure
RespFile = os.path.join(WorkingDir, 'OUTPUT', 'respfilelist.txt')
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 3bf524123d0f..6a805ce51885 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -258,9 +258,9 @@ class SourceFiles:
replaceables = ('extract-dir', 'filename', 'url')
for replaceItem in fdata:
if replaceItem in replaceables: continue
- if type(fdata[replaceItem]) != str: continue
+ if not isinstance(fdata[replaceItem], str): continue
for replaceable in replaceables:
- if type(fdata[replaceable]) != str: continue
+ if not isinstance(fdata[replaceable], str): continue
if replaceable in fdata:
fdata[replaceable] = \
fdata[replaceable].replace(
--
2.16.1
* [PATCH v2 15/20] BaseTools: Replace StringIO.StringIO with io.BytesIO
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (13 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 14/20] BaseTools: Adjust old python2 idioms Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 16/20] BaseTools: Treat GenFds.py and build.py as python modules Gary Lin
` (5 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Replace StringIO.StringIO with io.BytesIO to be compatible with python3.
This commit also removes "import StringIO" from those python scripts
that don't really use it.
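For reference, a minimal sketch of the replacement idiom (illustrative only,
not taken from this patch): io.BytesIO buffers byte strings the same way on
python2 and python3, so code that writes binary data keeps working:

    from io import BytesIO

    Buffer = BytesIO()
    Buffer.write(b'\x00\x01')   # BytesIO expects byte strings on both versions
    Data = Buffer.getvalue()    # b'\x00\x01'
    Buffer.close()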
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Scripts/ConvertUni.py | 5 -----
BaseTools/Source/Python/AutoGen/AutoGen.py | 10 +++++-----
BaseTools/Source/Python/AutoGen/GenDepex.py | 4 ++--
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 4 ++--
BaseTools/Source/Python/AutoGen/IdfClassObject.py | 1 -
BaseTools/Source/Python/AutoGen/StrGather.py | 4 ++--
BaseTools/Source/Python/AutoGen/UniClassObject.py | 6 +++---
BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 4 ++--
BaseTools/Source/Python/BPDG/GenVpd.py | 6 +++---
BaseTools/Source/Python/Eot/FvImage.py | 1 -
BaseTools/Source/Python/GenFds/AprioriSection.py | 4 ++--
BaseTools/Source/Python/GenFds/Capsule.py | 10 +++++-----
BaseTools/Source/Python/GenFds/CapsuleData.py | 4 ++--
BaseTools/Source/Python/GenFds/Fd.py | 6 +++---
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 4 ++--
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 4 ++--
BaseTools/Source/Python/GenFds/Fv.py | 6 +++---
BaseTools/Source/Python/GenFds/FvImageSection.py | 4 ++--
BaseTools/Source/Python/GenFds/GenFds.py | 8 ++++----
BaseTools/Source/Python/GenFds/OptionRom.py | 3 ---
BaseTools/Source/Python/GenFds/Region.py | 11 ++++++-----
BaseTools/Source/Python/Trim/Trim.py | 6 +++---
BaseTools/Source/Python/build/BuildReport.py | 4 ++--
BaseTools/Source/Python/build/build.py | 8 ++++----
24 files changed, 59 insertions(+), 68 deletions(-)
diff --git a/BaseTools/Scripts/ConvertUni.py b/BaseTools/Scripts/ConvertUni.py
index 2af55dfc6702..67bbe41b1f18 100755
--- a/BaseTools/Scripts/ConvertUni.py
+++ b/BaseTools/Scripts/ConvertUni.py
@@ -23,11 +23,6 @@ import codecs
import os
import sys
-try:
- from io import StringIO
-except ImportError:
- from StringIO import StringIO
-
class ConvertOneArg:
"""Converts utf-16 to utf-8 for one command line argument.
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 5af16e6d68d7..2bc925aa1e08 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -24,7 +24,7 @@ import uuid
import GenC
import GenMake
import GenDepex
-from StringIO import StringIO
+from io import BytesIO
from StrGather import *
from BuildEngine import BuildRule
@@ -3603,8 +3603,8 @@ class ModuleAutoGen(AutoGen):
def _GetAutoGenFileList(self):
UniStringAutoGenC = True
IdfStringAutoGenC = True
- UniStringBinBuffer = StringIO()
- IdfGenBinBuffer = StringIO()
+ UniStringBinBuffer = BytesIO()
+ IdfGenBinBuffer = BytesIO()
if self.BuildType == 'UEFI_HII':
UniStringAutoGenC = False
IdfStringAutoGenC = False
@@ -3888,8 +3888,8 @@ class ModuleAutoGen(AutoGen):
except:
EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
- # Use a instance of StringIO to cache data
- fStringIO = StringIO('')
+ # Use a instance of BytesIO to cache data
+ fStringIO = BytesIO('')
for Item in VfrUniOffsetList:
if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 0f6a1700f541..bb516b651266 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -17,7 +17,7 @@ import Common.LongFilePathOs as os
import re
import traceback
from Common.LongFilePathSupport import OpenLongFilePath as open
-from StringIO import StringIO
+from io import BytesIO
from struct import pack
from Common.BuildToolError import *
from Common.Misc import SaveFileOnChange
@@ -344,7 +344,7 @@ class DependencyExpression:
# @retval False If file exists and is not changed.
#
def Generate(self, File=None):
- Buffer = StringIO()
+ Buffer = BytesIO()
if len(self.PostfixNotation) == 0:
return False
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 85e6f44502a2..716ec8ca3f52 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -11,7 +11,7 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
from builtins import range
-from StringIO import StringIO
+from io import BytesIO
from Common.Misc import *
from Common.String import StringToArray
from struct import pack
@@ -976,7 +976,7 @@ def CreatePcdDatabaseCode (Info, AutoGenC, AutoGenH):
DbFileName = os.path.join(Info.PlatformInfo.BuildDir, "FV", Phase + "PcdDataBase.raw")
else:
DbFileName = os.path.join(Info.OutputDir, Phase + "PcdDataBase.raw")
- DbFile = StringIO()
+ DbFile = BytesIO()
DbFile.write(PcdDbBuffer)
Changed = SaveFileOnChange(DbFileName, DbFile.getvalue(), True)
def CreatePcdDataBase(PcdDBData):
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index d6d4703370aa..db1e5ee6a32d 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -14,7 +14,6 @@
# Import Modules
#
import Common.EdkLogger as EdkLogger
-import StringIO
from Common.BuildToolError import *
from Common.String import GetLineNo
from Common.Misc import PathClass
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index 718cd60514b4..b61450c02831 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -19,7 +19,7 @@ import re
import Common.EdkLogger as EdkLogger
from Common.BuildToolError import *
from UniClassObject import *
-from StringIO import StringIO
+from io import BytesIO
from struct import pack, unpack
from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -382,7 +382,7 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
if Language not in UniLanguageListFiltered:
continue
- StringBuffer = StringIO()
+ StringBuffer = BytesIO()
StrStringValue = ''
ArrayLength = 0
NumberOfUseOtherLangDef = 0
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index cab7623bc4e6..5c4ccd7a8b77 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -21,7 +21,7 @@ from builtins import range
import Common.LongFilePathOs as os, codecs, re
import distutils.util
import Common.EdkLogger as EdkLogger
-import StringIO
+from io import BytesIO
from Common.BuildToolError import *
from Common.String import GetLineNo
from Common.Misc import PathClass
@@ -308,7 +308,7 @@ class UniFileClassObject(object):
self.VerifyUcs2Data(FileIn, FileName, Encoding)
- UniFile = StringIO.StringIO(FileIn)
+ UniFile = BytesIO(FileIn)
Info = codecs.lookup(Encoding)
(Reader, Writer) = (Info.streamreader, Info.streamwriter)
return codecs.StreamReaderWriter(UniFile, Reader, Writer)
@@ -322,7 +322,7 @@ class UniFileClassObject(object):
FileDecoded = codecs.decode(FileIn, Encoding)
Ucs2Info.encode(FileDecoded)
except:
- UniFile = StringIO.StringIO(FileIn)
+ UniFile = BytesIO(FileIn)
Info = codecs.lookup(Encoding)
(Reader, Writer) = (Info.streamreader, Info.streamwriter)
File = codecs.StreamReaderWriter(UniFile, Reader, Writer)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index ff355d05d79f..60027390e820 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -19,7 +19,7 @@ from builtins import range
import os
from Common.RangeExpression import RangeExpression
from Common.Misc import *
-from StringIO import StringIO
+from io import BytesIO
from struct import pack
class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
@@ -181,7 +181,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
Buffer += b
realLength += 1
- DbFile = StringIO()
+ DbFile = BytesIO()
if Phase == 'DXE' and os.path.exists(BinFilePath):
BinFile = open(BinFilePath, "rb")
BinBuffer = BinFile.read()
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 1bb37d744ec9..54b2cc54f578 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -15,7 +15,7 @@
from builtins import range
import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
import StringTable as st
import array
import re
@@ -674,8 +674,8 @@ class GenVPD :
# Open failed
EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.MapFileName, None)
- # Use a instance of StringIO to cache data
- fStringIO = StringIO.StringIO('')
+ # Use a instance of BytesIO to cache data
+ fStringIO = BytesIO('')
# Write the header of map file.
try :
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 64a27217e4a8..0a1eca1ed86f 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -24,7 +24,6 @@ import codecs
import copy
from UserDict import IterableUserDict
-from cStringIO import StringIO
from array import array
from Common.LongFilePathSupport import OpenLongFilePath as open
from CommonDataClass import *
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index b678079b3785..65919270af15 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -18,7 +18,7 @@
from builtins import range
from struct import *
import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
import FfsFileStatement
from GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import AprioriSectionClassObject
@@ -51,7 +51,7 @@ class AprioriSection (AprioriSectionClassObject):
def GenFfs (self, FvName, Dict = {}, IsMakefile = False):
DXE_GUID = "FC510EE7-FFDC-11D4-BD41-0080C73C8881"
PEI_GUID = "1B45CC0A-156A-428A-AF62-49864DA0E6E6"
- Buffer = StringIO.StringIO('')
+ Buffer = BytesIO('')
AprioriFileGuid = DXE_GUID
if self.AprioriType == "PEI":
AprioriFileGuid = PEI_GUID
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index e03d78995737..60019195df27 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -19,7 +19,7 @@ from GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import CapsuleClassObject
import Common.LongFilePathOs as os
import subprocess
-import StringIO
+from io import BytesIO
from Common.Misc import SaveFileOnChange
from GenFds import GenFds
from Common.Misc import PackRegistryFormatGuid
@@ -66,7 +66,7 @@ class Capsule (CapsuleClassObject) :
# UINT32 CapsuleImageSize;
# } EFI_CAPSULE_HEADER;
#
- Header = StringIO.StringIO()
+ Header = BytesIO()
#
# Use FMP capsule GUID: 6DCBD5ED-E82D-4C44-BDA1-7194199AD92A
#
@@ -97,7 +97,7 @@ class Capsule (CapsuleClassObject) :
# // UINT64 ItemOffsetList[];
# } EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER;
#
- FwMgrHdr = StringIO.StringIO()
+ FwMgrHdr = BytesIO()
if 'CAPSULE_HEADER_INIT_VERSION' in self.TokensDict:
FwMgrHdr.write(pack('=I', int(self.TokensDict['CAPSULE_HEADER_INIT_VERSION'], 16)))
else:
@@ -132,7 +132,7 @@ class Capsule (CapsuleClassObject) :
#
PreSize = FwMgrHdrSize
- Content = StringIO.StringIO()
+ Content = BytesIO()
for driver in self.CapsuleDataList:
FileName = driver.GenCapsuleSubItem()
FwMgrHdr.write(pack('=Q', PreSize))
@@ -247,7 +247,7 @@ class Capsule (CapsuleClassObject) :
def GenCapInf(self):
self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
self.UiCapsuleName + "_Cap" + '.inf')
- CapInfFile = StringIO.StringIO() #open (self.CapInfFileName , 'w+')
+ CapInfFile = BytesIO() #open (self.CapInfFileName , 'w+')
CapInfFile.writelines("[options]" + T_CHAR_LF)
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index 1fa202149b25..f0a55d81120b 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -17,7 +17,7 @@
#
import Ffs
from GenFdsGlobalVariable import GenFdsGlobalVariable
-import StringIO
+from io import BytesIO
from struct import pack
import os
from Common.Misc import SaveFileOnChange
@@ -82,7 +82,7 @@ class CapsuleFv (CapsuleData):
if self.FvName.find('.fv') == -1:
if self.FvName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
- FdBuffer = StringIO.StringIO('')
+ FdBuffer = BytesIO('')
FvObj.CapsuleName = self.CapsuleName
FvFile = FvObj.AddToBuffer(FdBuffer)
FvObj.CapsuleName = None
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index 21060625217e..acd73f6449f6 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -18,7 +18,7 @@
import Region
import Fv
import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
import sys
from struct import *
from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -74,7 +74,7 @@ class FD(FDClassObject):
HasCapsuleRegion = True
break
if HasCapsuleRegion:
- TempFdBuffer = StringIO.StringIO('')
+ TempFdBuffer = BytesIO('')
PreviousRegionStart = -1
PreviousRegionSize = 1
@@ -103,7 +103,7 @@ class FD(FDClassObject):
GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
- FdBuffer = StringIO.StringIO('')
+ FdBuffer = BytesIO('')
PreviousRegionStart = -1
PreviousRegionSize = 1
for RegionObj in self.RegionList :
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index cbfea730ef18..1293c8a107f0 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -19,7 +19,7 @@ from builtins import range
import Ffs
import Rule
import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
import subprocess
from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -83,7 +83,7 @@ class FileStatement (FileStatementClassObject) :
Dict.update(self.DefineVarDict)
SectionAlignments = None
if self.FvName != None :
- Buffer = StringIO.StringIO('')
+ Buffer = BytesIO('')
if self.FvName.upper() not in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (self.FvName))
Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index 3a781d6d3a97..d6edd1f0971e 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -18,7 +18,7 @@
#
import Rule
import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
from struct import *
from GenFdsGlobalVariable import GenFdsGlobalVariable
import Ffs
@@ -1091,7 +1091,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
def __GenUniVfrOffsetFile(self, VfrUniOffsetList, UniVfrOffsetFileName):
# Use a instance of StringIO to cache data
- fStringIO = StringIO.StringIO('')
+ fStringIO = BytesIO('')
for Item in VfrUniOffsetList:
if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index c64c0c80e299..88a520998eae 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -18,7 +18,7 @@
from builtins import range
import Common.LongFilePathOs as os
import subprocess
-import StringIO
+from io import BytesIO
from struct import *
import Ffs
@@ -268,7 +268,7 @@ class FV (FvClassObject):
#
self.InfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
self.UiFvName + '.inf')
- self.FvInfFile = StringIO.StringIO()
+ self.FvInfFile = BytesIO()
#
# Add [Options]
@@ -427,7 +427,7 @@ class FV (FvClassObject):
#
if TotalSize > 0:
FvExtHeaderFileName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiFvName + '.ext')
- FvExtHeaderFile = StringIO.StringIO()
+ FvExtHeaderFile = BytesIO()
FvExtHeaderFile.write(Buffer)
Changed = SaveFileOnChange(FvExtHeaderFileName, FvExtHeaderFile.getvalue(), True)
FvExtHeaderFile.close()
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index ac5d5891df70..7416ce1b7d8a 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -16,7 +16,7 @@
# Import Modules
#
import Section
-import StringIO
+from io import BytesIO
from Ffs import Ffs
import subprocess
from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -97,7 +97,7 @@ class FvImageSection(FvImageSectionClassObject):
# Generate Fv
#
if self.FvName != None:
- Buffer = StringIO.StringIO('')
+ Buffer = BytesIO('')
Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName)
if Fv != None:
self.Fv = Fv
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index cb8dabbe038d..ebebcd7980e4 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -29,7 +29,7 @@ from Workspace.BuildClassObject import PcdClassObject
from Workspace.BuildClassObject import ModuleBuildClassObject
import RuleComplexFile
from EfiSection import EfiSection
-import StringIO
+from io import BytesIO
import Common.TargetTxtClassObject as TargetTxtClassObject
import Common.ToolDefClassObject as ToolDefClassObject
from Common.DataType import *
@@ -591,13 +591,13 @@ class GenFds :
if GenFds.OnlyGenerateThisFv != None and GenFds.OnlyGenerateThisFv.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(GenFds.OnlyGenerateThisFv.upper())
if FvObj != None:
- Buffer = StringIO.StringIO()
+ Buffer = BytesIO()
FvObj.AddToBuffer(Buffer)
Buffer.close()
return
elif GenFds.OnlyGenerateThisFv == None:
for FvName in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
- Buffer = StringIO.StringIO('')
+ Buffer = BytesIO('')
FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[FvName]
FvObj.AddToBuffer(Buffer)
Buffer.close()
@@ -749,7 +749,7 @@ class GenFds :
def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
- GuidXRefFile = StringIO.StringIO('')
+ GuidXRefFile = BytesIO('')
GuidDict = {}
ModuleList = []
FileGuidList = []
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index 2e61a38c1d33..946cdf812a24 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -17,7 +17,6 @@
#
import Common.LongFilePathOs as os
import subprocess
-import StringIO
import OptRomInfStatement
from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -138,5 +137,3 @@ class OverrideAttribs:
self.PciDeviceId = None
self.PciRevision = None
self.NeedCompress = None
-
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 5b9b203cf475..6ace73abe904 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -18,7 +18,7 @@
from builtins import range
from struct import *
from GenFdsGlobalVariable import GenFdsGlobalVariable
-import StringIO
+from io import BytesIO
import string
from CommonDataClass.FdfClass import RegionClassObject
import Common.LongFilePathOs as os
@@ -127,7 +127,7 @@ class Region(RegionClassObject):
if self.FvAddress % FvAlignValue != 0:
EdkLogger.error("GenFds", GENFDS_ERROR,
"FV (%s) is NOT %s Aligned!" % (FvObj.UiFvName, FvObj.FvAlignment))
- FvBuffer = StringIO.StringIO('')
+ FvBuffer = BytesIO('')
FvBaseAddress = '0x%X' % self.FvAddress
BlockSize = None
BlockNum = None
@@ -135,7 +135,8 @@ class Region(RegionClassObject):
if Flag:
continue
- if FvBuffer.len > Size:
+ FvBufferLen = len(FvBuffer.getvalue())
+ if FvBufferLen > Size:
FvBuffer.close()
EdkLogger.error("GenFds", GENFDS_ERROR,
"Size of FV (%s) is larger than Region Size 0x%X specified." % (RegionData, Size))
@@ -144,8 +145,8 @@ class Region(RegionClassObject):
#
Buffer.write(FvBuffer.getvalue())
FvBuffer.close()
- FvOffset = FvOffset + FvBuffer.len
- Size = Size - FvBuffer.len
+ FvOffset = FvOffset + FvBufferLen
+ Size = Size - FvBufferLen
continue
else:
EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (RegionData))
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index af1bf9de3e00..87edfbe31fbf 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -18,7 +18,7 @@ from builtins import range
import Common.LongFilePathOs as os
import sys
import re
-import StringIO
+from io import BytesIO
from optparse import OptionParser
from optparse import make_option
@@ -455,8 +455,8 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
except:
EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
- # Use a instance of StringIO to cache data
- fStringIO = StringIO.StringIO('')
+ # Use a instance of BytesIO to cache data
+ fStringIO = BytesIO('')
for Item in VfrUniOffsetList:
if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 10b480d619c5..be57bcd40ba8 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -28,7 +28,7 @@ import hashlib
import subprocess
import threading
from datetime import datetime
-from StringIO import StringIO
+from io import BytesIO
from Common import EdkLogger
from Common.Misc import SaveFileOnChange
from Common.Misc import GuidStructureByteArrayToGuidString
@@ -2062,7 +2062,7 @@ class BuildReport(object):
def GenerateReport(self, BuildDuration, AutoGenTime, MakeTime, GenFdsTime):
if self.ReportFile:
try:
- File = StringIO('')
+ File = BytesIO('')
for (Wa, MaList) in self.ReportList:
PlatformReport(Wa, MaList, self.ReportType).GenerateReport(File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, self.ReportType)
Content = FileLinesSplit(File.getvalue(), gLineMaxLength)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 03983c34beae..66b46fab5c26 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -19,7 +19,7 @@
from __future__ import print_function
import Common.LongFilePathOs as os
import re
-import StringIO
+from io import BytesIO
import sys
import glob
import time
@@ -1784,7 +1784,7 @@ class Build():
if not Ma.IsLibrary:
ModuleList[Ma.Guid.upper()] = Ma
- MapBuffer = StringIO('')
+ MapBuffer = BytesIO('')
if self.LoadFixAddress != 0:
#
# Rebase module to the preferred memory address before GenFds
@@ -1934,7 +1934,7 @@ class Build():
if not Ma.IsLibrary:
ModuleList[Ma.Guid.upper()] = Ma
- MapBuffer = StringIO('')
+ MapBuffer = BytesIO('')
if self.LoadFixAddress != 0:
#
# Rebase module to the preferred memory address before GenFds
@@ -2122,7 +2122,7 @@ class Build():
#
# Rebase module to the preferred memory address before GenFds
#
- MapBuffer = StringIO('')
+ MapBuffer = BytesIO('')
if self.LoadFixAddress != 0:
self._CollectModuleMapBuffer(MapBuffer, ModuleList)
--
2.16.1
* [PATCH v2 16/20] BaseTools: Treat GenFds.py and build.py as python modules
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (14 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 15/20] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 17/20] BaseTools: Adopt absolute import for python scripts Gary Lin
` (4 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Since GenFds.py and build.py import modules from their own directories, add
"-m" to the python parameters so that they can import their own modules.
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/BinWrappers/PosixLike/GenFds | 2 +-
BaseTools/BinWrappers/PosixLike/build | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/BaseTools/BinWrappers/PosixLike/GenFds b/BaseTools/BinWrappers/PosixLike/GenFds
index 214d88fff1b1..ba45303dee74 100755
--- a/BaseTools/BinWrappers/PosixLike/GenFds
+++ b/BaseTools/BinWrappers/PosixLike/GenFds
@@ -11,4 +11,4 @@ dir=$(dirname "$full_cmd")
cmd=${full_cmd##*/}
export PYTHONPATH="$dir/../../Source/Python"
-exec "${python_exe:-python}" "$dir/../../Source/Python/$cmd/$cmd.py" "$@"
+exec "${python_exe:-python}" -m $cmd.$cmd "$@"
diff --git a/BaseTools/BinWrappers/PosixLike/build b/BaseTools/BinWrappers/PosixLike/build
index 214d88fff1b1..ba45303dee74 100755
--- a/BaseTools/BinWrappers/PosixLike/build
+++ b/BaseTools/BinWrappers/PosixLike/build
@@ -11,4 +11,4 @@ dir=$(dirname "$full_cmd")
cmd=${full_cmd##*/}
export PYTHONPATH="$dir/../../Source/Python"
-exec "${python_exe:-python}" "$dir/../../Source/Python/$cmd/$cmd.py" "$@"
+exec "${python_exe:-python}" -m $cmd.$cmd "$@"
--
2.16.1
* [PATCH v2 17/20] BaseTools: Adopt absolute import for python scripts
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (15 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 16/20] BaseTools: Treat GenFds.py and build.py as python modules Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 18/20] BaseTools: Move OverrideAttribs to OptRomInfStatement.py Gary Lin
` (3 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Based on "futurize -f libfuturize.fixes.fix_absolute_import"
Since circular imports are not allowed in python 3, the following changes
are applied to break the circles (a minimal sketch of the deferred-import
pattern is shown after the list).
* BaseTools/Source/Python/GenFds/Capsule.py
- Delay "from .GenFds import GenFds" until GenCapsule()
- Delay "from .GenFds import FindExtendTool" until GenFmpCapsule()
To break the circle:
AutoGen.AutoGen => GenFds.FdfParser => GenFds.Capsule => GenFds.GenFds =>
GenFds.FdfParser
* BaseTools/Source/Python/GenFds/Fd.py
- Delay "from .GenFds import GenFds" until GenFd()
To break the circle:
AutoGen.AutoGen => GenFds.FdfParser => GenFds.Fd => GenFds.GenFds =>
GenFds.FdfParser
* BaseTools/Source/Python/GenFds/Fv.py
- Delay "from .GenFds import GenFds" until AddToBuffer()
To break the circle:
AutoGen.AutoGen => GenFds.FdfParser => GenFds.Fd => GenFds.Fv =>
GenFds.GenFds => GenFds.FdfParser
* BaseTools/Source/Python/GenFds/GuidSection.py
- Delay "from .GenFds import FindExtendTool" until GuidSection()
To break the circle:
AutoGen.AutoGen => GenFds.FdfParser => GenFds.Fd => GenFds.Fv =>
GenFds.AprioriSection => GenFds.FfsFileStatement => GenFds.GuidSection =>
GenFds.GenFds => GenFds.FdfParser
* BaseTools/Source/Python/GenFds/OptRomInfStatement.py
- Delay "from . import OptionRom" until __GetOptRomParams()
To break the circle:
AutoGen.AutoGen => GenFds.FdfParser => GenFds.OptionRom =>
GenFds.OptRomInfStatement => GenFds.OptionRom
* BaseTools/Source/Python/GenFds/OptionRom.py
- Remove the unused "from .GenFds import GenFds"
To break the circle:
AutoGen.AutoGen => GenFds.FdfParser => GenFds.OptionRom =>
GenFds.GenFds => GenFds.FdfParser
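A minimal sketch of the deferred-import pattern described above (module names
are taken from this commit message; the real method bodies differ):

    from __future__ import absolute_import

    class Capsule(object):
        def GenCapsule(self):
            # Resolved only when GenCapsule() runs, i.e. after GenFds.GenFds
            # has finished loading, so the import-time cycle is broken.
            from .GenFds import GenFds
            # ... use GenFds here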
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/AutoGen/AutoGen.py | 17 +++---
BaseTools/Source/Python/AutoGen/GenC.py | 7 +--
BaseTools/Source/Python/AutoGen/GenMake.py | 3 +-
BaseTools/Source/Python/AutoGen/GenPcdDb.py | 7 +--
BaseTools/Source/Python/AutoGen/StrGather.py | 3 +-
BaseTools/Source/Python/BPDG/BPDG.py | 5 +-
BaseTools/Source/Python/BPDG/GenVpd.py | 3 +-
BaseTools/Source/Python/Common/Database.py | 8 +--
BaseTools/Source/Python/Common/DecClassObject.py | 17 +++---
BaseTools/Source/Python/Common/Dictionary.py | 5 +-
BaseTools/Source/Python/Common/DscClassObject.py | 21 ++++----
BaseTools/Source/Python/Common/EdkIIWorkspace.py | 3 +-
BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py | 19 +++----
BaseTools/Source/Python/Common/EdkLogger.py | 3 +-
BaseTools/Source/Python/Common/Expression.py | 3 +-
BaseTools/Source/Python/Common/FdfClassObject.py | 5 +-
BaseTools/Source/Python/Common/InfClassObject.py | 21 ++++----
BaseTools/Source/Python/Common/LongFilePathOs.py | 3 +-
BaseTools/Source/Python/Common/MigrationUtilities.py | 3 +-
BaseTools/Source/Python/Common/Misc.py | 9 ++--
BaseTools/Source/Python/Common/Parsing.py | 5 +-
BaseTools/Source/Python/Common/String.py | 9 ++--
BaseTools/Source/Python/Common/TargetTxtClassObject.py | 9 ++--
BaseTools/Source/Python/Common/ToolDefClassObject.py | 9 ++--
BaseTools/Source/Python/CommonDataClass/ModuleClass.py | 3 +-
BaseTools/Source/Python/CommonDataClass/PackageClass.py | 3 +-
BaseTools/Source/Python/CommonDataClass/PlatformClass.py | 3 +-
BaseTools/Source/Python/Ecc/CParser.py | 5 +-
BaseTools/Source/Python/Ecc/Check.py | 9 ++--
BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 13 ++---
BaseTools/Source/Python/Ecc/Database.py | 7 +--
BaseTools/Source/Python/Ecc/Ecc.py | 25 ++++-----
BaseTools/Source/Python/Ecc/Exception.py | 3 +-
BaseTools/Source/Python/Ecc/FileProfile.py | 5 +-
BaseTools/Source/Python/Ecc/MetaDataParser.py | 5 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 3 +-
BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py | 5 +-
BaseTools/Source/Python/Ecc/c.py | 13 ++---
BaseTools/Source/Python/Eot/CParser.py | 5 +-
BaseTools/Source/Python/Eot/CodeFragmentCollector.py | 11 ++--
BaseTools/Source/Python/Eot/Eot.py | 15 +++---
BaseTools/Source/Python/Eot/FileProfile.py | 3 +-
BaseTools/Source/Python/Eot/FvImage.py | 11 ++--
BaseTools/Source/Python/Eot/InfParserLite.py | 5 +-
BaseTools/Source/Python/Eot/Parser.py | 3 +-
BaseTools/Source/Python/Eot/Report.py | 3 +-
BaseTools/Source/Python/Eot/c.py | 9 ++--
BaseTools/Source/Python/GenFds/AprioriSection.py | 5 +-
BaseTools/Source/Python/GenFds/Capsule.py | 7 +--
BaseTools/Source/Python/GenFds/CapsuleData.py | 5 +-
BaseTools/Source/Python/GenFds/CompressSection.py | 7 +--
BaseTools/Source/Python/GenFds/DataSection.py | 7 +--
BaseTools/Source/Python/GenFds/DepexSection.py | 7 +--
BaseTools/Source/Python/GenFds/EfiSection.py | 7 +--
BaseTools/Source/Python/GenFds/Fd.py | 9 ++--
BaseTools/Source/Python/GenFds/FdfParser.py | 55 ++++++++++----------
BaseTools/Source/Python/GenFds/FfsFileStatement.py | 11 ++--
BaseTools/Source/Python/GenFds/FfsInfStatement.py | 19 +++----
BaseTools/Source/Python/GenFds/Fv.py | 12 ++---
BaseTools/Source/Python/GenFds/FvImageSection.py | 7 +--
BaseTools/Source/Python/GenFds/GenFds.py | 11 ++--
BaseTools/Source/Python/GenFds/GuidSection.py | 11 ++--
BaseTools/Source/Python/GenFds/OptRomFileStatement.py | 3 +-
BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 16 +++---
BaseTools/Source/Python/GenFds/OptionRom.py | 6 +--
BaseTools/Source/Python/GenFds/Region.py | 3 +-
BaseTools/Source/Python/GenFds/RuleComplexFile.py | 3 +-
BaseTools/Source/Python/GenFds/RuleSimpleFile.py | 3 +-
BaseTools/Source/Python/GenFds/Section.py | 3 +-
BaseTools/Source/Python/GenFds/UiSection.py | 7 +--
BaseTools/Source/Python/GenFds/VerSection.py | 7 +--
BaseTools/Source/Python/GenFds/Vtf.py | 3 +-
BaseTools/Source/Python/Table/TableDataModel.py | 3 +-
BaseTools/Source/Python/Table/TableDec.py | 3 +-
BaseTools/Source/Python/Table/TableDsc.py | 3 +-
BaseTools/Source/Python/Table/TableEotReport.py | 5 +-
BaseTools/Source/Python/Table/TableFdf.py | 3 +-
BaseTools/Source/Python/Table/TableFile.py | 3 +-
BaseTools/Source/Python/Table/TableFunction.py | 3 +-
BaseTools/Source/Python/Table/TableIdentifier.py | 5 +-
BaseTools/Source/Python/Table/TableInf.py | 3 +-
BaseTools/Source/Python/Table/TablePcd.py | 5 +-
BaseTools/Source/Python/Table/TableQuery.py | 3 +-
BaseTools/Source/Python/Table/TableReport.py | 3 +-
BaseTools/Source/Python/UPT/Library/Parsing.py | 3 +-
BaseTools/Source/Python/Workspace/DscBuildData.py | 9 ++--
BaseTools/Source/Python/Workspace/InfBuildData.py | 3 +-
BaseTools/Source/Python/Workspace/MetaFileParser.py | 5 +-
BaseTools/Source/Python/Workspace/MetaFileTable.py | 5 +-
BaseTools/Source/Python/Workspace/WorkspaceCommon.py | 3 +-
BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 7 +--
BaseTools/Source/Python/build/build.py | 3 +-
92 files changed, 381 insertions(+), 297 deletions(-)
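The recurring change in the diffs below is the switch from python2
implicit relative imports ("import Foo") to explicit relative imports
("from . import Foo"), together with "from __future__ import
absolute_import" so python2 resolves imports the same way python3
does. A minimal sketch with a hypothetical two-module package (names
are illustrative only, not BaseTools modules):

    # pkg/a.py
    from __future__ import absolute_import
    from . import b          # was "import b", an implicit relative import

    def use_b():
        return b.VALUE

    # pkg/b.py
    VALUE = 42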
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 2bc925aa1e08..6e8c3cc10c1e 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -14,6 +14,7 @@
## Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
@@ -21,13 +22,13 @@ import os.path as path
import copy
import uuid
-import GenC
-import GenMake
-import GenDepex
+from . import GenC
+from . import GenMake
+from . import GenDepex
from io import BytesIO
-from StrGather import *
-from BuildEngine import BuildRule
+from .StrGather import *
+from .BuildEngine import BuildRule
from Common.LongFilePathSupport import CopyLongFilePath
from Common.BuildToolError import *
@@ -40,13 +41,13 @@ from CommonDataClass.CommonClass import SkuInfoClass
from Workspace.BuildClassObject import *
from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile
import Common.VpdInfoFile as VpdInfoFile
-from GenPcdDb import CreatePcdDatabaseCode
+from .GenPcdDb import CreatePcdDatabaseCode
from Workspace.MetaFileCommentParser import UsageList
from Common.MultipleWorkspace import MultipleWorkspace as mws
-import InfSectionParser
+from . import InfSectionParser
import datetime
import hashlib
-from GenVar import VariableMgr, var_info
+from .GenVar import VariableMgr, var_info
## Regular expression for splitting Dependency Expression string into tokens
gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index d68160deb4a1..adf00d9a9514 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -13,6 +13,7 @@
## Import Modules
#
+from __future__ import absolute_import
from builtins import range
import string
import collections
@@ -23,9 +24,9 @@ from Common.BuildToolError import *
from Common.DataType import *
from Common.Misc import *
from Common.String import StringToArray
-from StrGather import *
-from GenPcdDb import CreatePcdDatabaseCode
-from IdfClassObject import *
+from .StrGather import *
+from .GenPcdDb import CreatePcdDatabaseCode
+from .IdfClassObject import *
## PCD type string
gItemTypeStringDatabase = {
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index eb56d0e7c5a3..a2a6cb2ac224 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -13,6 +13,7 @@
## Import Modules
#
+from __future__ import absolute_import
import Common.LongFilePathOs as os
import sys
import string
@@ -23,7 +24,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
from Common.BuildToolError import *
from Common.Misc import *
from Common.String import *
-from BuildEngine import *
+from .BuildEngine import *
import Common.GlobalData as GlobalData
## Regular expression for finding header file inclusions
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 716ec8ca3f52..476f5391ffdc 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -10,14 +10,15 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
from builtins import range
from io import BytesIO
from Common.Misc import *
from Common.String import StringToArray
from struct import pack
-from ValidCheckingInfoObject import VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER
-from ValidCheckingInfoObject import VAR_CHECK_PCD_VARIABLE_TAB
-from ValidCheckingInfoObject import VAR_VALID_OBJECT_FACTORY
+from .ValidCheckingInfoObject import VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER
+from .ValidCheckingInfoObject import VAR_CHECK_PCD_VARIABLE_TAB
+from .ValidCheckingInfoObject import VAR_VALID_OBJECT_FACTORY
from Common.VariableAttributes import VariableAttributes
import copy
from struct import unpack
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index b61450c02831..f1773d99e4dd 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -14,11 +14,12 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import re
import Common.EdkLogger as EdkLogger
from Common.BuildToolError import *
-from UniClassObject import *
+from .UniClassObject import *
from io import BytesIO
from struct import pack, unpack
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index 9ab13a39e8bf..8388915d8637 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -21,6 +21,7 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os
import sys
import encodings.ascii
@@ -30,8 +31,8 @@ from Common import EdkLogger
from Common.BuildToolError import *
from Common.BuildVersion import gBUILD_VERSION
-import StringTable as st
-import GenVpd
+from . import StringTable as st
+from . import GenVpd
PROJECT_NAME = st.LBL_BPDG_LONG_UNI
VERSION = (st.LBL_BPDG_VERSION + " Build " + gBUILD_VERSION)
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 54b2cc54f578..bbccdc780fb3 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -13,10 +13,11 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
from io import BytesIO
-import StringTable as st
+from . import StringTable as st
import array
import re
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/Common/Database.py b/BaseTools/Source/Python/Common/Database.py
index a81a44731f03..c281c199effd 100644
--- a/BaseTools/Source/Python/Common/Database.py
+++ b/BaseTools/Source/Python/Common/Database.py
@@ -14,13 +14,14 @@
##
# Import Modules
#
+from __future__ import absolute_import
import sqlite3
import Common.LongFilePathOs as os
-import EdkLogger as EdkLogger
+from . import EdkLogger as EdkLogger
from CommonDataClass.DataClass import *
-from String import *
-from DataType import *
+from .String import *
+from .DataType import *
from Table.TableDataModel import TableDataModel
from Table.TableFile import TableFile
@@ -117,4 +118,3 @@ if __name__ == '__main__':
Db.QueryTable(Db.TblFile)
Db.QueryTable(Db.TblDsc)
Db.Close()
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/Common/DecClassObject.py b/BaseTools/Source/Python/Common/DecClassObject.py
index 970e644318d0..71cd2d33000e 100644
--- a/BaseTools/Source/Python/Common/DecClassObject.py
+++ b/BaseTools/Source/Python/Common/DecClassObject.py
@@ -15,18 +15,19 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os
-from String import *
-from DataType import *
-from Identification import *
-from Dictionary import *
+from .String import *
+from .DataType import *
+from .Identification import *
+from .Dictionary import *
from CommonDataClass.PackageClass import *
from CommonDataClass.CommonClass import PcdClass
-from BuildToolError import *
+from .BuildToolError import *
from Table.TableDec import TableDec
-import Database
-from Parsing import *
-import GlobalData
+from . import Database
+from .Parsing import *
+from . import GlobalData
from Common.LongFilePathSupport import OpenLongFilePath as open
#
diff --git a/BaseTools/Source/Python/Common/Dictionary.py b/BaseTools/Source/Python/Common/Dictionary.py
index c381995f97ff..149081be2831 100644
--- a/BaseTools/Source/Python/Common/Dictionary.py
+++ b/BaseTools/Source/Python/Common/Dictionary.py
@@ -15,8 +15,9 @@
# Import Modules
#
from __future__ import print_function
-import EdkLogger
-from DataType import *
+from __future__ import absolute_import
+from . import EdkLogger
+from .DataType import *
from Common.LongFilePathSupport import OpenLongFilePath as open
## Convert a text file to a dictionary
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index e6abc1f036ac..d4ebddadc74a 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -15,20 +15,21 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
-import EdkLogger as EdkLogger
-import Database
-from String import *
-from Parsing import *
-from DataType import *
-from Identification import *
-from Dictionary import *
+from . import EdkLogger as EdkLogger
+from . import Database
+from .String import *
+from .Parsing import *
+from .DataType import *
+from .Identification import *
+from .Dictionary import *
from CommonDataClass.PlatformClass import *
from CommonDataClass.CommonClass import SkuInfoClass
-from BuildToolError import *
-from Misc import sdict
-import GlobalData
+from .BuildToolError import *
+from .Misc import sdict
+from . import GlobalData
from Table.TableDsc import TableDsc
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspace.py b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
index 52f63ae53df8..4f7e69ca5ef2 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspace.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
@@ -15,8 +15,9 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os, sys, time
-from DataType import *
+from .DataType import *
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py b/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
index a2f7c94c1ca7..2b4dc720cea9 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
@@ -15,16 +15,17 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os, string, copy, pdb, copy
-import EdkLogger
-import DataType
-from InfClassObject import *
-from DecClassObject import *
-from DscClassObject import *
-from String import *
-from BuildToolError import *
-from Misc import sdict
-import Database as Database
+from . import EdkLogger
+from . import DataType
+from .InfClassObject import *
+from .DecClassObject import *
+from .DscClassObject import *
+from .String import *
+from .BuildToolError import *
+from .Misc import sdict
+from . import Database as Database
import time as time
## PcdClassObject
diff --git a/BaseTools/Source/Python/Common/EdkLogger.py b/BaseTools/Source/Python/Common/EdkLogger.py
index ac1c8edc4fe2..636ac908c320 100644
--- a/BaseTools/Source/Python/Common/EdkLogger.py
+++ b/BaseTools/Source/Python/Common/EdkLogger.py
@@ -12,9 +12,10 @@
#
## Import modules
+from __future__ import absolute_import
import Common.LongFilePathOs as os, sys, logging
import traceback
-from BuildToolError import *
+from .BuildToolError import *
## Log level constants
DEBUG_0 = 1
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 3b5afdc9ab06..afe80e690877 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -13,11 +13,12 @@
## Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
from Common.GlobalData import *
from CommonDataClass.Exceptions import BadExpression
from CommonDataClass.Exceptions import WrnExpression
-from Misc import GuidStringToGuidStructureString, ParseFieldValue
+from .Misc import GuidStringToGuidStructureString, ParseFieldValue
import Common.EdkLogger as EdkLogger
import copy
diff --git a/BaseTools/Source/Python/Common/FdfClassObject.py b/BaseTools/Source/Python/Common/FdfClassObject.py
index 7ec0235967b2..b0ab8010343a 100644
--- a/BaseTools/Source/Python/Common/FdfClassObject.py
+++ b/BaseTools/Source/Python/Common/FdfClassObject.py
@@ -14,11 +14,12 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
-from FdfParserLite import FdfParser
+from .FdfParserLite import FdfParser
from Table.TableFdf import TableFdf
from CommonDataClass.DataClass import MODEL_FILE_FDF, MODEL_PCD, MODEL_META_DATA_COMPONENT
-from String import NormPath
+from .String import NormPath
## FdfObject
#
diff --git a/BaseTools/Source/Python/Common/InfClassObject.py b/BaseTools/Source/Python/Common/InfClassObject.py
index fe82ffd8eb4e..54f99c83910e 100644
--- a/BaseTools/Source/Python/Common/InfClassObject.py
+++ b/BaseTools/Source/Python/Common/InfClassObject.py
@@ -15,21 +15,22 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os
import re
-import EdkLogger
+from . import EdkLogger
from CommonDataClass.CommonClass import LibraryClassClass
from CommonDataClass.ModuleClass import *
-from String import *
-from DataType import *
-from Identification import *
-from Dictionary import *
-from BuildToolError import *
-from Misc import sdict
-import GlobalData
+from .String import *
+from .DataType import *
+from .Identification import *
+from .Dictionary import *
+from .BuildToolError import *
+from .Misc import sdict
+from . import GlobalData
from Table.TableInf import TableInf
-import Database
-from Parsing import *
+from . import Database
+from .Parsing import *
from Common.LongFilePathSupport import OpenLongFilePath as open
#
diff --git a/BaseTools/Source/Python/Common/LongFilePathOs.py b/BaseTools/Source/Python/Common/LongFilePathOs.py
index 47d63faeb995..a32205545368 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOs.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOs.py
@@ -11,8 +11,9 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
import os
-import LongFilePathOsPath
+from . import LongFilePathOsPath
from Common.LongFilePathSupport import LongFilePath
from Common.LongFilePathSupport import UniToStr
diff --git a/BaseTools/Source/Python/Common/MigrationUtilities.py b/BaseTools/Source/Python/Common/MigrationUtilities.py
index 2385988247d4..88038046835a 100644
--- a/BaseTools/Source/Python/Common/MigrationUtilities.py
+++ b/BaseTools/Source/Python/Common/MigrationUtilities.py
@@ -14,10 +14,11 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
-import EdkLogger
+from . import EdkLogger
from optparse import OptionParser
from Common.BuildToolError import *
from XmlRoutines import *
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 1a7418734cf8..5d7b0cf9ddc5 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import sys
@@ -31,10 +32,10 @@ from UserList import UserList
from Common import EdkLogger as EdkLogger
from Common import GlobalData as GlobalData
-from DataType import *
-from BuildToolError import *
+from .DataType import *
+from .BuildToolError import *
from CommonDataClass.DataClass import *
-from Parsing import GetSplitValueList
+from .Parsing import GetSplitValueList
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
import uuid
@@ -512,7 +513,7 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
try:
if GlobalData.gIsWindows:
try:
- from PyUtility import SaveFileToDisk
+ from .PyUtility import SaveFileToDisk
if not SaveFileToDisk(File, Content):
EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData=File)
except:
diff --git a/BaseTools/Source/Python/Common/Parsing.py b/BaseTools/Source/Python/Common/Parsing.py
index 9caa9424d8ed..ebbb9f564320 100644
--- a/BaseTools/Source/Python/Common/Parsing.py
+++ b/BaseTools/Source/Python/Common/Parsing.py
@@ -14,10 +14,11 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
-from String import *
+from .String import *
from CommonDataClass.DataClass import *
-from DataType import *
+from .DataType import *
## ParseDefineMacro
#
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index d2ec46d84eb8..d5932b367d69 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -14,15 +14,16 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import re
-import DataType
+from . import DataType
import Common.LongFilePathOs as os
import string
-import EdkLogger as EdkLogger
+from . import EdkLogger as EdkLogger
-import GlobalData
-from BuildToolError import *
+from . import GlobalData
+from .BuildToolError import *
from CommonDataClass.Exceptions import *
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 9c1e6b407356..b4ad4f1d290a 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -15,11 +15,12 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os
-import EdkLogger
-import DataType
-from BuildToolError import *
-import GlobalData
+from . import EdkLogger
+from . import DataType
+from .BuildToolError import *
+from . import GlobalData
from Common.LongFilePathSupport import OpenLongFilePath as open
gDefaultTargetTxtFile = "target.txt"
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index d3587b171192..639ca176695c 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -14,14 +14,15 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
-import EdkLogger
+from . import EdkLogger
-from Dictionary import *
-from BuildToolError import *
-from TargetTxtClassObject import *
+from .Dictionary import *
+from .BuildToolError import *
+from .TargetTxtClassObject import *
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.Misc import PathClass
from Common.String import NormPath
diff --git a/BaseTools/Source/Python/CommonDataClass/ModuleClass.py b/BaseTools/Source/Python/CommonDataClass/ModuleClass.py
index c5ea15af5b97..54cc3e92e108 100644
--- a/BaseTools/Source/Python/CommonDataClass/ModuleClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/ModuleClass.py
@@ -13,7 +13,8 @@
##
# Import Modules
#
-from CommonClass import *
+from __future__ import absolute_import
+from .CommonClass import *
## ModuleHeaderClass
#
diff --git a/BaseTools/Source/Python/CommonDataClass/PackageClass.py b/BaseTools/Source/Python/CommonDataClass/PackageClass.py
index 89d4d0797fe1..d2ef29aea312 100644
--- a/BaseTools/Source/Python/CommonDataClass/PackageClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/PackageClass.py
@@ -13,7 +13,8 @@
##
# Import Modules
#
-from CommonClass import *
+from __future__ import absolute_import
+from .CommonClass import *
from Common.Misc import sdict
## PackageHeaderClass
diff --git a/BaseTools/Source/Python/CommonDataClass/PlatformClass.py b/BaseTools/Source/Python/CommonDataClass/PlatformClass.py
index a93d1ce2a1db..ee656e4d55e2 100644
--- a/BaseTools/Source/Python/CommonDataClass/PlatformClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/PlatformClass.py
@@ -13,7 +13,8 @@
##
# Import Modules
#
-from CommonClass import *
+from __future__ import absolute_import
+from .CommonClass import *
## SkuInfoListClass
#
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index bd4f10e1edff..02d70e73cbb8 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -1,4 +1,5 @@
from __future__ import print_function
+from __future__ import absolute_import
# $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
from antlr3 import *
@@ -23,8 +24,8 @@ from antlr3.compat import set, frozenset
#
##
-import CodeFragment
-import FileProfile
+from . import CodeFragment
+from . import FileProfile
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index 92259999853c..339a6ddca471 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -10,15 +10,16 @@
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
from CommonDataClass.DataClass import *
import Common.DataType as DT
-from EccToolError import *
-from MetaDataParser import ParseHeaderCommentSection
-import EccGlobalData
-import c
+from .EccToolError import *
+from .MetaDataParser import ParseHeaderCommentSection
+from . import EccGlobalData
+from . import c
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index 7bdb3cc3aea5..f96ea2fec6d2 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -17,18 +17,19 @@
#
from __future__ import print_function
+from __future__ import absolute_import
import re
import Common.LongFilePathOs as os
import sys
import antlr3
-from CLexer import CLexer
-from CParser import CParser
+from .CLexer import CLexer
+from .CParser import CParser
-import FileProfile
-from CodeFragment import Comment
-from CodeFragment import PP_Directive
-from ParserWarning import Warning
+from . import FileProfile
+from .CodeFragment import Comment
+from .CodeFragment import PP_Directive
+from .ParserWarning import Warning
##define T_CHAR_SPACE ' '
diff --git a/BaseTools/Source/Python/Ecc/Database.py b/BaseTools/Source/Python/Ecc/Database.py
index 204117512452..34f49f3cba8b 100644
--- a/BaseTools/Source/Python/Ecc/Database.py
+++ b/BaseTools/Source/Python/Ecc/Database.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import absolute_import
import sqlite3
import Common.LongFilePathOs as os, time
@@ -26,9 +27,9 @@ from Table.TableFunction import TableFunction
from Table.TablePcd import TablePcd
from Table.TableIdentifier import TableIdentifier
from Table.TableReport import TableReport
-from MetaFileWorkspace.MetaFileTable import ModuleTable
-from MetaFileWorkspace.MetaFileTable import PackageTable
-from MetaFileWorkspace.MetaFileTable import PlatformTable
+from .MetaFileWorkspace.MetaFileTable import ModuleTable
+from .MetaFileWorkspace.MetaFileTable import PackageTable
+from .MetaFileWorkspace.MetaFileTable import PlatformTable
from Table.TableFdf import TableFdf
##
diff --git a/BaseTools/Source/Python/Ecc/Ecc.py b/BaseTools/Source/Python/Ecc/Ecc.py
index 94f9a427e370..46c79f169334 100644
--- a/BaseTools/Source/Python/Ecc/Ecc.py
+++ b/BaseTools/Source/Python/Ecc/Ecc.py
@@ -14,14 +14,15 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.LongFilePathOs as os, time, glob, sys
import Common.EdkLogger as EdkLogger
-import Database
-import EccGlobalData
-from MetaDataParser import *
+from . import Database
+from . import EccGlobalData
+from .MetaDataParser import *
from optparse import OptionParser
-from Configuration import Configuration
-from Check import Check
+from .Configuration import Configuration
+from .Check import Check
import Common.GlobalData as GlobalData
from Common.String import NormPath
@@ -29,14 +30,14 @@ from Common.BuildVersion import gBUILD_VERSION
from Common import BuildToolError
from Common.Misc import PathClass
from Common.Misc import DirCache
-from MetaFileWorkspace.MetaFileParser import DscParser
-from MetaFileWorkspace.MetaFileParser import DecParser
-from MetaFileWorkspace.MetaFileParser import InfParser
-from MetaFileWorkspace.MetaFileParser import Fdf
-from MetaFileWorkspace.MetaFileTable import MetaFileStorage
-import c
+from .MetaFileWorkspace.MetaFileParser import DscParser
+from .MetaFileWorkspace.MetaFileParser import DecParser
+from .MetaFileWorkspace.MetaFileParser import InfParser
+from .MetaFileWorkspace.MetaFileParser import Fdf
+from .MetaFileWorkspace.MetaFileTable import MetaFileStorage
+from . import c
import re, string
-from Exception import *
+from .Exception import *
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.MultipleWorkspace import MultipleWorkspace as mws
diff --git a/BaseTools/Source/Python/Ecc/Exception.py b/BaseTools/Source/Python/Ecc/Exception.py
index bde41c3a4b57..ba43ebe03fdb 100644
--- a/BaseTools/Source/Python/Ecc/Exception.py
+++ b/BaseTools/Source/Python/Ecc/Exception.py
@@ -15,7 +15,8 @@
# Import Modules
#
from __future__ import print_function
-from Xml.XmlRoutines import *
+from __future__ import absolute_import
+from .Xml.XmlRoutines import *
import Common.LongFilePathOs as os
# ExceptionXml to parse Exception Node of XML file
diff --git a/BaseTools/Source/Python/Ecc/FileProfile.py b/BaseTools/Source/Python/Ecc/FileProfile.py
index f31d37ff9683..6f93cbf60479 100644
--- a/BaseTools/Source/Python/Ecc/FileProfile.py
+++ b/BaseTools/Source/Python/Ecc/FileProfile.py
@@ -16,9 +16,10 @@
# Import Modules
#
+from __future__ import absolute_import
import re
import Common.LongFilePathOs as os
-from ParserWarning import Warning
+from .ParserWarning import Warning
from Common.LongFilePathSupport import OpenLongFilePath as open
CommentList = []
@@ -54,5 +55,3 @@ class FileProfile :
except IOError:
raise Warning("Error when opening file %s" % FileName)
-
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index 9b8b96aa4b43..545a9c3b20b4 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -11,12 +11,13 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
from CommonDataClass.DataClass import *
-from EccToolError import *
+from .EccToolError import *
from Common.MultipleWorkspace import MultipleWorkspace as mws
-import EccGlobalData
+from . import EccGlobalData
import re
## Get the inlcude path list for a source file
#
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 605a1d847c61..83676422692b 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -14,6 +14,7 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
@@ -32,7 +33,7 @@ from Common.Misc import GuidStructureStringToGuidString, CheckPcdDatum, PathClas
from Common.Expression import *
from CommonDataClass.Exceptions import *
-from MetaFileTable import MetaFileStorage
+from .MetaFileTable import MetaFileStorage
from GenFds.FdfParser import FdfParser
from Common.LongFilePathSupport import OpenLongFilePath as open
from Common.LongFilePathSupport import CodecOpenLongFilePath
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
index 54a3016948b1..f70a2f238334 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
@@ -14,13 +14,14 @@
##
# Import Modules
#
+from __future__ import absolute_import
import uuid
import Common.EdkLogger as EdkLogger
import EccGlobalData
-from MetaDataTable import Table
-from MetaDataTable import ConvertToSqlString
+from .MetaDataTable import Table
+from .MetaDataTable import ConvertToSqlString
from CommonDataClass.DataClass import MODEL_FILE_DSC, MODEL_FILE_DEC, MODEL_FILE_INF, \
MODEL_FILE_OTHERS
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 7f83387c08c8..56eb21ead519 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -12,18 +12,19 @@
#
from __future__ import print_function
+from __future__ import absolute_import
import sys
import Common.LongFilePathOs as os
import re
import string
-import CodeFragmentCollector
-import FileProfile
+from . import CodeFragmentCollector
+from . import FileProfile
from CommonDataClass import DataClass
-import Database
+from . import Database
from Common import EdkLogger
-from EccToolError import *
-import EccGlobalData
-import MetaDataParser
+from .EccToolError import *
+from . import EccGlobalData
+from . import MetaDataParser
IncludeFileListDict = {}
AllIncludeFileListDict = {}
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index bd4f10e1edff..02d70e73cbb8 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -1,4 +1,5 @@
from __future__ import print_function
+from __future__ import absolute_import
# $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
from antlr3 import *
@@ -23,8 +24,8 @@ from antlr3.compat import set, frozenset
#
##
-import CodeFragment
-import FileProfile
+from . import CodeFragment
+from . import FileProfile
diff --git a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
index 5d5336bee463..b92f81ad514d 100644
--- a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
@@ -16,17 +16,18 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import re
import Common.LongFilePathOs as os
import sys
import antlr3
-from CLexer import CLexer
-from CParser import CParser
+from .CLexer import CLexer
+from .CParser import CParser
-import FileProfile
-from CodeFragment import PP_Directive
-from ParserWarning import Warning
+from . import FileProfile
+from .CodeFragment import PP_Directive
+from .ParserWarning import Warning
##define T_CHAR_SPACE ' '
diff --git a/BaseTools/Source/Python/Eot/Eot.py b/BaseTools/Source/Python/Eot/Eot.py
index 5029f7369d4a..1905b06aefab 100644
--- a/BaseTools/Source/Python/Eot/Eot.py
+++ b/BaseTools/Source/Python/Eot/Eot.py
@@ -14,22 +14,23 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.LongFilePathOs as os, time, glob
import Common.EdkLogger as EdkLogger
-import EotGlobalData
+from . import EotGlobalData
from optparse import OptionParser
from Common.String import NormPath
from Common import BuildToolError
from Common.Misc import GuidStructureStringToGuidString
-from InfParserLite import *
-import c
-import Database
-from FvImage import *
+from .InfParserLite import *
+from . import c
+from . import Database
+from .FvImage import *
from array import array
-from Report import Report
+from .Report import Report
from Common.Misc import ParseConsoleLog
from Common.BuildVersion import gBUILD_VERSION
-from Parser import ConvertGuid
+from .Parser import ConvertGuid
from Common.LongFilePathSupport import OpenLongFilePath as open
## Class Eot
diff --git a/BaseTools/Source/Python/Eot/FileProfile.py b/BaseTools/Source/Python/Eot/FileProfile.py
index 0544c0d55b44..3846279cad4c 100644
--- a/BaseTools/Source/Python/Eot/FileProfile.py
+++ b/BaseTools/Source/Python/Eot/FileProfile.py
@@ -16,9 +16,10 @@
# Import Modules
#
+from __future__ import absolute_import
import re
import Common.LongFilePathOs as os
-from ParserWarning import Warning
+from .ParserWarning import Warning
from Common.LongFilePathSupport import OpenLongFilePath as open
# Profile contents of a file
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 0a1eca1ed86f..085698557efe 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -14,6 +14,7 @@
## Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
@@ -31,7 +32,7 @@ from Common.Misc import sdict, GuidStructureStringToGuidString
import Common.EdkLogger as EdkLogger
-import EotGlobalData
+from . import EotGlobalData
# Global definiton
gFfsPrintTitle = "%-36s %-21s %8s %8s %8s %-4s %-36s" % ("GUID", "TYPE", "OFFSET", "SIZE", "FREE", "ALIGN", "NAME")
@@ -557,7 +558,7 @@ class CompressedImage(Image):
def _GetSections(m):
try:
- import EfiCompressor
+ from . import EfiCompressor
TmpData = EfiCompressor.FrameworkDecompress(
m[m._HEADER_SIZE_:],
len(m) - m._HEADER_SIZE_
@@ -565,7 +566,7 @@ class CompressedImage(Image):
DecData = array('B')
DecData.fromstring(TmpData)
except:
- import EfiCompressor
+ from . import EfiCompressor
TmpData = EfiCompressor.UefiDecompress(
m[m._HEADER_SIZE_:],
len(m) - m._HEADER_SIZE_
@@ -665,7 +666,7 @@ class GuidDefinedImage(Image):
SectionList.append(Sec)
elif Guid == m.TIANO_COMPRESS_GUID:
try:
- import EfiCompressor
+ from . import EfiCompressor
# skip the header
Offset = m.DataOffset - 4
TmpData = EfiCompressor.FrameworkDecompress(m[Offset:], len(m)-Offset)
@@ -686,7 +687,7 @@ class GuidDefinedImage(Image):
pass
elif Guid == m.LZMA_COMPRESS_GUID:
try:
- import LzmaCompressor
+ from . import LzmaCompressor
# skip the header
Offset = m.DataOffset - 4
TmpData = LzmaCompressor.LzmaDecompress(m[Offset:], len(m)-Offset)
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index 4bdd60a6f71c..4080dc852008 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -15,6 +15,7 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import Common.EdkLogger as EdkLogger
@@ -22,8 +23,8 @@ from Common.DataType import *
from CommonDataClass.DataClass import *
from Common.Identification import *
from Common.String import *
-from Parser import *
-import Database
+from .Parser import *
+from . import Database
## EdkInfParser() class
#
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index 951fe7e3be2e..8703e2d55552 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -15,12 +15,13 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.LongFilePathOs as os, re
import Common.EdkLogger as EdkLogger
from Common.DataType import *
from CommonDataClass.DataClass import *
from Common.String import CleanString, GetSplitValueList, ReplaceMacro
-import EotGlobalData
+from . import EotGlobalData
from Common.Misc import sdict
from Common.String import GetSplitList
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 386e3eb8ec05..6440ede6bce3 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -1,3 +1,4 @@
+from __future__ import absolute_import
## @file
# This file is used to create report for Eot tool
#
@@ -15,7 +16,7 @@
# Import Modules
#
import Common.LongFilePathOs as os
-import EotGlobalData
+from . import EotGlobalData
from Common.LongFilePathSupport import OpenLongFilePath as open
## Report() class
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index ceefc952237f..4f0b58a52c79 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -16,15 +16,16 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import sys
import Common.LongFilePathOs as os
import re
-import CodeFragmentCollector
-import FileProfile
+from . import CodeFragmentCollector
+from . import FileProfile
from CommonDataClass import DataClass
from Common import EdkLogger
-from EotToolError import *
-import EotGlobalData
+from .EotToolError import *
+from . import EotGlobalData
# Global Dicts
IncludeFileListDict = {}
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 65919270af15..983c89e82970 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -15,12 +15,13 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
from struct import *
import Common.LongFilePathOs as os
from io import BytesIO
-import FfsFileStatement
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from . import FfsFileStatement
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import AprioriSectionClassObject
from Common.String import *
from Common.Misc import SaveFileOnChange, PathClass
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index 60019195df27..fb929065634e 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -15,17 +15,16 @@
##
# Import Modules
#
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from __future__ import absolute_import
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import CapsuleClassObject
import Common.LongFilePathOs as os
import subprocess
from io import BytesIO
from Common.Misc import SaveFileOnChange
-from GenFds import GenFds
from Common.Misc import PackRegistryFormatGuid
import uuid
from struct import pack
-from GenFds import FindExtendTool
from Common import EdkLogger
from Common.BuildToolError import *
@@ -57,6 +56,7 @@ class Capsule (CapsuleClassObject) :
# @retval string Generated Capsule file path
#
def GenFmpCapsule(self):
+ from .GenFds import FindExtendTool
#
# Generate capsule header
# typedef struct {
@@ -201,6 +201,7 @@ class Capsule (CapsuleClassObject) :
# @retval string Generated Capsule file path
#
def GenCapsule(self):
+ from .GenFds import GenFds
if self.UiCapsuleName.upper() + 'cap' in GenFds.ImageBinDict.keys():
return GenFds.ImageBinDict[self.UiCapsuleName.upper() + 'cap']
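The Capsule.py hunks above also show the other recurring pattern in
this series: the module-level "from GenFds import GenFds" and
"from GenFds import FindExtendTool" are replaced by imports inside
GenFmpCapsule()/GenCapsule(), deferring them until the functions run
and thereby breaking the circular import chain GenFds -> FdfParser ->
Capsule -> GenFds. A minimal sketch of the same pattern with
hypothetical module names (not the actual BaseTools modules):

    # pkg/gen.py
    from __future__ import absolute_import
    from . import cap            # gen needs cap at import time

    def find_tool():
        return "tool"

    # pkg/cap.py
    from __future__ import absolute_import

    def generate():
        # deferred import: avoids the gen <-> cap cycle at import time
        from .gen import find_tool
        return find_tool()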
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index f0a55d81120b..d7208deea1e3 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -15,8 +15,9 @@
##
# Import Modules
#
-import Ffs
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from __future__ import absolute_import
+from . import Ffs
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from io import BytesIO
from struct import pack
import os
diff --git a/BaseTools/Source/Python/GenFds/CompressSection.py b/BaseTools/Source/Python/GenFds/CompressSection.py
index 64ad275d832e..ae4866661de6 100644
--- a/BaseTools/Source/Python/GenFds/CompressSection.py
+++ b/BaseTools/Source/Python/GenFds/CompressSection.py
@@ -15,11 +15,12 @@
##
# Import Modules
#
-from Ffs import Ffs
-import Section
+from __future__ import absolute_import
+from .Ffs import Ffs
+from . import Section
import subprocess
import Common.LongFilePathOs as os
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import CompressSectionClassObject
## generate compress section
diff --git a/BaseTools/Source/Python/GenFds/DataSection.py b/BaseTools/Source/Python/GenFds/DataSection.py
index 2d2975f75c0f..cf0d27d83e08 100644
--- a/BaseTools/Source/Python/GenFds/DataSection.py
+++ b/BaseTools/Source/Python/GenFds/DataSection.py
@@ -15,10 +15,11 @@
##
# Import Modules
#
-import Section
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from __future__ import absolute_import
+from . import Section
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
import subprocess
-from Ffs import Ffs
+from .Ffs import Ffs
import Common.LongFilePathOs as os
from CommonDataClass.FdfClass import DataSectionClassObject
from Common.Misc import PeImageClass
diff --git a/BaseTools/Source/Python/GenFds/DepexSection.py b/BaseTools/Source/Python/GenFds/DepexSection.py
index 1992d2abd807..d67321473268 100644
--- a/BaseTools/Source/Python/GenFds/DepexSection.py
+++ b/BaseTools/Source/Python/GenFds/DepexSection.py
@@ -15,10 +15,11 @@
##
# Import Modules
#
-import Section
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from __future__ import absolute_import
+from . import Section
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
import subprocess
-from Ffs import Ffs
+from .Ffs import Ffs
import Common.LongFilePathOs as os
from CommonDataClass.FdfClass import DepexSectionClassObject
from AutoGen.GenDepex import DependencyExpression
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index d24df30cb734..7bce56ed55db 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -15,11 +15,12 @@
##
# Import Modules
#
+from __future__ import absolute_import
from struct import *
-import Section
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from . import Section
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
import subprocess
-from Ffs import Ffs
+from .Ffs import Ffs
import Common.LongFilePathOs as os
from CommonDataClass.FdfClass import EfiSectionClassObject
from Common import EdkLogger
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index acd73f6449f6..86a0d9a47bfc 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -15,18 +15,18 @@
##
# Import Modules
#
-import Region
-import Fv
+from __future__ import absolute_import
+from . import Region
+from . import Fv
import Common.LongFilePathOs as os
from io import BytesIO
import sys
from struct import *
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import FDClassObject
from Common import EdkLogger
from Common.BuildToolError import *
from Common.Misc import SaveFileOnChange
-from GenFds import GenFds
## generate FD
#
@@ -46,6 +46,7 @@ class FD(FDClassObject):
# @retval string Generated FD file name
#
def GenFd (self, Flag = False):
+ from .GenFds import GenFds
if self.FdUiName.upper() + 'fd' in GenFds.ImageBinDict.keys():
return GenFds.ImageBinDict[self.FdUiName.upper() + 'fd']
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 43f849b07172..4e07d04c08df 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -17,34 +17,35 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import re
-import Fd
-import Region
-import Fv
-import AprioriSection
-import FfsInfStatement
-import FfsFileStatement
-import VerSection
-import UiSection
-import FvImageSection
-import DataSection
-import DepexSection
-import CompressSection
-import GuidSection
-import Capsule
-import CapsuleData
-import Rule
-import RuleComplexFile
-import RuleSimpleFile
-import EfiSection
-import Vtf
-import ComponentStatement
-import OptionRom
-import OptRomInfStatement
-import OptRomFileStatement
+from . import Fd
+from . import Region
+from . import Fv
+from . import AprioriSection
+from . import FfsInfStatement
+from . import FfsFileStatement
+from . import VerSection
+from . import UiSection
+from . import FvImageSection
+from . import DataSection
+from . import DepexSection
+from . import CompressSection
+from . import GuidSection
+from . import Capsule
+from . import CapsuleData
+from . import Rule
+from . import RuleComplexFile
+from . import RuleSimpleFile
+from . import EfiSection
+from . import Vtf
+from . import ComponentStatement
+from . import OptionRom
+from . import OptRomInfStatement
+from . import OptRomFileStatement
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from Common.BuildToolError import *
from Common import EdkLogger
from Common.Misc import PathClass
@@ -58,8 +59,8 @@ from Common.Misc import tdict
from Common.MultipleWorkspace import MultipleWorkspace as mws
import Common.LongFilePathOs as os
from Common.LongFilePathSupport import OpenLongFilePath as open
-from Capsule import EFI_CERT_TYPE_PKCS7_GUID
-from Capsule import EFI_CERT_TYPE_RSA2048_SHA256_GUID
+from .Capsule import EFI_CERT_TYPE_PKCS7_GUID
+from .Capsule import EFI_CERT_TYPE_RSA2048_SHA256_GUID
##define T_CHAR_SPACE ' '
##define T_CHAR_NULL '\0'
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 1293c8a107f0..1b9b4deef9ae 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -15,20 +15,21 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
-import Ffs
-import Rule
+from . import Ffs
+from . import Rule
import Common.LongFilePathOs as os
from io import BytesIO
import subprocess
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import FileStatementClassObject
from Common import EdkLogger
from Common.BuildToolError import *
from Common.Misc import GuidStructureByteArrayToGuidString
-from GuidSection import GuidSection
-from FvImageSection import FvImageSection
+from .GuidSection import GuidSection
+from .FvImageSection import FvImageSection
from Common.Misc import SaveFileOnChange
from struct import *
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index d6edd1f0971e..d1f3604a7fc9 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -16,17 +16,18 @@
##
# Import Modules
#
-import Rule
+from __future__ import absolute_import
+from . import Rule
import Common.LongFilePathOs as os
from io import BytesIO
from struct import *
-from GenFdsGlobalVariable import GenFdsGlobalVariable
-import Ffs
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
+from . import Ffs
import subprocess
import sys
-import Section
-import RuleSimpleFile
-import RuleComplexFile
+from . import Section
+from . import RuleSimpleFile
+from . import RuleComplexFile
from CommonDataClass.FdfClass import FfsInfStatementClassObject
from Common.MultipleWorkspace import MultipleWorkspace as mws
from Common.String import *
@@ -36,15 +37,15 @@ from Common.Misc import ProcessDuplicatedInf
from Common.Misc import GetVariableOffset
from Common import EdkLogger
from Common.BuildToolError import *
-from GuidSection import GuidSection
-from FvImageSection import FvImageSection
+from .GuidSection import GuidSection
+from .FvImageSection import FvImageSection
from Common.Misc import PeImageClass
from AutoGen.GenDepex import DependencyExpression
from PatchPcdValue.PatchPcdValue import PatchBinaryFile
from Common.LongFilePathSupport import CopyLongFilePath
from Common.LongFilePathSupport import OpenLongFilePath as open
import Common.GlobalData as GlobalData
-from DepexSection import DepexSection
+from .DepexSection import DepexSection
from Common.Misc import SaveFileOnChange
## generate FFS from INF
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 88a520998eae..bff661e43e7e 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -15,17 +15,17 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import subprocess
from io import BytesIO
from struct import *
-import Ffs
-import AprioriSection
-import FfsFileStatement
-from GenFdsGlobalVariable import GenFdsGlobalVariable
-from GenFds import GenFds
+from . import Ffs
+from . import AprioriSection
+from . import FfsFileStatement
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import FvClassObject
from Common.Misc import SaveFileOnChange
from Common.LongFilePathSupport import CopyLongFilePath
@@ -70,7 +70,7 @@ class FV (FvClassObject):
# @retval string Generated FV file path
#
def AddToBuffer (self, Buffer, BaseAddress=None, BlockSize= None, BlockNum=None, ErasePloarity='1', VtfDict=None, MacroDict = {}, Flag=False) :
-
+ from .GenFds import GenFds
if BaseAddress == None and self.UiFvName.upper() + 'fv' in GenFds.ImageBinDict.keys():
return GenFds.ImageBinDict[self.UiFvName.upper() + 'fv']
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 7416ce1b7d8a..2352e8962e94 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -15,11 +15,12 @@
##
# Import Modules
#
-import Section
+from __future__ import absolute_import
+from . import Section
from io import BytesIO
-from Ffs import Ffs
+from .Ffs import Ffs
import subprocess
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
import Common.LongFilePathOs as os
from CommonDataClass.FdfClass import FvImageSectionClassObject
from Common import EdkLogger
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index ebebcd7980e4..7c1df37778fb 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -16,19 +16,20 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
from optparse import OptionParser
import sys
import Common.LongFilePathOs as os
import linecache
-import FdfParser
+from . import FdfParser
import Common.BuildToolError as BuildToolError
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from Workspace.WorkspaceDatabase import WorkspaceDatabase
from Workspace.BuildClassObject import PcdClassObject
from Workspace.BuildClassObject import ModuleBuildClassObject
-import RuleComplexFile
-from EfiSection import EfiSection
+from . import RuleComplexFile
+from .EfiSection import EfiSection
from io import BytesIO
import Common.TargetTxtClassObject as TargetTxtClassObject
import Common.ToolDefClassObject as ToolDefClassObject
@@ -44,7 +45,7 @@ from Common.Misc import CheckPcdDatum
from Common.Misc import BuildOptionPcdValueFormat
from Common.BuildVersion import gBUILD_VERSION
from Common.MultipleWorkspace import MultipleWorkspace as mws
-import FfsFileStatement
+from . import FfsFileStatement
import glob
from struct import unpack
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index ea737bb9a7ea..877dabc196e1 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -15,19 +15,19 @@
##
# Import Modules
#
-import Section
+from __future__ import absolute_import
+from . import Section
import subprocess
-from Ffs import Ffs
+from .Ffs import Ffs
import Common.LongFilePathOs as os
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import GuidSectionClassObject
from Common import ToolDefClassObject
import sys
from Common import EdkLogger
from Common.BuildToolError import *
-from FvImageSection import FvImageSection
+from .FvImageSection import FvImageSection
from Common.LongFilePathSupport import OpenLongFilePath as open
-from GenFds import FindExtendTool
## generate GUIDed section
#
@@ -129,6 +129,7 @@ class GuidSection(GuidSectionClassObject) :
ExternalTool = None
ExternalOption = None
if self.NameGuid != None:
+ from .GenFds import FindExtendTool
ExternalTool, ExternalOption = FindExtendTool(self.KeyStringList, self.CurrentArchList, self.NameGuid)
#
diff --git a/BaseTools/Source/Python/GenFds/OptRomFileStatement.py b/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
index ab4fae611e33..f33d6606ca96 100644
--- a/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
@@ -15,9 +15,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.LongFilePathOs as os
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
##
#
#
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 80c4bbab6eff..5de0cac2a374 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -15,16 +15,16 @@
##
# Import Modules
#
-import RuleSimpleFile
-import RuleComplexFile
-import Section
-import OptionRom
+from __future__ import absolute_import
+from . import RuleSimpleFile
+from . import RuleComplexFile
+from . import Section
import Common.GlobalData as GlobalData
from Common.DataType import *
from Common.String import *
-from FfsInfStatement import FfsInfStatement
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .FfsInfStatement import FfsInfStatement
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
##
#
@@ -45,7 +45,7 @@ class OptRomInfStatement (FfsInfStatement):
# @param self The object pointer
#
def __GetOptRomParams(self):
-
+ from . import OptionRom
if self.OverrideAttribs == None:
self.OverrideAttribs = OptionRom.OverrideAttribs()
@@ -151,5 +151,3 @@ class OptRomInfStatement (FfsInfStatement):
OutputFileList.extend(FileList)
return OutputFileList
-
-
\ No newline at end of file
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index 946cdf812a24..a7a63e80845c 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -15,12 +15,12 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.LongFilePathOs as os
import subprocess
-import OptRomInfStatement
-from GenFdsGlobalVariable import GenFdsGlobalVariable
-from GenFds import GenFds
+from . import OptRomInfStatement
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import OptionRomClassObject
from Common.Misc import SaveFileOnChange
from Common import EdkLogger
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 6ace73abe904..0debd579bec5 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -15,9 +15,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
from builtins import range
from struct import *
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from io import BytesIO
import string
from CommonDataClass.FdfClass import RegionClassObject
diff --git a/BaseTools/Source/Python/GenFds/RuleComplexFile.py b/BaseTools/Source/Python/GenFds/RuleComplexFile.py
index 36c483fbb207..c357fedbd3be 100644
--- a/BaseTools/Source/Python/GenFds/RuleComplexFile.py
+++ b/BaseTools/Source/Python/GenFds/RuleComplexFile.py
@@ -15,7 +15,8 @@
##
# Import Modules
#
-import Rule
+from __future__ import absolute_import
+from . import Rule
from CommonDataClass.FdfClass import RuleComplexFileClassObject
## complex rule
diff --git a/BaseTools/Source/Python/GenFds/RuleSimpleFile.py b/BaseTools/Source/Python/GenFds/RuleSimpleFile.py
index 061f984e6af4..7aa184e7d8bb 100644
--- a/BaseTools/Source/Python/GenFds/RuleSimpleFile.py
+++ b/BaseTools/Source/Python/GenFds/RuleSimpleFile.py
@@ -15,7 +15,8 @@
##
# Import Modules
#
-import Rule
+from __future__ import absolute_import
+from . import Rule
from CommonDataClass.FdfClass import RuleSimpleFileClassObject
## simple rule
diff --git a/BaseTools/Source/Python/GenFds/Section.py b/BaseTools/Source/Python/GenFds/Section.py
index 463faa378165..4650afe27114 100644
--- a/BaseTools/Source/Python/GenFds/Section.py
+++ b/BaseTools/Source/Python/GenFds/Section.py
@@ -15,8 +15,9 @@
##
# Import Modules
#
+from __future__ import absolute_import
from CommonDataClass.FdfClass import SectionClassObject
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
import Common.LongFilePathOs as os, glob
from Common import EdkLogger
from Common.BuildToolError import *
diff --git a/BaseTools/Source/Python/GenFds/UiSection.py b/BaseTools/Source/Python/GenFds/UiSection.py
index 4f6926f7cae4..a92bebc65cf8 100644
--- a/BaseTools/Source/Python/GenFds/UiSection.py
+++ b/BaseTools/Source/Python/GenFds/UiSection.py
@@ -15,11 +15,12 @@
##
# Import Modules
#
-import Section
-from Ffs import Ffs
+from __future__ import absolute_import
+from . import Section
+from .Ffs import Ffs
import subprocess
import Common.LongFilePathOs as os
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import UiSectionClassObject
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/GenFds/VerSection.py b/BaseTools/Source/Python/GenFds/VerSection.py
index e29029980fad..c8186fe7477e 100644
--- a/BaseTools/Source/Python/GenFds/VerSection.py
+++ b/BaseTools/Source/Python/GenFds/VerSection.py
@@ -15,11 +15,12 @@
##
# Import Modules
#
-from Ffs import Ffs
-import Section
+from __future__ import absolute_import
+from .Ffs import Ffs
+from . import Section
import Common.LongFilePathOs as os
import subprocess
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
from CommonDataClass.FdfClass import VerSectionClassObject
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/GenFds/Vtf.py b/BaseTools/Source/Python/GenFds/Vtf.py
index 06e3d275c381..a0898b2f639b 100644
--- a/BaseTools/Source/Python/GenFds/Vtf.py
+++ b/BaseTools/Source/Python/GenFds/Vtf.py
@@ -15,7 +15,8 @@
##
# Import Modules
#
-from GenFdsGlobalVariable import GenFdsGlobalVariable
+from __future__ import absolute_import
+from .GenFdsGlobalVariable import GenFdsGlobalVariable
import Common.LongFilePathOs as os
from CommonDataClass.FdfClass import VtfClassObject
from Common.LongFilePathSupport import OpenLongFilePath as open
diff --git a/BaseTools/Source/Python/Table/TableDataModel.py b/BaseTools/Source/Python/Table/TableDataModel.py
index 9c3d7bd9345f..74df04a8200a 100644
--- a/BaseTools/Source/Python/Table/TableDataModel.py
+++ b/BaseTools/Source/Python/Table/TableDataModel.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import CommonDataClass.DataClass as DataClass
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TableDataModel
diff --git a/BaseTools/Source/Python/Table/TableDec.py b/BaseTools/Source/Python/Table/TableDec.py
index 6b7d22c9384c..b33230f437b6 100644
--- a/BaseTools/Source/Python/Table/TableDec.py
+++ b/BaseTools/Source/Python/Table/TableDec.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import CommonDataClass.DataClass as DataClass
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TableDec
diff --git a/BaseTools/Source/Python/Table/TableDsc.py b/BaseTools/Source/Python/Table/TableDsc.py
index 69477d544d8e..c8d6e4b3dc15 100644
--- a/BaseTools/Source/Python/Table/TableDsc.py
+++ b/BaseTools/Source/Python/Table/TableDsc.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import CommonDataClass.DataClass as DataClass
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TableDsc
diff --git a/BaseTools/Source/Python/Table/TableEotReport.py b/BaseTools/Source/Python/Table/TableEotReport.py
index 740105c8f99d..a32779524fff 100644
--- a/BaseTools/Source/Python/Table/TableEotReport.py
+++ b/BaseTools/Source/Python/Table/TableEotReport.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import Common.LongFilePathOs as os, time
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString2
import Eot.EotToolError as EotToolError
import Eot.EotGlobalData as EotGlobalData
@@ -73,4 +74,4 @@ class TableEotReport(Table):
SqlCommand = """select max(ID) from %s""" % self.Table
self.Cur.execute(SqlCommand)
for Item in self.Cur:
- return Item[0]
\ No newline at end of file
+ return Item[0]
diff --git a/BaseTools/Source/Python/Table/TableFdf.py b/BaseTools/Source/Python/Table/TableFdf.py
index 927b5d1a3be6..b15c5c21201c 100644
--- a/BaseTools/Source/Python/Table/TableFdf.py
+++ b/BaseTools/Source/Python/Table/TableFdf.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import CommonDataClass.DataClass as DataClass
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TableFdf
diff --git a/BaseTools/Source/Python/Table/TableFile.py b/BaseTools/Source/Python/Table/TableFile.py
index caf749e9d3c5..57e1fc0bcece 100644
--- a/BaseTools/Source/Python/Table/TableFile.py
+++ b/BaseTools/Source/Python/Table/TableFile.py
@@ -14,8 +14,9 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
import Common.LongFilePathOs as os
from CommonDataClass.DataClass import FileClass
diff --git a/BaseTools/Source/Python/Table/TableFunction.py b/BaseTools/Source/Python/Table/TableFunction.py
index 3d7c2d0ea5a0..97f360cec59e 100644
--- a/BaseTools/Source/Python/Table/TableFunction.py
+++ b/BaseTools/Source/Python/Table/TableFunction.py
@@ -14,8 +14,9 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TableFunction
diff --git a/BaseTools/Source/Python/Table/TableIdentifier.py b/BaseTools/Source/Python/Table/TableIdentifier.py
index bcd6d6e1c152..b7b4d9018c0e 100644
--- a/BaseTools/Source/Python/Table/TableIdentifier.py
+++ b/BaseTools/Source/Python/Table/TableIdentifier.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
from Common.String import ConvertToSqlString
-from Table import Table
+from .Table import Table
## TableIdentifier
#
@@ -87,4 +88,4 @@ class TableIdentifier(Table):
% (self.Table, self.ID, Modifier, Type, Name, Value, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn)
Table.Insert(self, SqlCommand)
- return self.ID
\ No newline at end of file
+ return self.ID
diff --git a/BaseTools/Source/Python/Table/TableInf.py b/BaseTools/Source/Python/Table/TableInf.py
index b6e300b150c1..424c2663ca0f 100644
--- a/BaseTools/Source/Python/Table/TableInf.py
+++ b/BaseTools/Source/Python/Table/TableInf.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import CommonDataClass.DataClass as DataClass
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TableInf
diff --git a/BaseTools/Source/Python/Table/TablePcd.py b/BaseTools/Source/Python/Table/TablePcd.py
index 19623f98f42c..01011062cee4 100644
--- a/BaseTools/Source/Python/Table/TablePcd.py
+++ b/BaseTools/Source/Python/Table/TablePcd.py
@@ -14,8 +14,9 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString
## TablePcd
@@ -87,4 +88,4 @@ class TablePcd(Table):
% (self.Table, self.ID, CName, TokenSpaceGuidCName, Token, DatumType, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn)
Table.Insert(self, SqlCommand)
- return self.ID
\ No newline at end of file
+ return self.ID
diff --git a/BaseTools/Source/Python/Table/TableQuery.py b/BaseTools/Source/Python/Table/TableQuery.py
index e1d2537394b2..205602d05d6c 100644
--- a/BaseTools/Source/Python/Table/TableQuery.py
+++ b/BaseTools/Source/Python/Table/TableQuery.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
from Common.String import ConvertToSqlString
-from Table import Table
+from .Table import Table
## TableQuery
#
diff --git a/BaseTools/Source/Python/Table/TableReport.py b/BaseTools/Source/Python/Table/TableReport.py
index 4af0e98d86b4..40afa15c08c5 100644
--- a/BaseTools/Source/Python/Table/TableReport.py
+++ b/BaseTools/Source/Python/Table/TableReport.py
@@ -14,9 +14,10 @@
##
# Import Modules
#
+from __future__ import absolute_import
import Common.EdkLogger as EdkLogger
import Common.LongFilePathOs as os, time
-from Table import Table
+from .Table import Table
from Common.String import ConvertToSqlString2
import EccToolError as EccToolError
import EccGlobalData as EccGlobalData
diff --git a/BaseTools/Source/Python/UPT/Library/Parsing.py b/BaseTools/Source/Python/UPT/Library/Parsing.py
index bac664506f4d..dbbfff9e9462 100644
--- a/BaseTools/Source/Python/UPT/Library/Parsing.py
+++ b/BaseTools/Source/Python/UPT/Library/Parsing.py
@@ -16,6 +16,7 @@
'''
Parsing
'''
+from __future__ import absolute_import
##
# Import Modules
@@ -43,7 +44,7 @@ from Logger import StringTable as ST
import Logger.Log as Logger
from Parser.DecParser import Dec
-import GlobalData
+from . import GlobalData
gPKG_INFO_DICT = {}
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 8551a0d8b7e7..4b1c8a257db6 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -18,6 +18,7 @@
# into PlatformBuildClassObject form for easier use for AutoGen.
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
from Common.String import *
from Common.DataType import *
@@ -27,11 +28,11 @@ from types import *
from CommonDataClass.CommonClass import SkuInfoClass
from Common.TargetTxtClassObject import *
from Common.ToolDefClassObject import *
-from MetaDataTable import *
-from MetaFileTable import *
-from MetaFileParser import *
+from .MetaDataTable import *
+from .MetaFileTable import *
+from .MetaFileParser import *
-from WorkspaceCommon import GetDeclaredPcd
+from .WorkspaceCommon import GetDeclaredPcd
from Common.Misc import AnalyzeDscPcd
from Common.Misc import ProcessDuplicatedInf
import re
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 9fc2e681b73d..85b5fefe26dd 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -12,12 +12,13 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
from builtins import range
from Common.String import *
from Common.DataType import *
from Common.Misc import *
from types import *
-from MetaFileParser import *
+from .MetaFileParser import *
from Workspace.BuildClassObject import ModuleBuildClassObject, LibraryClassObject, PcdClassObject
## Module build information from INF file
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index b96d027cb19e..f871ab717c4b 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -16,6 +16,7 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
from builtins import range
import Common.LongFilePathOs as os
import re
@@ -34,8 +35,8 @@ from Common.Expression import *
from CommonDataClass.Exceptions import *
from Common.LongFilePathSupport import OpenLongFilePath as open
-from MetaFileTable import MetaFileStorage
-from MetaFileCommentParser import CheckInfComment
+from .MetaFileTable import MetaFileStorage
+from .MetaFileCommentParser import CheckInfComment
## A decorator used to parse macro definition
def ParseMacro(Parser):
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index 9416065b284f..6cf4c023e246 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -14,13 +14,14 @@
##
# Import Modules
#
+from __future__ import absolute_import
import uuid
import Common.EdkLogger as EdkLogger
from Common.BuildToolError import FORMAT_INVALID
-from MetaDataTable import Table, TableFile
-from MetaDataTable import ConvertToSqlString
+from .MetaDataTable import Table, TableFile
+from .MetaDataTable import ConvertToSqlString
from CommonDataClass.DataClass import MODEL_FILE_DSC, MODEL_FILE_DEC, MODEL_FILE_INF, \
MODEL_FILE_OTHERS
from Common.DataType import *
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index 6b5e0edb0a4d..fd7a3617668c 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -11,9 +11,10 @@
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
+from __future__ import absolute_import
from Common.Misc import sdict
from Common.DataType import SUP_MODULE_USER_DEFINED
-from BuildClassObject import LibraryClassObject
+from .BuildClassObject import LibraryClassObject
import Common.GlobalData as GlobalData
from Workspace.BuildClassObject import StructurePcd
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
index a3407d113e0f..052d8739bccb 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
@@ -15,15 +15,16 @@
##
# Import Modules
#
+from __future__ import absolute_import
import sqlite3
from Common.String import *
from Common.DataType import *
from Common.Misc import *
from types import *
-from MetaDataTable import *
-from MetaFileTable import *
-from MetaFileParser import *
+from .MetaDataTable import *
+from .MetaFileTable import *
+from .MetaFileParser import *
from Workspace.DecBuildData import DecBuildData
from Workspace.DscBuildData import DscBuildData
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 66b46fab5c26..22feae2964cf 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -17,6 +17,7 @@
# Import Modules
#
from __future__ import print_function
+from __future__ import absolute_import
import Common.LongFilePathOs as os
import re
from io import BytesIO
@@ -46,7 +47,7 @@ from Common.BuildToolError import *
from Workspace.WorkspaceDatabase import *
from Common.MultipleWorkspace import MultipleWorkspace as mws
-from BuildReport import BuildReport
+from .BuildReport import BuildReport
from GenPatchPcdTable.GenPatchPcdTable import *
from PatchPcdValue.PatchPcdValue import *
--
2.16.1
* [PATCH v2 18/20] BaseTools: Move OverrideAttribs to OptRomInfStatement.py
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (16 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 17/20] BaseTools: Adopt absolute import for python scripts Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 19/20] BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py Gary Lin
` (2 subsequent siblings)
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Move "class OverrideAttribs" to OptRomInfStatement.py to remove
"import OptionRom" which may form a circular import:
GenFds.OptionRom => GenFds.OptRomInfStatement => GenFds.OptionRom
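A minimal, self-contained sketch of the pattern (hypothetical, trimmed
names; not the actual BaseTools sources) looks like this: once the helper
class lives next to its only consumer, the deferred "from . import
OptionRom" can simply go away.
# Before: OptRomInfStatement.py did "from . import OptionRom" just to reach
# OverrideAttribs, while OptionRom.py already imported OptRomInfStatement,
# closing the cycle OptionRom -> OptRomInfStatement -> OptionRom.
# After: the class sits beside its consumer, so only the one-way import
# from OptionRom.py remains.
class OverrideAttribs:
    # Hypothetical, trimmed set of the PCI override fields.
    def __init__(self):
        self.PciVendorId = None
        self.PciDeviceId = None
        self.NeedCompress = None

class OptRomInfStatement:
    def __init__(self):
        self.OverrideAttribs = None

    def GetOptRomParams(self):
        # The deferred "from . import OptionRom" is no longer needed here.
        if self.OverrideAttribs is None:
            self.OverrideAttribs = OverrideAttribs()
        return self.OverrideAttribs

if __name__ == '__main__':
    Stmt = OptRomInfStatement()
    print(Stmt.GetOptRomParams().__class__.__name__)  # -> OverrideAttribs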
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/GenFds/FdfParser.py | 2 +-
BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 18 ++++++++++++++++--
BaseTools/Source/Python/GenFds/OptionRom.py | 14 --------------
3 files changed, 17 insertions(+), 17 deletions(-)
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 4e07d04c08df..52e4727e38a6 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -4481,7 +4481,7 @@ class FdfParser:
#
def __GetOptRomOverrides(self, Obj):
if self.__IsToken('{'):
- Overrides = OptionRom.OverrideAttribs()
+ Overrides = OptRomInfStatement.OverrideAttribs()
while True:
if self.__IsKeyword( "PCI_VENDOR_ID"):
if not self.__IsToken( "="):
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 5de0cac2a374..a12230fe21ee 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -45,9 +45,8 @@ class OptRomInfStatement (FfsInfStatement):
# @param self The object pointer
#
def __GetOptRomParams(self):
- from . import OptionRom
if self.OverrideAttribs == None:
- self.OverrideAttribs = OptionRom.OverrideAttribs()
+ self.OverrideAttribs = OverrideAttribs()
if self.OverrideAttribs.NeedCompress == None:
self.OverrideAttribs.NeedCompress = self.OptRomDefs.get ('PCI_COMPRESS')
@@ -151,3 +150,18 @@ class OptRomInfStatement (FfsInfStatement):
OutputFileList.extend(FileList)
return OutputFileList
+
+##
+#
+#
+class OverrideAttribs:
+ ## The constructor
+ #
+ # @param self The object pointer
+ #
+ def __init__(self):
+ self.PciVendorId = None
+ self.PciClassCode = None
+ self.PciDeviceId = None
+ self.PciRevision = None
+ self.NeedCompress = None
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index a7a63e80845c..01846eb01440 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -123,17 +123,3 @@ class OPTIONROM (OptionRomClassObject):
GenFdsGlobalVariable.SharpCounter = 0
return OutputFile
-
-class OverrideAttribs:
-
- ## The constructor
- #
- # @param self The object pointer
- #
- def __init__(self):
-
- self.PciVendorId = None
- self.PciClassCode = None
- self.PciDeviceId = None
- self.PciRevision = None
- self.NeedCompress = None
--
2.16.1
* [PATCH v2 19/20] BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (17 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 18/20] BaseTools: Move OverrideAttribs to OptRomInfStatement.py Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-02-01 8:36 ` [PATCH v2 20/20] BaseTools: Move ImageBinDict " Gary Lin
2018-06-20 6:22 ` [PATCH v2 00/20] BaseTools: One step toward python3 Paolo Bonzini
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Importing "FindExtendTool" from GenFds.GenFds could create the following
circular imports:
* GenFds.FdfParser => GenFds.Capsule => GenFds.GenFds => GenFds.FdfParser
* GenFds.FdfParser => GenFds.Fd => GenFds.Fv => GenFds.AprioriSection =>
GenFds.FfsFileStatement => GenFds.GuidSection => GenFds.GenFds =>
GenFds.FdfParser
This commit moves "FindExtendTool" to GenFdsGlobalVariable.py to break
these cycles.
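Roughly speaking (a hypothetical, heavily trimmed sketch, not the real
call graph), the fix pushes the shared helper down into the module that
every consumer already imports, so all import edges point one way:
# GenFdsGlobalVariable.py (sketch) -- bottom layer; imports nothing from
# the higher-level GenFds/Fd/Fv modules.
class GenFdsGlobalVariable:
    GuidToolDefinition = {}

def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
    # Hypothetical, heavily trimmed lookup: serve cached results first;
    # the real code then consults the tools_def database and build options.
    if NameGuid in GenFdsGlobalVariable.GuidToolDefinition:
        return GenFdsGlobalVariable.GuidToolDefinition[NameGuid]
    ToolPath, ToolOption = None, None
    GenFdsGlobalVariable.GuidToolDefinition[NameGuid] = (ToolPath, ToolOption)
    return ToolPath, ToolOption

# GuidSection.py / Capsule.py (sketch) -- top layer; a plain module-level
# import is now safe because nothing below ever imports them back:
#   from .GenFdsGlobalVariable import GenFdsGlobalVariable, FindExtendTool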
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/GenFds/Capsule.py | 2 +-
BaseTools/Source/Python/GenFds/GenFds.py | 93 -------------------
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 95 +++++++++++++++++++-
BaseTools/Source/Python/GenFds/GuidSection.py | 2 +-
4 files changed, 96 insertions(+), 96 deletions(-)
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index fb929065634e..55e9e1dade2f 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -17,6 +17,7 @@
#
from __future__ import absolute_import
from .GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import FindExtendTool
from CommonDataClass.FdfClass import CapsuleClassObject
import Common.LongFilePathOs as os
import subprocess
@@ -56,7 +57,6 @@ class Capsule (CapsuleClassObject) :
# @retval string Generated Capsule file path
#
def GenFmpCapsule(self):
- from .GenFds import FindExtendTool
#
# Generate capsule header
# typedef struct {
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 7c1df37778fb..953069fc52f7 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -415,99 +415,6 @@ def CheckBuildOptionPcd():
GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, NewValue)
-
-## FindExtendTool()
-#
-# Find location of tools to process data
-#
-# @param KeyStringList Filter for inputs of section generation
-# @param CurrentArchList Arch list
-# @param NameGuid The Guid name
-#
-def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
- ToolDb = ToolDefClassObject.ToolDefDict(GenFdsGlobalVariable.ConfDir).ToolsDefTxtDatabase
- # if user not specify filter, try to deduce it from global data.
- if KeyStringList == None or KeyStringList == []:
- Target = GenFdsGlobalVariable.TargetName
- ToolChain = GenFdsGlobalVariable.ToolChainTag
- if ToolChain not in ToolDb['TOOL_CHAIN_TAG']:
- EdkLogger.error("GenFds", GENFDS_ERROR, "Can not find external tool because tool tag %s is not defined in tools_def.txt!" % ToolChain)
- KeyStringList = [Target + '_' + ToolChain + '_' + CurrentArchList[0]]
- for Arch in CurrentArchList:
- if Target + '_' + ToolChain + '_' + Arch not in KeyStringList:
- KeyStringList.append(Target + '_' + ToolChain + '_' + Arch)
-
- if GenFdsGlobalVariable.GuidToolDefinition:
- if NameGuid in GenFdsGlobalVariable.GuidToolDefinition.keys():
- return GenFdsGlobalVariable.GuidToolDefinition[NameGuid]
-
- ToolDefinition = ToolDefClassObject.ToolDefDict(GenFdsGlobalVariable.ConfDir).ToolsDefTxtDictionary
- ToolPathTmp = None
- ToolOption = None
- ToolPathKey = None
- ToolOptionKey = None
- KeyList = None
- for ToolDef in ToolDefinition.items():
- if NameGuid == ToolDef[1]:
- KeyList = ToolDef[0].split('_')
- Key = KeyList[0] + \
- '_' + \
- KeyList[1] + \
- '_' + \
- KeyList[2]
- if Key in KeyStringList and KeyList[4] == 'GUID':
- ToolPathKey = Key + '_' + KeyList[3] + '_PATH'
- ToolOptionKey = Key + '_' + KeyList[3] + '_FLAGS'
- ToolPath = ToolDefinition.get(ToolPathKey)
- ToolOption = ToolDefinition.get(ToolOptionKey)
- if ToolPathTmp == None:
- ToolPathTmp = ToolPath
- else:
- if ToolPathTmp != ToolPath:
- EdkLogger.error("GenFds", GENFDS_ERROR, "Don't know which tool to use, %s or %s ?" % (ToolPathTmp, ToolPath))
-
- BuildOption = {}
- for Arch in CurrentArchList:
- Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
- # key is (ToolChainFamily, ToolChain, CodeBase)
- for item in Platform.BuildOptions:
- if '_PATH' in item[1] or '_FLAGS' in item[1] or '_GUID' in item[1]:
- if not item[0] or (item[0] and GenFdsGlobalVariable.ToolChainFamily== item[0]):
- if item[1] not in BuildOption:
- BuildOption[item[1]] = Platform.BuildOptions[item]
- if BuildOption:
- ToolList = [TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG, TAB_TOD_DEFINES_TARGET_ARCH]
- for Index in range(2, -1, -1):
- for Key in dict(BuildOption):
- List = Key.split('_')
- if List[Index] == '*':
- for String in ToolDb[ToolList[Index]]:
- if String in [Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]:
- List[Index] = String
- NewKey = '%s_%s_%s_%s_%s' % tuple(List)
- if NewKey not in BuildOption:
- BuildOption[NewKey] = BuildOption[Key]
- continue
- del BuildOption[Key]
- elif List[Index] not in ToolDb[ToolList[Index]]:
- del BuildOption[Key]
- if BuildOption:
- if not KeyList:
- for Op in BuildOption:
- if NameGuid == BuildOption[Op]:
- KeyList = Op.split('_')
- Key = KeyList[0] + '_' + KeyList[1] +'_' + KeyList[2]
- if Key in KeyStringList and KeyList[4] == 'GUID':
- ToolPathKey = Key + '_' + KeyList[3] + '_PATH'
- ToolOptionKey = Key + '_' + KeyList[3] + '_FLAGS'
- if ToolPathKey in BuildOption.keys():
- ToolPathTmp = BuildOption.get(ToolPathKey)
- if ToolOptionKey in BuildOption.keys():
- ToolOption = BuildOption.get(ToolOptionKey)
-
- GenFdsGlobalVariable.GuidToolDefinition[NameGuid] = (ToolPathTmp, ToolOption)
- return ToolPathTmp, ToolOption
-
## Parse command line options
#
# Using standard Python module optparse to parse command line option of this tool.
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index d7fd58c7482f..a534fcb9371a 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -27,8 +27,9 @@ from Common.BuildToolError import *
from Common import EdkLogger
from Common.Misc import SaveFileOnChange
+from Common.DataType import TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG, TAB_TOD_DEFINES_TARGET_ARCH
from Common.TargetTxtClassObject import TargetTxtClassObject
-from Common.ToolDefClassObject import ToolDefClassObject
+from Common.ToolDefClassObject import ToolDefClassObject, ToolDefDict
from AutoGen.BuildEngine import BuildRule
import Common.DataType as DataType
from Common.Misc import PathClass
@@ -845,3 +846,95 @@ class GenFdsGlobalVariable:
DebugLogger = staticmethod(DebugLogger)
MacroExtend = staticmethod (MacroExtend)
GetPcdValue = staticmethod(GetPcdValue)
+
+## FindExtendTool()
+#
+# Find location of tools to process data
+#
+# @param KeyStringList Filter for inputs of section generation
+# @param CurrentArchList Arch list
+# @param NameGuid The Guid name
+#
+def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
+ ToolDb = ToolDefDict(GenFdsGlobalVariable.ConfDir).ToolsDefTxtDatabase
+ # if user not specify filter, try to deduce it from global data.
+ if KeyStringList == None or KeyStringList == []:
+ Target = GenFdsGlobalVariable.TargetName
+ ToolChain = GenFdsGlobalVariable.ToolChainTag
+ if ToolChain not in ToolDb['TOOL_CHAIN_TAG']:
+ EdkLogger.error("GenFds", GENFDS_ERROR, "Can not find external tool because tool tag %s is not defined in tools_def.txt!" % ToolChain)
+ KeyStringList = [Target + '_' + ToolChain + '_' + CurrentArchList[0]]
+ for Arch in CurrentArchList:
+ if Target + '_' + ToolChain + '_' + Arch not in KeyStringList:
+ KeyStringList.append(Target + '_' + ToolChain + '_' + Arch)
+
+ if GenFdsGlobalVariable.GuidToolDefinition:
+ if NameGuid in GenFdsGlobalVariable.GuidToolDefinition.keys():
+ return GenFdsGlobalVariable.GuidToolDefinition[NameGuid]
+
+ ToolDefinition = ToolDefDict(GenFdsGlobalVariable.ConfDir).ToolsDefTxtDictionary
+ ToolPathTmp = None
+ ToolOption = None
+ ToolPathKey = None
+ ToolOptionKey = None
+ KeyList = None
+ for ToolDef in ToolDefinition.items():
+ if NameGuid == ToolDef[1]:
+ KeyList = ToolDef[0].split('_')
+ Key = KeyList[0] + \
+ '_' + \
+ KeyList[1] + \
+ '_' + \
+ KeyList[2]
+ if Key in KeyStringList and KeyList[4] == 'GUID':
+ ToolPathKey = Key + '_' + KeyList[3] + '_PATH'
+ ToolOptionKey = Key + '_' + KeyList[3] + '_FLAGS'
+ ToolPath = ToolDefinition.get(ToolPathKey)
+ ToolOption = ToolDefinition.get(ToolOptionKey)
+ if ToolPathTmp == None:
+ ToolPathTmp = ToolPath
+ else:
+ if ToolPathTmp != ToolPath:
+ EdkLogger.error("GenFds", GENFDS_ERROR, "Don't know which tool to use, %s or %s ?" % (ToolPathTmp, ToolPath))
+
+ BuildOption = {}
+ for Arch in CurrentArchList:
+ Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+ # key is (ToolChainFamily, ToolChain, CodeBase)
+ for item in Platform.BuildOptions:
+ if '_PATH' in item[1] or '_FLAGS' in item[1] or '_GUID' in item[1]:
+ if not item[0] or (item[0] and GenFdsGlobalVariable.ToolChainFamily== item[0]):
+ if item[1] not in BuildOption:
+ BuildOption[item[1]] = Platform.BuildOptions[item]
+ if BuildOption:
+ ToolList = [TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG, TAB_TOD_DEFINES_TARGET_ARCH]
+ for Index in range(2, -1, -1):
+ for Key in dict(BuildOption):
+ List = Key.split('_')
+ if List[Index] == '*':
+ for String in ToolDb[ToolList[Index]]:
+ if String in [Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]:
+ List[Index] = String
+ NewKey = '%s_%s_%s_%s_%s' % tuple(List)
+ if NewKey not in BuildOption:
+ BuildOption[NewKey] = BuildOption[Key]
+ continue
+ del BuildOption[Key]
+ elif List[Index] not in ToolDb[ToolList[Index]]:
+ del BuildOption[Key]
+ if BuildOption:
+ if not KeyList:
+ for Op in BuildOption:
+ if NameGuid == BuildOption[Op]:
+ KeyList = Op.split('_')
+ Key = KeyList[0] + '_' + KeyList[1] +'_' + KeyList[2]
+ if Key in KeyStringList and KeyList[4] == 'GUID':
+ ToolPathKey = Key + '_' + KeyList[3] + '_PATH'
+ ToolOptionKey = Key + '_' + KeyList[3] + '_FLAGS'
+ if ToolPathKey in BuildOption.keys():
+ ToolPathTmp = BuildOption.get(ToolPathKey)
+ if ToolOptionKey in BuildOption.keys():
+ ToolOption = BuildOption.get(ToolOptionKey)
+
+ GenFdsGlobalVariable.GuidToolDefinition[NameGuid] = (ToolPathTmp, ToolOption)
+ return ToolPathTmp, ToolOption
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index 877dabc196e1..679934dc7794 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -21,6 +21,7 @@ import subprocess
from .Ffs import Ffs
import Common.LongFilePathOs as os
from .GenFdsGlobalVariable import GenFdsGlobalVariable
+from .GenFdsGlobalVariable import FindExtendTool
from CommonDataClass.FdfClass import GuidSectionClassObject
from Common import ToolDefClassObject
import sys
@@ -129,7 +130,6 @@ class GuidSection(GuidSectionClassObject) :
ExternalTool = None
ExternalOption = None
if self.NameGuid != None:
- from .GenFds import FindExtendTool
ExternalTool, ExternalOption = FindExtendTool(self.KeyStringList, self.CurrentArchList, self.NameGuid)
#
--
2.16.1
* [PATCH v2 20/20] BaseTools: Move ImageBinDict to GenFdsGlobalVariable.py
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (18 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 19/20] BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py Gary Lin
@ 2018-02-01 8:36 ` Gary Lin
2018-06-20 6:22 ` [PATCH v2 00/20] BaseTools: One step toward python3 Paolo Bonzini
20 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-02-01 8:36 UTC (permalink / raw)
To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao
Moving "ImageBinDict" from GenFds.py to GenFdsGlobalVariable.py to
remove the requirement to import GenFds.GenFds in Capsule.py, Fd.py and
Fv.py. This breaks the following circular imports:
* GenFds.FdfParser => GenFds.Capsule => GenFds.GenFds => GenFds.FdfParser
* GenFds.FdfParser => GenFds.Fd => GenFds.GenFds => GenFds.FdfParser
* GenFds.FdfParser => GenFds.Fd => GenFds.Fv => GenFds.GenFds =>
GenFds.FdfParser
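A minimal sketch of the resulting shape (hypothetical names and trimmed
logic, not the actual sources): the name-to-image cache becomes a class
attribute of the bottom-layer module, so Capsule/Fd/Fv share one dict
without ever importing GenFds.GenFds:
# GenFdsGlobalVariable.py (sketch)
class GenFdsGlobalVariable:
    # FV/FD/Capsule name in the FDF -> generated image file path.
    # A class attribute on this bottom-layer module acts as process-wide
    # shared state for every GenFds.* consumer.
    ImageBinDict = {}

# Fd.py (sketch)
class FD:
    def GenFd(self, FdUiName):
        Key = FdUiName.upper() + 'fd'
        if Key in GenFdsGlobalVariable.ImageBinDict:
            return GenFdsGlobalVariable.ImageBinDict[Key]
        FdFileName = FdUiName + '.fd'  # hypothetical output name
        GenFdsGlobalVariable.ImageBinDict[Key] = FdFileName
        return FdFileName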
Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
BaseTools/Source/Python/GenFds/Capsule.py | 7 +++----
BaseTools/Source/Python/GenFds/Fd.py | 15 +++++++--------
BaseTools/Source/Python/GenFds/Fv.py | 11 +++++------
BaseTools/Source/Python/GenFds/GenFds.py | 2 --
BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 3 +++
5 files changed, 18 insertions(+), 20 deletions(-)
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index 55e9e1dade2f..247fb0c75e1a 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -201,9 +201,8 @@ class Capsule (CapsuleClassObject) :
# @retval string Generated Capsule file path
#
def GenCapsule(self):
- from .GenFds import GenFds
- if self.UiCapsuleName.upper() + 'cap' in GenFds.ImageBinDict.keys():
- return GenFds.ImageBinDict[self.UiCapsuleName.upper() + 'cap']
+ if self.UiCapsuleName.upper() + 'cap' in GenFdsGlobalVariable.ImageBinDict.keys():
+ return GenFdsGlobalVariable.ImageBinDict[self.UiCapsuleName.upper() + 'cap']
GenFdsGlobalVariable.InfLogger( "\nGenerate %s Capsule" %self.UiCapsuleName)
if ('CAPSULE_GUID' in self.TokensDict and
@@ -237,7 +236,7 @@ class Capsule (CapsuleClassObject) :
GenFdsGlobalVariable.VerboseLogger( "\nGenerate %s Capsule Successfully" %self.UiCapsuleName)
GenFdsGlobalVariable.SharpCounter = 0
- GenFds.ImageBinDict[self.UiCapsuleName.upper() + 'cap'] = CapOutputFile
+ GenFdsGlobalVariable.ImageBinDict[self.UiCapsuleName.upper() + 'cap'] = CapOutputFile
return CapOutputFile
## Generate inf file for capsule
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index 86a0d9a47bfc..1a95a4567cca 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -46,9 +46,8 @@ class FD(FDClassObject):
# @retval string Generated FD file name
#
def GenFd (self, Flag = False):
- from .GenFds import GenFds
- if self.FdUiName.upper() + 'fd' in GenFds.ImageBinDict.keys():
- return GenFds.ImageBinDict[self.FdUiName.upper() + 'fd']
+ if self.FdUiName.upper() + 'fd' in GenFdsGlobalVariable.ImageBinDict.keys():
+ return GenFdsGlobalVariable.ImageBinDict[self.FdUiName.upper() + 'fd']
#
# Print Information
@@ -93,7 +92,7 @@ class FD(FDClassObject):
PadRegion.Offset = PreviousRegionStart + PreviousRegionSize
PadRegion.Size = RegionObj.Offset - PadRegion.Offset
if not Flag:
- PadRegion.AddToBuffer(TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
+ PadRegion.AddToBuffer(TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
PreviousRegionStart = RegionObj.Offset
PreviousRegionSize = RegionObj.Size
#
@@ -102,7 +101,7 @@ class FD(FDClassObject):
if PreviousRegionSize > self.Size:
pass
GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
- RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
+ RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
FdBuffer = BytesIO('')
PreviousRegionStart = -1
@@ -123,7 +122,7 @@ class FD(FDClassObject):
PadRegion.Offset = PreviousRegionStart + PreviousRegionSize
PadRegion.Size = RegionObj.Offset - PadRegion.Offset
if not Flag:
- PadRegion.AddToBuffer(FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
+ PadRegion.AddToBuffer(FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
PreviousRegionStart = RegionObj.Offset
PreviousRegionSize = RegionObj.Size
#
@@ -137,7 +136,7 @@ class FD(FDClassObject):
# Call each region's AddToBuffer function
#
GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
- RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict, Flag=Flag)
+ RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.vtfRawDict, self.DefineVarDict, Flag=Flag)
#
# Write the buffer contents to Fd file
#
@@ -145,7 +144,7 @@ class FD(FDClassObject):
if not Flag:
SaveFileOnChange(FdFileName, FdBuffer.getvalue())
FdBuffer.close()
- GenFds.ImageBinDict[self.FdUiName.upper() + 'fd'] = FdFileName
+ GenFdsGlobalVariable.ImageBinDict[self.FdUiName.upper() + 'fd'] = FdFileName
return FdFileName
## generate VTF
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index bff661e43e7e..d398d7393df6 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -70,9 +70,8 @@ class FV (FvClassObject):
# @retval string Generated FV file path
#
def AddToBuffer (self, Buffer, BaseAddress=None, BlockSize= None, BlockNum=None, ErasePloarity='1', VtfDict=None, MacroDict = {}, Flag=False) :
- from .GenFds import GenFds
- if BaseAddress == None and self.UiFvName.upper() + 'fv' in GenFds.ImageBinDict.keys():
- return GenFds.ImageBinDict[self.UiFvName.upper() + 'fv']
+ if BaseAddress == None and self.UiFvName.upper() + 'fv' in GenFdsGlobalVariable.ImageBinDict.keys():
+ return GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper() + 'fv']
#
# Check whether FV in Capsule is in FD flash region.
@@ -86,7 +85,7 @@ class FV (FvClassObject):
for RegionData in RegionObj.RegionDataList:
if RegionData.endswith(".fv"):
continue
- elif RegionData.upper() + 'fv' in GenFds.ImageBinDict.keys():
+ elif RegionData.upper() + 'fv' in GenFdsGlobalVariable.ImageBinDict.keys():
continue
elif self.UiFvName.upper() == RegionData.upper():
GenFdsGlobalVariable.ErrorLogger("Capsule %s in FD region can't contain a FV %s in FD region." % (self.CapsuleName, self.UiFvName.upper()))
@@ -141,7 +140,7 @@ class FV (FvClassObject):
FvOutputFile = self.CreateFileName
if Flag:
- GenFds.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
+ GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
return FvOutputFile
FvInfoFileName = os.path.join(GenFdsGlobalVariable.FfsDir, self.UiFvName + '.inf')
@@ -221,7 +220,7 @@ class FV (FvClassObject):
# FvAlignmentValue is less than 1K
self.FvAlignment = str (FvAlignmentValue)
FvFileObj.close()
- GenFds.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
+ GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
GenFdsGlobalVariable.LargeFileInFvFlags.pop()
else:
GenFdsGlobalVariable.ErrorLogger("Failed to generate %s FV file." %self.UiFvName)
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 953069fc52f7..58793cc9533d 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -462,8 +462,6 @@ def myOptionParser():
#
class GenFds :
FdfParsef = None
- # FvName, FdName, CapName in FDF, Image file name
- ImageBinDict = {}
OnlyGenerateThisFd = None
OnlyGenerateThisFv = None
OnlyGenerateThisCap = None
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index a534fcb9371a..f4b619334e7f 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -93,6 +93,9 @@ class GenFdsGlobalVariable:
SectionHeader = struct.Struct("3B 1B")
+ # FvName, FdName, CapName in FDF, Image file name
+ ImageBinDict = {}
+
## LoadBuildRule
#
@staticmethod
--
2.16.1
* Re: [PATCH v2 00/20] BaseTools: One step toward python3
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
` (19 preceding siblings ...)
2018-02-01 8:36 ` [PATCH v2 20/20] BaseTools: Move ImageBinDict " Gary Lin
@ 2018-06-20 6:22 ` Paolo Bonzini
2018-06-20 7:29 ` Zhu, Yonghong
20 siblings, 1 reply; 24+ messages in thread
From: Paolo Bonzini @ 2018-06-20 6:22 UTC (permalink / raw)
To: Gary Lin, edk2-devel; +Cc: Liming Gao
On 01/02/2018 09:35, Gary Lin wrote:
> v2 changes:
> - Rebase to the current git HEAD (821807bcefb9a36e598d71a8004fae5aab2052a0)
> - Apply "futurize -f libfuturize.fixes.fix_absolute_import" and
> refactor some python scripts to break the circular imports.
>
> This patch series is also available in
> https://github.com/lcp/edk2/tree/python3-futurize-v2
>
> Since python2 will be EOL in 2020, we start to evaluate the impact of
> the python2 removal. As expected, OMVF building failed the test. It's
> actually a task noted in the wiki page:
>
> https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support
>
> Maybe it's time to convert the python scripts gradully.
I cannot find any answer to this series. Is there any reason why it
wasn't considered?
Thanks,
Paolo
> This patchset doesn't make the python scripts in BaseTools compatible
> with python3 immediately. It aims to do the trivial and safe conversion
> and replacement to make some statements compatible with both python2 and
> python3, so we can deal with the difficult cases later.
>
> With the help of "futurize" from python-future, it's easier to refactor
> the statements. This patchset is basically equivalent to "futurize -1"
> plus "StringIO.StringIO => io.BytesIO".
>
> For the "io.BytesIO" change, it MIGHT introduce slow down to the build
> time since io.BytesIO is slower than StringIO.StringIO in python2(*).
> For a quick test, I built OVMF with the following command based on
> 8ab0bd2397c9d3922e0c7dbb1aa6f7e08799079f:
>
> $ rm -rf Build && make -C BaseTools/ clean
> $ time ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
> -D NETWORK_IP6_ENABLE \
> -D HTTP_BOOT_ENABLE \
> -D TLS_ENABLE
>
> Before io.BytesIO:
>
> Build total time: 00:03:56
> real 4m22.991s
> user 3m55.874s
> sys 0m27.250s
>
> After io.BytesIO:
>
> Build total time: 00:03:57
> real 4m23.953s
> user 3m57.526s
> sys 0m27.192s
>
> The difference is only 1 second, and I would say the impact is subtle.
>
> The next step will be fixing relative import and maybe applying more
> futurize fixes. We won't get there soon but at least we are moving...
>
> (*) https://stackoverflow.com/questions/37462075/confusing-about-stringio-cstringio-and-byteio
>
> Contributed-under: TianoCore Contribution Agreement 1.1
> Cc: Yonghong Zhu <yonghong.zhu@intel.com>
> Cc: Liming Gao <liming.gao@intel.com>
> Signed-off-by: Gary Lin <glin@suse.com>
>
>
> Gary Lin (20):
> BaseTools: Refactor python except statements
> BaseTools: Refactor python print statements
> BaseTools: Remove the old python "not-equal"
> BaseTools: Use the python3-range functions
> BaseTools: Remove tuple parameter in python scripts
> BaseTools: Remove the deprecated hash_key()
> BaseTools: Import reduce() from functools
> BaseTools: Replace StandardError with Expression
> BaseTools: Remove types.TypeType
> BaseTools: Refactor python raise statement
> BaseTools: Adjust the spaces around commas and colons
> BaseTools: Migrate to the new octal literal
> BaseTools: Unify long int and int in python scripts
> BaseTools: Adjust old python2 idioms
> BaseTools: Replace StringIO.StringIO with io.BytesIO
> BaseTools: Treat GenFds.py and build.py as python modules
> BaseTools: Adopt absolute import for python scripts
> BaseTools: Move OverrideAttribs to OptRomInfStatement.py
> BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py
> BaseTools: Move ImageBinDict to GenFdsGlobalVariable.py
>
> BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py | 5 +-
> BaseTools/BinWrappers/PosixLike/GenFds | 2 +-
> BaseTools/BinWrappers/PosixLike/build | 2 +-
> BaseTools/Scripts/BinToPcd.py | 46 +++--
> BaseTools/Scripts/ConvertMasmToNasm.py | 1 +
> BaseTools/Scripts/ConvertUni.py | 5 -
> BaseTools/Scripts/MemoryProfileSymbolGen.py | 22 +-
> BaseTools/Scripts/PatchCheck.py | 7 +-
> BaseTools/Scripts/RunMakefile.py | 2 +-
> BaseTools/Scripts/SmiHandlerProfileSymbolGen.py | 20 +-
> BaseTools/Scripts/UpdateBuildVersions.py | 18 +-
> BaseTools/Source/Python/AutoGen/AutoGen.py | 98 ++++-----
> BaseTools/Source/Python/AutoGen/BuildEngine.py | 38 ++--
> BaseTools/Source/Python/AutoGen/GenC.py | 12 +-
> BaseTools/Source/Python/AutoGen/GenDepex.py | 8 +-
> BaseTools/Source/Python/AutoGen/GenMake.py | 11 +-
> BaseTools/Source/Python/AutoGen/GenPcdDb.py | 149 +++++++-------
> BaseTools/Source/Python/AutoGen/GenVar.py | 166 +++++++--------
> BaseTools/Source/Python/AutoGen/IdfClassObject.py | 1 -
> BaseTools/Source/Python/AutoGen/InfSectionParser.py | 1 +
> BaseTools/Source/Python/AutoGen/StrGather.py | 8 +-
> BaseTools/Source/Python/AutoGen/UniClassObject.py | 18 +-
> BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 10 +-
> BaseTools/Source/Python/BPDG/BPDG.py | 8 +-
> BaseTools/Source/Python/BPDG/GenVpd.py | 28 +--
> BaseTools/Source/Python/Common/DataType.py | 4 +-
> BaseTools/Source/Python/Common/Database.py | 8 +-
> BaseTools/Source/Python/Common/DecClassObject.py | 56 ++---
> BaseTools/Source/Python/Common/Dictionary.py | 14 +-
> BaseTools/Source/Python/Common/DscClassObject.py | 91 +++++----
> BaseTools/Source/Python/Common/EdkIIWorkspace.py | 28 +--
> BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py | 152 +++++++-------
> BaseTools/Source/Python/Common/EdkLogger.py | 3 +-
> BaseTools/Source/Python/Common/Expression.py | 86 ++++----
> BaseTools/Source/Python/Common/FdfClassObject.py | 6 +-
> BaseTools/Source/Python/Common/FdfParserLite.py | 47 ++---
> BaseTools/Source/Python/Common/InfClassObject.py | 134 ++++++------
> BaseTools/Source/Python/Common/LongFilePathOs.py | 5 +-
> BaseTools/Source/Python/Common/MigrationUtilities.py | 4 +-
> BaseTools/Source/Python/Common/Misc.py | 79 ++++----
> BaseTools/Source/Python/Common/Parsing.py | 6 +-
> BaseTools/Source/Python/Common/RangeExpression.py | 32 +--
> BaseTools/Source/Python/Common/String.py | 16 +-
> BaseTools/Source/Python/Common/TargetTxtClassObject.py | 24 ++-
> BaseTools/Source/Python/Common/ToolDefClassObject.py | 12 +-
> BaseTools/Source/Python/Common/VpdInfoFile.py | 23 ++-
> BaseTools/Source/Python/CommonDataClass/ModuleClass.py | 3 +-
> BaseTools/Source/Python/CommonDataClass/PackageClass.py | 3 +-
> BaseTools/Source/Python/CommonDataClass/PlatformClass.py | 3 +-
> BaseTools/Source/Python/Ecc/CParser.py | 178 ++++++++--------
> BaseTools/Source/Python/Ecc/Check.py | 10 +-
> BaseTools/Source/Python/Ecc/CodeFragmentCollector.py | 82 ++++----
> BaseTools/Source/Python/Ecc/Configuration.py | 5 +-
> BaseTools/Source/Python/Ecc/Database.py | 7 +-
> BaseTools/Source/Python/Ecc/Ecc.py | 25 +--
> BaseTools/Source/Python/Ecc/Exception.py | 6 +-
> BaseTools/Source/Python/Ecc/FileProfile.py | 5 +-
> BaseTools/Source/Python/Ecc/MetaDataParser.py | 8 +-
> BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py | 5 +-
> BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 44 ++--
> BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py | 5 +-
> BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py | 9 +-
> BaseTools/Source/Python/Ecc/c.py | 28 +--
> BaseTools/Source/Python/Eot/CParser.py | 178 ++++++++--------
> BaseTools/Source/Python/Eot/CodeFragmentCollector.py | 72 +++----
> BaseTools/Source/Python/Eot/Eot.py | 15 +-
> BaseTools/Source/Python/Eot/FileProfile.py | 3 +-
> BaseTools/Source/Python/Eot/FvImage.py | 28 +--
> BaseTools/Source/Python/Eot/InfParserLite.py | 13 +-
> BaseTools/Source/Python/Eot/Parser.py | 5 +-
> BaseTools/Source/Python/Eot/Report.py | 3 +-
> BaseTools/Source/Python/Eot/c.py | 32 +--
> BaseTools/Source/Python/GenFds/AprioriSection.py | 12 +-
> BaseTools/Source/Python/GenFds/Capsule.py | 22 +-
> BaseTools/Source/Python/GenFds/CapsuleData.py | 11 +-
> BaseTools/Source/Python/GenFds/CompressSection.py | 7 +-
> BaseTools/Source/Python/GenFds/DataSection.py | 7 +-
> BaseTools/Source/Python/GenFds/DepexSection.py | 7 +-
> BaseTools/Source/Python/GenFds/EfiSection.py | 13 +-
> BaseTools/Source/Python/GenFds/Fd.py | 32 +--
> BaseTools/Source/Python/GenFds/FdfParser.py | 100 ++++-----
> BaseTools/Source/Python/GenFds/FfsFileStatement.py | 16 +-
> BaseTools/Source/Python/GenFds/FfsInfStatement.py | 35 ++--
> BaseTools/Source/Python/GenFds/Fv.py | 34 ++--
> BaseTools/Source/Python/GenFds/FvImageSection.py | 15 +-
> BaseTools/Source/Python/GenFds/GenFds.py | 126 ++----------
> BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 108 +++++++++-
> BaseTools/Source/Python/GenFds/GuidSection.py | 11 +-
> BaseTools/Source/Python/GenFds/OptRomFileStatement.py | 3 +-
> BaseTools/Source/Python/GenFds/OptRomInfStatement.py | 30 ++-
> BaseTools/Source/Python/GenFds/OptionRom.py | 23 +--
> BaseTools/Source/Python/GenFds/Region.py | 17 +-
> BaseTools/Source/Python/GenFds/RuleComplexFile.py | 3 +-
> BaseTools/Source/Python/GenFds/RuleSimpleFile.py | 3 +-
> BaseTools/Source/Python/GenFds/Section.py | 3 +-
> BaseTools/Source/Python/GenFds/UiSection.py | 7 +-
> BaseTools/Source/Python/GenFds/VerSection.py | 7 +-
> BaseTools/Source/Python/GenFds/Vtf.py | 3 +-
> BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 9 +-
> BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py | 1 +
> BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 32 +--
> BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 30 +--
> BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 36 ++--
> BaseTools/Source/Python/Table/TableDataModel.py | 3 +-
> BaseTools/Source/Python/Table/TableDec.py | 3 +-
> BaseTools/Source/Python/Table/TableDsc.py | 3 +-
> BaseTools/Source/Python/Table/TableEotReport.py | 5 +-
> BaseTools/Source/Python/Table/TableFdf.py | 3 +-
> BaseTools/Source/Python/Table/TableFile.py | 3 +-
> BaseTools/Source/Python/Table/TableFunction.py | 3 +-
> BaseTools/Source/Python/Table/TableIdentifier.py | 5 +-
> BaseTools/Source/Python/Table/TableInf.py | 3 +-
> BaseTools/Source/Python/Table/TablePcd.py | 5 +-
> BaseTools/Source/Python/Table/TableQuery.py | 3 +-
> BaseTools/Source/Python/Table/TableReport.py | 3 +-
> BaseTools/Source/Python/TargetTool/TargetTool.py | 39 ++--
> BaseTools/Source/Python/Trim/Trim.py | 25 +--
> BaseTools/Source/Python/UPT/Core/DependencyRules.py | 12 +-
> BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py | 4 +-
> BaseTools/Source/Python/UPT/Core/FileHook.py | 2 +-
> BaseTools/Source/Python/UPT/Core/IpiDb.py | 6 +-
> BaseTools/Source/Python/UPT/Core/PackageFile.py | 12 +-
> BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py | 15 +-
> BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py | 42 ++--
> BaseTools/Source/Python/UPT/InstallPkg.py | 2 +-
> BaseTools/Source/Python/UPT/InventoryWs.py | 2 +-
> BaseTools/Source/Python/UPT/Library/CommentParsing.py | 5 +-
> BaseTools/Source/Python/UPT/Library/ExpressionValidate.py | 11 +-
> BaseTools/Source/Python/UPT/Library/Misc.py | 11 +-
> BaseTools/Source/Python/UPT/Library/ParserValidate.py | 2 +-
> BaseTools/Source/Python/UPT/Library/Parsing.py | 6 +-
> BaseTools/Source/Python/UPT/Library/String.py | 5 +-
> BaseTools/Source/Python/UPT/Library/UniClassObject.py | 20 +-
> BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py | 4 +-
> BaseTools/Source/Python/UPT/MkPkg.py | 2 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py | 6 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py | 2 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py | 4 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py | 2 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py | 4 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py | 4 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py | 4 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py | 4 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py | 2 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py | 3 +-
> BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py | 4 +-
> BaseTools/Source/Python/UPT/Parser/DecParserMisc.py | 1 +
> BaseTools/Source/Python/UPT/Parser/InfSectionParser.py | 3 +-
> BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 57 +++---
> BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py | 3 +-
> BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py | 3 +-
> BaseTools/Source/Python/UPT/ReplacePkg.py | 2 +-
> BaseTools/Source/Python/UPT/RmPkg.py | 2 +-
> BaseTools/Source/Python/UPT/TestInstall.py | 4 +-
> BaseTools/Source/Python/UPT/UPT.py | 9 +-
> BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py | 5 +-
> BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py | 10 +-
> BaseTools/Source/Python/UPT/Xml/CommonXml.py | 2 +-
> BaseTools/Source/Python/UPT/Xml/IniToXml.py | 1 +
> BaseTools/Source/Python/UPT/Xml/XmlParser.py | 25 +--
> BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py | 3 +-
> BaseTools/Source/Python/Workspace/BuildClassObject.py | 2 +-
> BaseTools/Source/Python/Workspace/DecBuildData.py | 14 +-
> BaseTools/Source/Python/Workspace/DscBuildData.py | 213 ++++++++++----------
> BaseTools/Source/Python/Workspace/InfBuildData.py | 6 +-
> BaseTools/Source/Python/Workspace/MetaFileParser.py | 75 +++----
> BaseTools/Source/Python/Workspace/MetaFileTable.py | 15 +-
> BaseTools/Source/Python/Workspace/WorkspaceCommon.py | 5 +-
> BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 7 +-
> BaseTools/Source/Python/build/BuildReport.py | 19 +-
> BaseTools/Source/Python/build/build.py | 38 ++--
> BaseTools/Tests/CheckPythonSyntax.py | 2 +-
> BaseTools/Tests/TestTools.py | 13 +-
> BaseTools/Tests/TianoCompress.py | 6 +-
> BaseTools/gcc/mingw-gcc-build.py | 112 +++++-----
> 175 files changed, 2092 insertions(+), 1927 deletions(-)
>
* Re: [PATCH v2 00/20] BaseTools: One step toward python3
2018-06-20 6:22 ` [PATCH v2 00/20] BaseTools: One step toward python3 Paolo Bonzini
@ 2018-06-20 7:29 ` Zhu, Yonghong
2018-06-20 8:08 ` Gary Lin
0 siblings, 1 reply; 24+ messages in thread
From: Zhu, Yonghong @ 2018-06-20 7:29 UTC (permalink / raw)
To: Paolo Bonzini, Gary Lin, edk2-devel@lists.01.org
Cc: Gao, Liming, Zhu, Yonghong
Hi Paolo and Gary,
The patches are good and helpful, but they are out of date now, so could you help recreate them?
Best Regards,
Zhu Yonghong
-----Original Message-----
From: edk2-devel [mailto:edk2-devel-bounces@lists.01.org] On Behalf Of Paolo Bonzini
Sent: Wednesday, June 20, 2018 2:22 PM
To: Gary Lin <glin@suse.com>; edk2-devel@lists.01.org
Cc: Gao, Liming <liming.gao@intel.com>
Subject: Re: [edk2] [PATCH v2 00/20] BaseTools: One step toward python3
On 01/02/2018 09:35, Gary Lin wrote:
> v2 changes:
> - Rebase to the current git HEAD (821807bcefb9a36e598d71a8004fae5aab2052a0)
> - Apply "futurize -f libfuturize.fixes.fix_absolute_import" and
> refactor some python scripts to break the circular imports.
>
> This patch series is also available in
> https://github.com/lcp/edk2/tree/python3-futurize-v2
>
> Since python2 will be EOL in 2020, we start to evaluate the impact of
> the python2 removal. As expected, OMVF building failed the test. It's
> actually a task noted in the wiki page:
>
> https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-
> Python3-Support
>
> Maybe it's time to convert the python scripts gradully.
I cannot find any answer to this series. Is there any reason why it wasn't considered?
Thanks,
Paolo
> This patchset doesn't make the python scripts in BaseTools compatible
> with python3 immediately. It aims to make the trivial and safe
> conversions and replacements so that some statements become compatible
> with both python2 and python3, and we can deal with the difficult cases later.
>
> With the help of "futurize" from python-future, it's easier to
> refactor the statements. This patchset is basically equivalent to "futurize -1"
> plus "StringIO.StringIO => io.BytesIO".
>
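To make the scope of "futurize -1" (stage 1) concrete, here is a minimal, purely
illustrative sketch of the kinds of python2-only idioms it rewrites into forms
accepted by both interpreters. The file name and variables are hypothetical and
not taken from BaseTools:

    # Before (python2 only):            # After (python2 and python3):
    #   print "hello"                   #   print("hello")
    #   except IOError, err:            #   except IOError as err:
    #   if a <> b:                      #   if a != b:
    #   d.has_key(key)                  #   key in d
    #   mode = 0777                     #   mode = 0o777

    from __future__ import print_function   # futurize adds this when converting prints

    try:
        Handle = open("BuildLog.txt")        # hypothetical path, for illustration only
    except IOError as err:                   # "except X as e" replaces "except X, e"
        print("open failed: %s" % err)       # print() function replaces the print statement
    else:
        Handle.close()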
> For the "io.BytesIO" change, it MIGHT slow down the build, since
> io.BytesIO is slower than StringIO.StringIO in python2(*).
> For a quick test, I built OVMF with the following command based on
> 8ab0bd2397c9d3922e0c7dbb1aa6f7e08799079f:
>
> $ rm -rf Build && make -C BaseTools/ clean
> $ time ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
> -D NETWORK_IP6_ENABLE \
> -D HTTP_BOOT_ENABLE \
> -D TLS_ENABLE
>
> Before io.BytesIO:
>
> Build total time: 00:03:56
> real 4m22.991s
> user 3m55.874s
> sys 0m27.250s
>
> After io.BytesIO:
>
> Build total time: 00:03:57
> real 4m23.953s
> user 3m57.526s
> sys 0m27.192s
>
> The difference is only 1 second, so I would say the impact is negligible.
>
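As a concrete illustration of the swap itself, here is a minimal sketch with
hypothetical variable names and data, not code taken from BaseTools. io.BytesIO
exists on both python2 and python3 but only accepts byte strings, so anything
written to it has to be bytes rather than unicode text:

    import io

    # Before (python2 only):
    #   import StringIO
    #   Buffer = StringIO.StringIO()
    #   Buffer.write(SectionData)
    #
    # After (python2 and python3):
    Buffer = io.BytesIO()
    Buffer.write(b"\x4d\x5a\x00\x01")   # hypothetical image bytes; must be bytes, not unicode
    Image = Buffer.getvalue()           # returns a bytes object on both interpreters
    Buffer.close()

On python2 this behaves the same as the old StringIO buffer because str is
already a byte string there, which is why the swap is safe for python2 while
preparing the code for python3.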
> The next step will be fixing relative imports and maybe applying more
> futurize fixes. We won't get there soon but at least we are moving...
>
> (*) https://stackoverflow.com/questions/37462075/confusing-about-stringio-cstringio-and-byteio
>
> Contributed-under: TianoCore Contribution Agreement 1.1
> Cc: Yonghong Zhu <yonghong.zhu@intel.com>
> Cc: Liming Gao <liming.gao@intel.com>
> Signed-off-by: Gary Lin <glin@suse.com>
>
>
> Gary Lin (20):
> BaseTools: Refactor python except statements
> BaseTools: Refactor python print statements
> BaseTools: Remove the old python "not-equal"
> BaseTools: Use the python3-range functions
> BaseTools: Remove tuple parameter in python scripts
> BaseTools: Remove the deprecated hash_key()
> BaseTools: Import reduce() from functools
> BaseTools: Replace StandardError with Expression
> BaseTools: Remove types.TypeType
> BaseTools: Refactor python raise statement
> BaseTools: Adjust the spaces around commas and colons
> BaseTools: Migrate to the new octal literal
> BaseTools: Unify long int and int in python scripts
> BaseTools: Adjust old python2 idioms
> BaseTools: Replace StringIO.StringIO with io.BytesIO
> BaseTools: Treat GenFds.py and build.py as python modules
> BaseTools: Adopt absolute import for python scripts
> BaseTools: Move OverrideAttribs to OptRomInfStatement.py
> BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py
> BaseTools: Move ImageBinDict to GenFdsGlobalVariable.py
>
> 175 files changed, 2092 insertions(+), 1927 deletions(-)
>
_______________________________________________
edk2-devel mailing list
edk2-devel@lists.01.org
https://lists.01.org/mailman/listinfo/edk2-devel
^ permalink raw reply [flat|nested] 24+ messages in thread
* Re: [PATCH v2 00/20] BaseTools: One step toward python3
2018-06-20 7:29 ` Zhu, Yonghong
@ 2018-06-20 8:08 ` Gary Lin
0 siblings, 0 replies; 24+ messages in thread
From: Gary Lin @ 2018-06-20 8:08 UTC (permalink / raw)
To: Zhu, Yonghong; +Cc: Paolo Bonzini, edk2-devel@lists.01.org, Gao, Liming
On Wed, Jun 20, 2018 at 07:29:40AM +0000, Zhu, Yonghong wrote:
> Hi Paolo and Gary,
>
> The patches are good and helpful. But it is out of date, so could you help to recreate the patches ?
>
Sure. I'll re-run the futurize scripts since BaseTools has changed a lot
recently, and it might take a while.
Cheers,
Gary Lin
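For reference, re-running the conversion would look roughly like the sketch
below. The options and the path are assumptions pieced together from the cover
letter, not a record of the exact commands used:

    $ pip install future                                  # provides the futurize tool
    $ futurize --stage1 -w BaseTools/Source/Python        # safe py2/py3-compatible fixes
    $ futurize -f libfuturize.fixes.fix_absolute_import \
               -w BaseTools/Source/Python                 # make imports absolute

The -w flag writes the changes back in place; without it, futurize only prints
the proposed diffs.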
^ permalink raw reply [flat|nested] 24+ messages in thread
end of thread, other threads:[~2018-06-20 8:08 UTC | newest]
Thread overview: 24+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2018-02-01 8:35 [PATCH v2 00/20] BaseTools: One step toward python3 Gary Lin
2018-02-01 8:35 ` [PATCH v2 01/20] BaseTools: Refactor python except statements Gary Lin
2018-02-01 8:35 ` [PATCH v2 02/20] BaseTools: Refactor python print statements Gary Lin
2018-02-01 8:35 ` [PATCH v2 03/20] BaseTools: Remove the old python "not-equal" Gary Lin
2018-02-01 8:35 ` [PATCH v2 04/20] BaseTools: Use the python3-range functions Gary Lin
2018-02-01 8:35 ` [PATCH v2 05/20] BaseTools: Remove tuple parameter in python scripts Gary Lin
2018-02-01 8:35 ` [PATCH v2 06/20] BaseTools: Remove the deprecated hash_key() Gary Lin
2018-02-01 8:35 ` [PATCH v2 07/20] BaseTools: Import reduce() from functools Gary Lin
2018-02-01 8:35 ` [PATCH v2 08/20] BaseTools: Replace StandardError with Expression Gary Lin
2018-02-01 8:35 ` [PATCH v2 09/20] BaseTools: Remove types.TypeType Gary Lin
2018-02-01 8:35 ` [PATCH v2 10/20] BaseTools: Refactor python raise statement Gary Lin
2018-02-01 8:35 ` [PATCH v2 11/20] BaseTools: Adjust the spaces around commas and colons Gary Lin
2018-02-01 8:36 ` [PATCH v2 12/20] BaseTools: Migrate to the new octal literal Gary Lin
2018-02-01 8:36 ` [PATCH v2 13/20] BaseTools: Unify long int and int in python scripts Gary Lin
2018-02-01 8:36 ` [PATCH v2 14/20] BaseTools: Adjust old python2 idioms Gary Lin
2018-02-01 8:36 ` [PATCH v2 15/20] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
2018-02-01 8:36 ` [PATCH v2 16/20] BaseTools: Treat GenFds.py and build.py as python modules Gary Lin
2018-02-01 8:36 ` [PATCH v2 17/20] BaseTools: Adopt absolute import for python scripts Gary Lin
2018-02-01 8:36 ` [PATCH v2 18/20] BaseTools: Move OverrideAttribs to OptRomInfStatement.py Gary Lin
2018-02-01 8:36 ` [PATCH v2 19/20] BaseTools: Move FindExtendTool to GenFdsGlobalVariable.py Gary Lin
2018-02-01 8:36 ` [PATCH v2 20/20] BaseTools: Move ImageBinDict " Gary Lin
2018-06-20 6:22 ` [PATCH v2 00/20] BaseTools: One step toward python3 Paolo Bonzini
2018-06-20 7:29 ` Zhu, Yonghong
2018-06-20 8:08 ` Gary Lin
This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox