public inbox for devel@edk2.groups.io
* [PATCH 00/15] BaseTools: One step toward python3
@ 2018-01-19  4:43 Gary Lin
  2018-01-19  4:43 ` [PATCH 01/15] BaseTools: Refactor python except statements Gary Lin
                   ` (15 more replies)
  0 siblings, 16 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Since python2 will be EOL in 2020, we started to evaluate the impact of
removing python2. As expected, the OVMF build failed the test. It's
actually a task noted in the wiki page:

https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support

Maybe it's time to convert the python scripts gradually.

This patchset doesn't make the python scripts in BaseTools compatible
with python3 immediately. It only applies the trivial and safe conversions
and replacements that make some statements compatible with both python2
and python3, so we can deal with the difficult cases later.

With the help of "futurize" from python-future, it's easier to refactor
the statements. This patchset is basically equivalent to "futurize -1"
plus "StringIO.StringIO => io.BytesIO" and minus "fix_absolute_import".
The reason to skip "fix_absolute_import" is that python2 failed to find
some modules after converting to absolute import, and it might take time
to figure out a proper fix.
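
For reference, the conversion can be reproduced with something along these
lines (an illustrative invocation only; the exact option and fixer names
come from python-future and may vary between versions):

$ pip install --user future
$ futurize -1 -x libfuturize.fixes.fix_absolute_import -w \
      BaseTools/Source/Python/Common/Misc.py

The StringIO.StringIO => io.BytesIO replacement is not covered by the
stage-1 fixers and is applied on top of that.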

For the "io.BytesIO" change, it MIGHT introduce slow down to the build
time since io.BytesIO is slower than StringIO.StringIO in python2(*).
For a quick test, I built OVMF with the following command based on
8ab0bd2397c9d3922e0c7dbb1aa6f7e08799079f:

$ rm -rf Build && make -C BaseTools/ clean
$ time ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
                          -D NETWORK_IP6_ENABLE \
                          -D HTTP_BOOT_ENABLE \
                          -D TLS_ENABLE

Before io.BytesIO:

  Build total time: 00:03:56
  real    4m22.991s
  user    3m55.874s
  sys     0m27.250s

After io.BytesIO:

  Build total time: 00:03:57
  real    4m23.953s
  user    3m57.526s
  sys     0m27.192s

The difference is only 1 second, so I would say the impact is negligible.
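
As an illustration of what the io.BytesIO change looks like (a generic,
hypothetical snippet, not taken from the actual patches):

  import io

  # python2 only:
  #   from StringIO import StringIO
  #   Buffer = StringIO()
  # python2 and python3:
  Buffer = io.BytesIO()
  Buffer.write(b'\x4d\x5a')     # binary data, e.g. a section image
  Data = Buffer.getvalue()      # returns bytes on both interpreters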

The next step will be to fix the relative imports and maybe apply more
futurize fixes. We won't get there soon, but at least we are moving...

(*) https://stackoverflow.com/questions/37462075/confusing-about-stringio-cstringio-and-byteio

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>

Gary Lin (15):
  BaseTools: Refactor python except statements
  BaseTools: Refactor python print statements
  BaseTools: Remove the old python "not-equal"
  BaseTools: Use the python3-range functions
  BaseTools: Remove tuple parameter in python scripts
  BaseTools: Remove the deprecated has_key()
  BaseTools: Import reduce() from functools
  BaseTools: Replace StandardError with Expression
  BaseTools: Remove types.TypeType
  BaseTools: Refactor python raise statement
  BaseTools: Adjust the spaces around commas and colons
  BaseTools: Migrate to the new octal literal
  BaseTools: Unify long int and int in python scripts
  BaseTools: Adjust old python2 idioms
  BaseTools: Replace StringIO.StringIO with io.BytesIO

 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                      |   5 +-
 BaseTools/Scripts/BinToPcd.py                                          |  46 +++---
 BaseTools/Scripts/ConvertMasmToNasm.py                                 |   1 +
 BaseTools/Scripts/ConvertUni.py                                        |   5 -
 BaseTools/Scripts/MemoryProfileSymbolGen.py                            |  22 +--
 BaseTools/Scripts/PatchCheck.py                                        |   7 +-
 BaseTools/Scripts/RunMakefile.py                                       |   2 +-
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                        |  20 +--
 BaseTools/Scripts/UpdateBuildVersions.py                               |  18 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |  91 +++++-----
 BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  38 +++--
 BaseTools/Source/Python/AutoGen/GenC.py                                |   5 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                            |   8 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                             |   8 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                            | 142 ++++++++--------
 BaseTools/Source/Python/AutoGen/GenVar.py                              | 165 +++++++++----------
 BaseTools/Source/Python/AutoGen/IdfClassObject.py                      |   1 -
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                    |   1 +
 BaseTools/Source/Python/AutoGen/StrGather.py                           |   5 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                      |  18 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py             |  10 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                   |   3 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                 |  25 +--
 BaseTools/Source/Python/Common/DataType.py                             |   4 +-
 BaseTools/Source/Python/Common/DecClassObject.py                       |  39 ++---
 BaseTools/Source/Python/Common/Dictionary.py                           |   9 +-
 BaseTools/Source/Python/Common/DscClassObject.py                       |  70 ++++----
 BaseTools/Source/Python/Common/EdkIIWorkspace.py                       |  25 +--
 BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py                  | 133 +++++++--------
 BaseTools/Source/Python/Common/Expression.py                           |  81 ++++-----
 BaseTools/Source/Python/Common/FdfClassObject.py                       |   1 +
 BaseTools/Source/Python/Common/FdfParserLite.py                        |  47 +++---
 BaseTools/Source/Python/Common/InfClassObject.py                       | 113 ++++++-------
 BaseTools/Source/Python/Common/LongFilePathOs.py                       |   2 +-
 BaseTools/Source/Python/Common/MigrationUtilities.py                   |   1 +
 BaseTools/Source/Python/Common/Misc.py                                 |  70 ++++----
 BaseTools/Source/Python/Common/Parsing.py                              |   1 +
 BaseTools/Source/Python/Common/RangeExpression.py                      |  32 ++--
 BaseTools/Source/Python/Common/String.py                               |   7 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                 |  15 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py                   |   3 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                          |  23 +--
 BaseTools/Source/Python/Ecc/CParser.py                                 | 173 ++++++++++----------
 BaseTools/Source/Python/Ecc/Check.py                                   |   1 +
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                   |  69 ++++----
 BaseTools/Source/Python/Ecc/Configuration.py                           |   5 +-
 BaseTools/Source/Python/Ecc/Exception.py                               |   3 +-
 BaseTools/Source/Python/Ecc/MetaDataParser.py                          |   3 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py         |   5 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py        |  41 ++---
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                         |   9 +-
 BaseTools/Source/Python/Ecc/c.py                                       |  15 +-
 BaseTools/Source/Python/Eot/CParser.py                                 | 173 ++++++++++----------
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                   |  61 +++----
 BaseTools/Source/Python/Eot/FvImage.py                                 |  17 +-
 BaseTools/Source/Python/Eot/InfParserLite.py                           |   8 +-
 BaseTools/Source/Python/Eot/Parser.py                                  |   2 +-
 BaseTools/Source/Python/Eot/c.py                                       |  23 +--
 BaseTools/Source/Python/GenFds/AprioriSection.py                       |   7 +-
 BaseTools/Source/Python/GenFds/Capsule.py                              |  10 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                          |   6 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                           |   6 +-
 BaseTools/Source/Python/GenFds/Fd.py                                   |  12 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                            |  43 ++---
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                     |   5 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                      |  16 +-
 BaseTools/Source/Python/GenFds/Fv.py                                   |  13 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                       |   8 +-
 BaseTools/Source/Python/GenFds/GenFds.py                               |  20 ++-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |  10 +-
 BaseTools/Source/Python/GenFds/OptionRom.py                            |   3 -
 BaseTools/Source/Python/GenFds/Region.py                               |  14 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py           |   9 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                 |   1 +
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  32 ++--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |  30 ++--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |  36 ++--
 BaseTools/Source/Python/TargetTool/TargetTool.py                       |  39 ++---
 BaseTools/Source/Python/Trim/Trim.py                                   |  25 +--
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                    |  12 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py           |   4 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py                           |   2 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                              |   6 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                        |  12 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py                  |  15 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                  |  42 ++---
 BaseTools/Source/Python/UPT/InstallPkg.py                              |   2 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                             |   2 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                  |   5 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py              |  11 +-
 BaseTools/Source/Python/UPT/Library/Misc.py                            |  11 +-
 BaseTools/Source/Python/UPT/Library/ParserValidate.py                  |   2 +-
 BaseTools/Source/Python/UPT/Library/Parsing.py                         |   3 +-
 BaseTools/Source/Python/UPT/Library/String.py                          |   5 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                  |  20 ++-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                 |   4 +-
 BaseTools/Source/Python/UPT/MkPkg.py                                   |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py           |   6 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py           |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py             |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py   |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                   |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py         |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py              |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py              |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py         |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py           |   3 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py    |   4 +-
 BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                    |   1 +
 BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                 |   3 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py              |  57 +++----
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py              |   3 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py          |   3 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                              |   2 +-
 BaseTools/Source/Python/UPT/RmPkg.py                                   |   2 +-
 BaseTools/Source/Python/UPT/TestInstall.py                             |   4 +-
 BaseTools/Source/Python/UPT/UPT.py                                     |   9 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                  |   5 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py           |  10 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                           |   2 +-
 BaseTools/Source/Python/UPT/Xml/IniToXml.py                            |   1 +
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                           |  25 +--
 BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                       |   3 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py                  |   2 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py                      |  14 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                      | 142 ++++++++--------
 BaseTools/Source/Python/Workspace/InfBuildData.py                      |   3 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                    |  74 +++++----
 BaseTools/Source/Python/Workspace/MetaFileTable.py                     |  10 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                   |   2 +-
 BaseTools/Source/Python/build/BuildReport.py                           |  17 +-
 BaseTools/Source/Python/build/build.py                                 |  35 ++--
 BaseTools/Tests/CheckPythonSyntax.py                                   |   2 +-
 BaseTools/Tests/TestTools.py                                           |  13 +-
 BaseTools/Tests/TianoCompress.py                                       |   6 +-
 BaseTools/gcc/mingw-gcc-build.py                                       | 112 ++++++-------
 136 files changed, 1559 insertions(+), 1477 deletions(-)

-- 
2.15.1




* [PATCH 01/15] BaseTools: Refactor python except statements
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 02/15] BaseTools: Refactor python print statements Gary Lin
                   ` (14 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Convert "except ... ," to "except ... as" to be compatible with python3.
Based on "futurize -f lib2to3.fixes.fix_except"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/UpdateBuildVersions.py                        |  12 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                      |  10 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                     |   2 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                      |   2 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py               |   4 +-
 BaseTools/Source/Python/Common/Expression.py                    |  16 +--
 BaseTools/Source/Python/Common/FdfParserLite.py                 |   6 +-
 BaseTools/Source/Python/Common/Misc.py                          |   8 +-
 BaseTools/Source/Python/Common/RangeExpression.py               |   6 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                   |   2 +-
 BaseTools/Source/Python/Ecc/CParser.py                          | 142 ++++++++++----------
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py  |   2 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py |  14 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                  |   2 +-
 BaseTools/Source/Python/Ecc/c.py                                |   2 +-
 BaseTools/Source/Python/Eot/CParser.py                          | 142 ++++++++++----------
 BaseTools/Source/Python/Eot/FvImage.py                          |   2 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                     |  10 +-
 BaseTools/Source/Python/GenFds/GenFds.py                        |   4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py          |   2 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                |   2 +-
 BaseTools/Source/Python/Trim/Trim.py                            |   4 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py             |   4 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py    |   4 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                       |   2 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                 |  12 +-
 BaseTools/Source/Python/UPT/InstallPkg.py                       |   2 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                      |   2 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py           |   2 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py       |   8 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py           |   8 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py          |   2 +-
 BaseTools/Source/Python/UPT/MkPkg.py                            |   2 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                       |   2 +-
 BaseTools/Source/Python/UPT/RmPkg.py                            |   2 +-
 BaseTools/Source/Python/UPT/TestInstall.py                      |   4 +-
 BaseTools/Source/Python/UPT/UPT.py                              |   4 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               |   8 +-
 BaseTools/Source/Python/Workspace/InfBuildData.py               |   2 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py             |  16 +--
 BaseTools/Source/Python/Workspace/MetaFileTable.py              |   4 +-
 BaseTools/Source/Python/build/BuildReport.py                    |   2 +-
 BaseTools/Source/Python/build/build.py                          |  10 +-
 BaseTools/Tests/CheckPythonSyntax.py                            |   2 +-
 BaseTools/gcc/mingw-gcc-build.py                                |   2 +-
 45 files changed, 253 insertions(+), 249 deletions(-)

diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index e62030aa9f0f..cff2e2263a8a 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -90,7 +90,8 @@ def ShellCommandResults(CmdLine, Opt):
             sys.stderr.flush()
         returnValue = err_val.returncode
 
-    except IOError as (errno, strerror):
+    except IOError as err_arg:
+        (errno, strerror) = err_arg.args
         file_list.close()
         if not Opt.silent:
             sys.stderr.write("I/O ERROR : %s : %s\n" % (str(errno), strerror))
@@ -100,7 +101,8 @@ def ShellCommandResults(CmdLine, Opt):
             sys.stderr.flush()
         returnValue = errno
 
-    except OSError as (errno, strerror):
+    except OSError as err_arg:
+        (errno, strerror) = err_arg.args
         file_list.close()
         if not Opt.silent:
             sys.stderr.write("OS ERROR : %s : %s\n" % (str(errno), strerror))
@@ -210,13 +212,15 @@ def RevertCmd(Filename, Opt):
             sys.stderr.write("Subprocess ERROR : %s\n" % err_val)
             sys.stderr.flush()
 
-    except IOError as (errno, strerror):
+    except IOError as err_arg:
+        (errno, strerror) = err_arg.args
         if not Opt.silent:
             sys.stderr.write("I/O ERROR : %d : %s\n" % (str(errno), strerror))
             sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
             sys.stderr.flush()
 
-    except OSError as (errno, strerror):
+    except OSError as err_arg:
+        (errno, strerror) = err_arg.args
         if not Opt.silent:
             sys.stderr.write("OS ERROR : %d : %s\n" % (str(errno), strerror))
             sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 0f7454f55a7a..faec5506a0e6 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -431,7 +431,7 @@ class WorkspaceAutoGen(AutoGen):
                                     if pcdvalue.startswith('H'):
                                         try:
                                             pcdvalue = ValueExpressionEx(pcdvalue[1:], PcdDatumType, self._GuidDict)(True)
-                                        except BadExpression, Value:
+                                        except BadExpression as Value:
                                             if Value.result > 1:
                                                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                                                 (TokenSpaceGuidCName, TokenCName, pcdvalue, Value))
@@ -448,7 +448,7 @@ class WorkspaceAutoGen(AutoGen):
                                             if pcdvalue.startswith('H'):
                                                 try:
                                                     pcdvalue = ValueExpressionEx(pcdvalue[1:], PcdDatumType, self._GuidDict)(True)
-                                                except BadExpression, Value:
+                                                except BadExpression as Value:
                                                     EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %
                                                                     (TokenSpaceGuidCName, TokenCName, pcdvalue, Value))
                                                 pcdvalue = 'H' + pcdvalue
@@ -2469,9 +2469,9 @@ class PlatformAutoGen(AutoGen):
             if PcdValue:
                 try:
                     ToPcd.DefaultValue = ValueExpression(PcdValue)(True)
-                except WrnExpression, Value:
+                except WrnExpression as Value:
                     ToPcd.DefaultValue = Value.result
-                except BadExpression, Value:
+                except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
                                     File=self.MetaFile)
             if ToPcd.DefaultValue:
@@ -2481,7 +2481,7 @@ class PlatformAutoGen(AutoGen):
                     _GuidDict.update(Guids)
                 try:
                     ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, _GuidDict)(True)
-                except BadExpression, Value:
+                except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
                                         File=self.MetaFile)
 
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 7aa22bd944a0..98a43db7a4e5 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -448,7 +448,7 @@ def Main():
                     os.utime(Option.OutputFile, None)
         else:
             Dpx.Generate()
-    except BaseException, X:
+    except BaseException as X:
         EdkLogger.quiet("")
         if Option != None and Option.debug != None:
             EdkLogger.quiet(traceback.format_exc())
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 7d3374a49373..3f98a34d81ec 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1027,7 +1027,7 @@ cleanlib:
             else:
                 try:
                     Fd = open(F.Path, 'r')
-                except BaseException, X:
+                except BaseException as X:
                     EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
 
                 FileContent = Fd.read()
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 856d19cda270..2711fc104f52 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -242,7 +242,7 @@ class UniFileClassObject(object):
         if len(Lang) != 3:
             try:
                 FileIn = self.OpenUniFile(LongFilePath(File.Path))
-            except UnicodeError, X:
+            except UnicodeError as X:
                 EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File);
             except:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File);
@@ -393,7 +393,7 @@ class UniFileClassObject(object):
 
         try:
             FileIn = self.OpenUniFile(LongFilePath(File.Path))
-        except UnicodeError, X:
+        except UnicodeError as X:
             EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File.Path);
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File.Path);
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 55fa06d414ea..216694325f96 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -231,7 +231,7 @@ class ValueExpression(object):
         }
         try:
             Val = eval(EvalStr, {}, Dict)
-        except Exception, Excpt:
+        except Exception as Excpt:
             raise BadExpression(str(Excpt))
 
         if Operator in ['and', 'or']:
@@ -351,7 +351,7 @@ class ValueExpression(object):
                 continue
             try:
                 Val = self.Eval(Op, Val, EvalFunc())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -390,7 +390,7 @@ class ValueExpression(object):
                 Op += ' ' + self._Token
             try:
                 Val = self.Eval(Op, Val, self._RelExpr())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -416,14 +416,14 @@ class ValueExpression(object):
             Val = self._UnaryExpr()
             try:
                 return self.Eval('not', Val)
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 return Warn.result
         if self._IsOperator(["~"]):
             Val = self._UnaryExpr()
             try:
                 return self.Eval('~', Val)
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 return Warn.result
         return self._IdenExpr()
@@ -734,7 +734,7 @@ class ValueExpressionEx(ValueExpression):
         PcdValue = self.PcdValue
         try:
             PcdValue = ValueExpression.__call__(self, RealValue, Depth)
-        except WrnExpression, Value:
+        except WrnExpression as Value:
             PcdValue = Value.result
 
         if PcdValue == 'True':
@@ -885,8 +885,8 @@ if __name__ == '__main__':
         try:
             print ValueExpression(input)(True)
             print ValueExpression(input)(False)
-        except WrnExpression, Ex:
+        except WrnExpression as Ex:
             print Ex.result
             print str(Ex)
-        except Exception, Ex:
+        except Exception as Ex:
             print str(Ex)
diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/Source/Python/Common/FdfParserLite.py
index 7d129bfcab59..ac03c3fef5bb 100644
--- a/BaseTools/Source/Python/Common/FdfParserLite.py
+++ b/BaseTools/Source/Python/Common/FdfParserLite.py
@@ -1190,7 +1190,7 @@ class FdfParser(object):
 #                pass
             
 
-        except Warning, X:
+        except Warning as X:
             self.__UndoToken()
             FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
             X.message += '\nGot Token: \"%s\" from File %s\n' % (self.__Token, FileLineTuple[0]) + \
@@ -3659,7 +3659,7 @@ if __name__ == "__main__":
     import sys
     try:
         test_file = sys.argv[1]
-    except IndexError, v:
+    except IndexError as v:
         print "Usage: %s filename" % sys.argv[0]
         sys.exit(1)
 
@@ -3667,7 +3667,7 @@ if __name__ == "__main__":
     try:
         parser.ParseFile()
         parser.CycleReferenceCheck()
-    except Warning, X:
+    except Warning as X:
         print X.message
     else:
         print "Success!"
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index a8ed718aa5d8..f1eb4c5a7892 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -522,7 +522,7 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
             Fd = open(File, "wb")
             Fd.write(Content)
             Fd.close()
-    except IOError, X:
+    except IOError as X:
         EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
 
     return True
@@ -556,7 +556,7 @@ def DataRestore(File):
     try:
         Fd = open(File, 'rb')
         Data = cPickle.load(Fd)
-    except Exception, e:
+    except Exception as e:
         EdkLogger.verbose("Failed to load [%s]\n\t%s" % (File, str(e)))
         Data = None
     finally:
@@ -1494,7 +1494,7 @@ def ParseDevPathValue (Value):
     try:
         p = subprocess.Popen(Cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
         out, err = p.communicate()
-    except Exception, X:
+    except Exception as X:
         raise BadExpression("DevicePath: %s" % (str(X)) )
     finally:
         subprocess._cleanup()
@@ -1549,7 +1549,7 @@ def ParseFieldValue (Value):
             Value = Value[1:-1]
         try:
             Value = "'" + uuid.UUID(Value).get_bytes_le() + "'"
-        except ValueError, Message:
+        except ValueError as Message:
             raise BadExpression('%s' % Message)
         Value, Size = ParseFieldValue(Value)
         return Value, 16
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index b6c929fd885b..10b6ac55242b 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -441,7 +441,7 @@ class RangeExpression(object):
             Op = self._Token
             try:
                 Val = self.Eval(Op, Val, EvalFunc())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -464,7 +464,7 @@ class RangeExpression(object):
                 Op += ' ' + self._Token
             try:
                 Val = self.Eval(Op, Val, self._RelExpr())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -476,7 +476,7 @@ class RangeExpression(object):
             Val = self._NeExpr()
             try:
                 return self.Eval(Token, Val)
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 return Warn.result
         return self._IdenExpr()
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 716155e96d29..14ccabe833db 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -246,7 +246,7 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
                                         stdout=subprocess.PIPE, 
                                         stderr= subprocess.PIPE,
                                         shell=True)
-    except Exception, X:
+    except Exception as X:
         EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData="%s" % (str(X)))
     (out, error) = PopenObject.communicate()
     print out
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index baa521f43cc4..39883aca07c4 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -180,7 +180,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -539,7 +539,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -816,7 +816,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -971,7 +971,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1099,7 +1099,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1169,7 +1169,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1223,7 +1223,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1270,7 +1270,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1439,7 +1439,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1472,7 +1472,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1596,7 +1596,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1643,7 +1643,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1706,7 +1706,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1749,7 +1749,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1868,7 +1868,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1928,7 +1928,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2010,7 +2010,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2165,7 +2165,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2230,7 +2230,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2282,7 +2282,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2329,7 +2329,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2471,7 +2471,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3063,7 +3063,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3213,7 +3213,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3469,7 +3469,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3535,7 +3535,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3624,7 +3624,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3832,7 +3832,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3888,7 +3888,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3978,7 +3978,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4226,7 +4226,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4577,7 +4577,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4697,7 +4697,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4777,7 +4777,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4842,7 +4842,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4940,7 +4940,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5019,7 +5019,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5110,7 +5110,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5210,7 +5210,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5362,7 +5362,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5590,7 +5590,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5651,7 +5651,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5698,7 +5698,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5796,7 +5796,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6002,7 +6002,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6072,7 +6072,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6107,7 +6107,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8142,7 +8142,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8177,7 +8177,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8224,7 +8224,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8292,7 +8292,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8362,7 +8362,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8422,7 +8422,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8482,7 +8482,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8542,7 +8542,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8602,7 +8602,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8676,7 +8676,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8750,7 +8750,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8824,7 +8824,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9065,7 +9065,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9162,7 +9162,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9235,7 +9235,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9308,7 +9308,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12474,7 +12474,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12567,7 +12567,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -14537,7 +14537,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16258,7 +16258,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16329,7 +16329,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16442,7 +16442,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16593,7 +16593,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16710,7 +16710,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index a27e98c9752f..a4057ceb1775 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -98,7 +98,7 @@ class Table(object):
         SqlCommand = """drop table IF EXISTS %s""" % self.Table
         try:
             self.Cur.execute(SqlCommand)
-        except Exception, e:
+        except Exception as e:
             print "An error occurred when Drop a table:", e.args[0]
 
     ## Get count
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index ba478f9ecf10..2fef87c4180a 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -1183,7 +1183,7 @@ class DscParser(MetaFileParser):
 
             try:
                 Processer[self._ItemType]()
-            except EvaluationException, Excpt:
+            except EvaluationException as Excpt:
                 # 
                 # Only catch expression evaluation error here. We need to report
                 # the precise number of line on which the error occurred
@@ -1192,7 +1192,7 @@ class DscParser(MetaFileParser):
 #                 EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
 #                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList),
 #                                 Line=self._LineIndex+1)
-            except MacroException, Excpt:
+            except MacroException as Excpt:
                 EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList), 
                                 Line=self._LineIndex+1)
@@ -1305,10 +1305,10 @@ class DscParser(MetaFileParser):
             Macros.update(GlobalData.gGlobalDefines)
             try:
                 Result = ValueExpression(self._ValueList[1], Macros)()
-            except SymbolNotFound, Exc:
+            except SymbolNotFound as Exc:
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
-            except WrnExpression, Excpt:
+            except WrnExpression as Excpt:
                 # 
                 # Catch expression evaluation warning here. We need to report
                 # the precise number of line and return the evaluation result
@@ -1317,7 +1317,7 @@ class DscParser(MetaFileParser):
                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList), 
                                 Line=self._LineIndex+1)
                 Result = Excpt.result
-            except BadExpression, Exc:
+            except BadExpression as Exc:
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
 
@@ -1437,13 +1437,13 @@ class DscParser(MetaFileParser):
             PcdValue = ValueList[0]      
             try:
                 ValueList[0] = ValueExpression(PcdValue, self._Macros)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValueList[0] = Value.result          
         else:
             PcdValue = ValueList[-1]
             try:
                 ValueList[-1] = ValueExpression(PcdValue, self._Macros)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValueList[-1] = Value.result
             
             if ValueList[-1] == 'True':
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index b93588eea61a..4ce8edf5573a 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -214,7 +214,7 @@ def XmlParseFile(FileName):
         Dom = xml.dom.minidom.parse(XmlFile)
         XmlFile.close()
         return Dom
-    except Exception, X:
+    except Exception as X:
         print X
         return ""
 
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 35b7405e550d..8a4b10727a07 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -2627,7 +2627,7 @@ if __name__ == '__main__':
 #    CollectSourceCodeDataIntoDB(sys.argv[1])
     try:
         test_file = sys.argv[1]
-    except IndexError, v:
+    except IndexError as v:
         print "Usage: %s filename" % sys.argv[0]
         sys.exit(1)
     MsgList = CheckFuncHeaderDoxygenComments(test_file)
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index baa521f43cc4..39883aca07c4 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -180,7 +180,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -539,7 +539,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -816,7 +816,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -971,7 +971,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1099,7 +1099,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1169,7 +1169,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1223,7 +1223,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1270,7 +1270,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1439,7 +1439,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1472,7 +1472,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1596,7 +1596,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1643,7 +1643,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1706,7 +1706,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1749,7 +1749,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1868,7 +1868,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1928,7 +1928,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2010,7 +2010,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2165,7 +2165,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2230,7 +2230,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2282,7 +2282,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2329,7 +2329,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2471,7 +2471,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3063,7 +3063,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3213,7 +3213,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3469,7 +3469,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3535,7 +3535,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3624,7 +3624,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3832,7 +3832,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3888,7 +3888,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3978,7 +3978,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4226,7 +4226,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4577,7 +4577,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4697,7 +4697,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4777,7 +4777,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4842,7 +4842,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4940,7 +4940,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5019,7 +5019,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5110,7 +5110,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5210,7 +5210,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5362,7 +5362,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5590,7 +5590,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5651,7 +5651,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5698,7 +5698,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5796,7 +5796,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6002,7 +6002,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6072,7 +6072,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6107,7 +6107,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8142,7 +8142,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8177,7 +8177,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8224,7 +8224,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8292,7 +8292,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8362,7 +8362,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8422,7 +8422,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8482,7 +8482,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8542,7 +8542,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8602,7 +8602,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8676,7 +8676,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8750,7 +8750,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8824,7 +8824,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9065,7 +9065,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9162,7 +9162,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9235,7 +9235,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9308,7 +9308,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12474,7 +12474,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12567,7 +12567,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -14537,7 +14537,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16258,7 +16258,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16329,7 +16329,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16442,7 +16442,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16593,7 +16593,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16710,7 +16710,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 0f742c7d86c2..6696623aba68 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -1411,7 +1411,7 @@ def Main():
     try:
         Option = GetOptions()
         build.main()
-    except Exception, e:
+    except Exception as e:
         print e
         return 1
 
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 0190be884a33..15b2b792b2e1 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -938,7 +938,7 @@ class FdfParser:
                     return ValueExpression(Expression, MacroPcdDict)(True)
                 else:
                     return ValueExpression(Expression, MacroPcdDict)()
-            except WrnExpression, Excpt:
+            except WrnExpression as Excpt:
                 # 
                 # Catch expression evaluation warning here. We need to report
                 # the precise number of line and return the evaluation result
@@ -947,7 +947,7 @@ class FdfParser:
                                 File=self.FileName, ExtraData=self.__CurrentLine(), 
                                 Line=Line)
                 return Excpt.result
-            except Exception, Excpt:
+            except Exception as Excpt:
                 if hasattr(Excpt, 'Pcd'):
                     if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
                         Info = GlobalData.gPlatformOtherPcds[Excpt.Pcd]
@@ -1414,7 +1414,7 @@ class FdfParser:
             while self.__GetFd() or self.__GetFv() or self.__GetFmp() or self.__GetCapsule() or self.__GetVtf() or self.__GetRule() or self.__GetOptionRom():
                 pass
 
-        except Warning, X:
+        except Warning as X:
             self.__UndoToken()
             #'\n\tGot Token: \"%s\" from File %s\n' % (self.__Token, FileLineTuple[0]) + \
             # At this point, the closest parent would be the included file itself
@@ -4817,7 +4817,7 @@ if __name__ == "__main__":
     import sys
     try:
         test_file = sys.argv[1]
-    except IndexError, v:
+    except IndexError as v:
         print "Usage: %s filename" % sys.argv[0]
         sys.exit(1)
 
@@ -4825,7 +4825,7 @@ if __name__ == "__main__":
     try:
         parser.ParseFile()
         parser.CycleReferenceCheck()
-    except Warning, X:
+    except Warning as X:
         print str(X)
     else:
         print "Success!"
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 4a5d6f476abd..51b79397337c 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -337,10 +337,10 @@ def main():
         """Display FV space info."""
         GenFds.DisplayFvSpaceInfo(FdfParserObj)
 
-    except FdfParser.Warning, X:
+    except FdfParser.Warning as X:
         EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
         ReturnCode = FORMAT_INVALID
-    except FatalError, X:
+    except FatalError as X:
         if Options.debug != None:
             import traceback
             EdkLogger.quiet(traceback.format_exc())
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 371d5a8217f7..da955fe1a4f7 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -722,7 +722,7 @@ class GenFdsGlobalVariable:
 
         try:
             PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-        except Exception, X:
+        except Exception as X:
             EdkLogger.error("GenFds", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
         (out, error) = PopenObject.communicate()
 
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index bfdf763a7abc..882b016bf058 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -254,7 +254,7 @@ if __name__ == '__main__':
             FileHandle.RWFile('#', '=', 0)
         else:
             FileHandle.RWFile('#', '=', 1)
-    except Exception, e:
+    except Exception as e:
         last_type, last_value, last_tb = sys.exc_info()
         traceback.print_exception(last_type, last_value, last_tb)
 
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index d1e40b025caa..05ba86262133 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -667,7 +667,7 @@ def Main():
             EdkLogger.SetLevel(CommandOptions.LogLevel + 1)
         else:
             EdkLogger.SetLevel(CommandOptions.LogLevel)
-    except FatalError, X:
+    except FatalError as X:
         return 1
     
     try:
@@ -687,7 +687,7 @@ def Main():
             if CommandOptions.OutputFile == None:
                 CommandOptions.OutputFile = os.path.splitext(InputFile)[0] + '.iii'
             TrimPreprocessedFile(InputFile, CommandOptions.OutputFile, CommandOptions.ConvertHex, CommandOptions.TrimLong)
-    except FatalError, X:
+    except FatalError as X:
         import platform
         import traceback
         if CommandOptions != None and CommandOptions.LogLevel <= EdkLogger.DEBUG_9:
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 26c5a97da80f..3a7c9809e31a 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -394,7 +394,7 @@ def VerifyRemoveModuleDep(Path, DpPackagePathList):
                 return False
         else:
             return True
-    except FatalError, ErrCode:
+    except FatalError as ErrCode:
         if ErrCode.message == EDK1_INF_ERROR:
             Logger.Warn("UPT",
                         ST.WRN_EDK1_INF_FOUND%Path)
@@ -446,7 +446,7 @@ def VerifyReplaceModuleDep(Path, DpPackagePathList, OtherPkgList):
                     return False
         else:
             return True
-    except FatalError, ErrCode:
+    except FatalError as ErrCode:
         if ErrCode.message == EDK1_INF_ERROR:
             Logger.Warn("UPT",
                         ST.WRN_EDK1_INF_FOUND%Path)
diff --git a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
index 9c55e0ea88a7..81c67fb510a2 100644
--- a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
+++ b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
@@ -155,7 +155,7 @@ class DistributionPackageClass(object):
                                     ModuleObj.GetName(), \
                                     ModuleObj.GetCombinePath())] = ModuleObj
                         PackageObj.SetModuleDict(ModuleDict)
-                    except FatalError, ErrCode:
+                    except FatalError as ErrCode:
                         if ErrCode.message == EDK1_INF_ERROR:
                             Logger.Warn("UPT",
                                         ST.WRN_EDK1_INF_FOUND%Filename)
@@ -181,7 +181,7 @@ class DistributionPackageClass(object):
                                  ModuleObj.GetName(), 
                                  ModuleObj.GetCombinePath())
                     self.ModuleSurfaceArea[ModuleKey] = ModuleObj
-                except FatalError, ErrCode:
+                except FatalError as ErrCode:
                     if ErrCode.message == EDK1_INF_ERROR:
                         Logger.Error("UPT",
                                      EDK1_INF_ERROR,
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index f147963288ad..baf687ef99ba 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -230,7 +230,7 @@ class IpiDatabase(object):
             self._AddDp(DpObj.Header.GetGuid(), DpObj.Header.GetVersion(), \
                         NewDpPkgFileName, DpPkgFileName, RePackage)
     
-        except sqlite3.IntegrityError, DetailMsg:
+        except sqlite3.IntegrityError as DetailMsg:
             Logger.Error("UPT",
                          UPT_DB_UPDATE_ERROR,
                          ST.ERR_UPT_DB_UPDATE_ERROR,
diff --git a/BaseTools/Source/Python/UPT/Core/PackageFile.py b/BaseTools/Source/Python/UPT/Core/PackageFile.py
index 5fafd85bffbf..db4725b1a56d 100644
--- a/BaseTools/Source/Python/UPT/Core/PackageFile.py
+++ b/BaseTools/Source/Python/UPT/Core/PackageFile.py
@@ -51,7 +51,7 @@ class PackageFile:
             self._Files = {}
             for Filename in self._ZipFile.namelist():
                 self._Files[os.path.normpath(Filename)] = Filename
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_OPEN_FAILURE, 
                             ExtraData="%s (%s)" % (FileName, str(Xstr)))
 
@@ -106,7 +106,7 @@ class PackageFile:
                             ExtraData="[%s] in %s" % (Which, self._FileName))
         try:
             FileContent = self._ZipFile.read(self._Files[Which])
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_DECOMPRESS_FAILURE, 
                             ExtraData="[%s] in %s (%s)" % (Which, \
                                                            self._FileName, \
@@ -119,14 +119,14 @@ class PackageFile:
                 return
             else:
                 ToFile = __FileHookOpen__(ToDest, 'wb')
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_OPEN_FAILURE, 
                             ExtraData="%s (%s)" % (ToDest, str(Xstr)))
 
         try:
             ToFile.write(FileContent)
             ToFile.close()
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_WRITE_FAILURE, 
                             ExtraData="%s (%s)" % (ToDest, str(Xstr)))
 
@@ -228,7 +228,7 @@ class PackageFile:
                     return
             Logger.Info("packing ..." + File)
             self._ZipFile.write(File, ArcName)
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
                             ExtraData="%s (%s)" % (File, str(Xstr)))
 
@@ -242,7 +242,7 @@ class PackageFile:
             if os.path.splitext(ArcName)[1].lower() == '.pkg':
                 Data = Data.encode('utf_8')
             self._ZipFile.writestr(ArcName, Data)
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
                             ExtraData="%s (%s)" % (ArcName, str(Xstr)))
 
diff --git a/BaseTools/Source/Python/UPT/InstallPkg.py b/BaseTools/Source/Python/UPT/InstallPkg.py
index a8d0e1ec440a..e268f7892290 100644
--- a/BaseTools/Source/Python/UPT/InstallPkg.py
+++ b/BaseTools/Source/Python/UPT/InstallPkg.py
@@ -537,7 +537,7 @@ def Main(Options = None):
                       Options, Dep, WorkspaceDir, DataBase)
         ReturnCode = 0
         
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
diff --git a/BaseTools/Source/Python/UPT/InventoryWs.py b/BaseTools/Source/Python/UPT/InventoryWs.py
index 824e1c288947..cd92753a8d4b 100644
--- a/BaseTools/Source/Python/UPT/InventoryWs.py
+++ b/BaseTools/Source/Python/UPT/InventoryWs.py
@@ -92,7 +92,7 @@ def Main(Options = None):
         DataBase = GlobalData.gDB
         InventoryDistInstalled(DataBase)     
         ReturnCode = 0       
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
diff --git a/BaseTools/Source/Python/UPT/Library/CommentParsing.py b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
index e6d45103f94b..9cd7b60e16ab 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentParsing.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
@@ -217,7 +217,7 @@ def ParsePcdErrorCode (Value = None, ContainerFile = None, LineNum = None):
         # To delete the tailing 'L'
         #
         return hex(ErrorCode)[:-1]
-    except ValueError, XStr:
+    except ValueError as XStr:
         if XStr:
             pass
         Logger.Error('Parser', 
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index 090c7eb95716..ca21e6995217 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -297,7 +297,7 @@ class _LogicalExpressionParser(_ExprBase):
         try:
             if self.LogicalExpression() not in [self.ARITH, self.LOGICAL, self.REALLOGICAL, self.STRINGITEM]:
                 return False, ST.ERR_EXPR_LOGICAL % self.Token
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
         self.SkipWhitespace()
         if self.Index != self.Len:
@@ -327,7 +327,7 @@ class _ValidRangeExpressionParser(_ExprBase):
         try:
             if self.RangeExpression() not in [self.HEX, self.INT]:
                 return False, ST.ERR_EXPR_RANGE % self.Token
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
         
         self.SkipWhitespace()
@@ -423,7 +423,7 @@ class _ValidListExpressionParser(_ExprBase):
         try:
             if self.ListExpression() not in [self.NUM]:
                 return False, ST.ERR_EXPR_LIST % self.Token
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
 
         self.SkipWhitespace()
@@ -457,7 +457,7 @@ class _StringTestParser(_ExprBase):
             return False, ST.ERR_EXPR_EMPTY
         try:
             self.StringTest()
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
         return True, ''
 
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 1fbbf2e49887..b00bba1f8440 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -329,9 +329,9 @@ class UniFileClassObject(object):
         if len(Lang) != 3:
             try:
                 FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
-            except UnicodeError, Xstr:
+            except UnicodeError as Xstr:
                 FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
-            except UnicodeError, Xstr:
+            except UnicodeError as Xstr:
                 FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
             except:
                 EdkLogger.Error("Unicode File Parser", 
@@ -438,7 +438,7 @@ class UniFileClassObject(object):
 
         try:
             FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
-        except UnicodeError, Xstr:
+        except UnicodeError as Xstr:
             FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
         except UnicodeError:
             FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
@@ -1060,7 +1060,7 @@ class UniFileClassObject(object):
                              ExtraData=FilaPath)
         try:
             FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_8').readlines()
-        except UnicodeError, Xstr:
+        except UnicodeError as Xstr:
             FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16').readlines()
         except UnicodeError:
             FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16_le').readlines()
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index d7614b884990..fd02efb6bf04 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -224,6 +224,6 @@ def XmlParseFile(FileName):
         Dom = xml.dom.minidom.parse(XmlFile)
         XmlFile.close()
         return Dom
-    except BaseException, XExcept:
+    except BaseException as XExcept:
         XmlFile.close()
         Logger.Error('\nUPT', PARSER_ERROR, XExcept, File=FileName, RaiseError=True)
diff --git a/BaseTools/Source/Python/UPT/MkPkg.py b/BaseTools/Source/Python/UPT/MkPkg.py
index 87c84f0cc25b..99d6bcc19220 100644
--- a/BaseTools/Source/Python/UPT/MkPkg.py
+++ b/BaseTools/Source/Python/UPT/MkPkg.py
@@ -213,7 +213,7 @@ def Main(Options = None):
         Logger.Quiet(ST.MSG_FINISH)
         ReturnCode = 0
 
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]        
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % \
diff --git a/BaseTools/Source/Python/UPT/ReplacePkg.py b/BaseTools/Source/Python/UPT/ReplacePkg.py
index efbf68a4ecc6..6f52b4f8f8e8 100644
--- a/BaseTools/Source/Python/UPT/ReplacePkg.py
+++ b/BaseTools/Source/Python/UPT/ReplacePkg.py
@@ -71,7 +71,7 @@ def Main(Options = None):
         InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir, DataBase)
         ReturnCode = 0
         
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
diff --git a/BaseTools/Source/Python/UPT/RmPkg.py b/BaseTools/Source/Python/UPT/RmPkg.py
index ea842c11859f..6427a8f16c88 100644
--- a/BaseTools/Source/Python/UPT/RmPkg.py
+++ b/BaseTools/Source/Python/UPT/RmPkg.py
@@ -157,7 +157,7 @@ def Main(Options = None):
         
         ReturnCode = 0
         
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]        
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
diff --git a/BaseTools/Source/Python/UPT/TestInstall.py b/BaseTools/Source/Python/UPT/TestInstall.py
index 899cae56aa87..d8918737f907 100644
--- a/BaseTools/Source/Python/UPT/TestInstall.py
+++ b/BaseTools/Source/Python/UPT/TestInstall.py
@@ -68,12 +68,12 @@ def Main(Options=None):
         else:
             Logger.Quiet(ST.MSG_TEST_INSTALL_FAIL)
 
-    except TE.FatalError, XExcept:
+    except TE.FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
 
-    except Exception, x:
+    except Exception as x:
         ReturnCode = TE.CODE_ERROR
         Logger.Error(
                     "\nTestInstallPkg",
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 325b96bf560d..0bfcc44e3f19 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -179,7 +179,7 @@ def Main():
 
     try:
         GlobalData.gWORKSPACE, GlobalData.gPACKAGE_PATH = GetWorkspace()
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
         return XExcept.args[0]
@@ -294,7 +294,7 @@ def Main():
             return OPTION_MISSING
 
         ReturnCode = RunModule(Opt)
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 4a87fd176294..5824266dc4fe 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -819,11 +819,11 @@ class DscBuildData(PlatformBuildClassObject):
         if ValueList[Index] and PcdType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
             try:
                 ValueList[Index] = ValueExpression(ValueList[Index], GlobalData.gPlatformPcds)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValueList[Index] = Value.result
-            except BadExpression, Value:
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, Value, File=self.MetaFile, Line=self._LineIndex + 1)
-            except EvaluationException, Excpt:
+            except EvaluationException as Excpt:
                 if hasattr(Excpt, 'Pcd'):
                     if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
                         EdkLogger.error('Parser', FORMAT_INVALID, "Cannot use this PCD (%s) in an expression as"
@@ -840,7 +840,7 @@ class DscBuildData(PlatformBuildClassObject):
             DatumType = self._DecPcds[PcdCName, TokenSpaceGuid].DatumType
             try:
                 ValueList[Index] = ValueExpressionEx(ValueList[Index], DatumType, self._GuidDict)(True)
-            except BadExpression, Value:
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, Value, File=self.MetaFile, Line=LineNo,
                                 ExtraData="PCD [%s.%s] Value \"%s\" " % (TokenSpaceGuid, PcdCName, ValueList[Index]))
             Valid, ErrStr = CheckPcdDatum(self._DecPcds[PcdCName, TokenSpaceGuid].DatumType, ValueList[Index])
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 7ea9b56d5dec..67c08ee47841 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -1148,7 +1148,7 @@ class InfBuildData(ModuleBuildClassObject):
                     else:
                         try:
                             Pcd.DefaultValue = ValueExpressionEx(Pcd.DefaultValue, Pcd.DatumType, self.Guids)(True)
-                        except BadExpression, Value:
+                        except BadExpression as Value:
                             EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(TokenSpaceGuid, PcdRealName, Pcd.DefaultValue, Value),
                                             File=self.MetaFile, Line=LineNo)
                     break
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index b2b0e282eb91..74fa4d31b109 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -1327,7 +1327,7 @@ class DscParser(MetaFileParser):
                 self._InSubsection = False
             try:
                 Processer[self._ItemType]()
-            except EvaluationException, Excpt:
+            except EvaluationException as Excpt:
                 # 
                 # Only catch expression evaluation error here. We need to report
                 # the precise number of line on which the error occurred
@@ -1349,7 +1349,7 @@ class DscParser(MetaFileParser):
                     EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
                                     File=self._FileWithError, ExtraData=' '.join(self._ValueList),
                                     Line=self._LineIndex + 1)
-            except MacroException, Excpt:
+            except MacroException as Excpt:
                 EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList),
                                 Line=self._LineIndex + 1)
@@ -1447,10 +1447,10 @@ class DscParser(MetaFileParser):
             Macros.update(GlobalData.gGlobalDefines)
             try:
                 Result = ValueExpression(self._ValueList[1], Macros)()
-            except SymbolNotFound, Exc:
+            except SymbolNotFound as Exc:
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
-            except WrnExpression, Excpt:
+            except WrnExpression as Excpt:
                 # 
                 # Catch expression evaluation warning here. We need to report
                 # the precise number of line and return the evaluation result
@@ -1591,7 +1591,7 @@ class DscParser(MetaFileParser):
         if PcdValue and "." not in self._ValueList[0]:
             try:
                 ValList[Index] = ValueExpression(PcdValue, self._Macros)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValList[Index] = Value.result
 
         if ValList[Index] == 'True':
@@ -1988,15 +1988,15 @@ class DecParser(MetaFileParser):
             if PcdValue:
                 try:
                     ValueList[0] = ValueExpression(PcdValue, self._AllPcdDict)(True)
-                except WrnExpression, Value:
+                except WrnExpression as Value:
                     ValueList[0] = Value.result
-                except BadExpression, Value:
+                except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, Value, File=self.MetaFile, Line=self._LineIndex + 1)
 
             if ValueList[0]:
                 try:
                     ValueList[0] = ValueExpressionEx(ValueList[0], ValueList[1], self._GuidDict)(True)
-                except BadExpression, Value:
+                except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, Value, ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
             # check format of default value against the datum type
             IsValid, Cause = CheckPcdDatum(ValueList[1], ValueList[0])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index d8549c9d66e6..92fcf6dd2b22 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -63,7 +63,7 @@ class MetaFileTable(Table):
                 # update the timestamp in database
                 self._FileIndexTable.SetFileTimeStamp(self.IdBase, TimeStamp)
                 return False
-        except Exception, Exc:
+        except Exception as Exc:
             EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc))
             return False
         return True
@@ -250,7 +250,7 @@ class PackageTable(MetaFileTable):
                 if comment.startswith("@Expression"):
                     comment = comment.replace("@Expression", "", 1)
                     expressions.append(comment.split("|")[1].strip())
-        except Exception, Exc:
+        except Exception as Exc:
             ValidType = ""
             if oricomment.startswith("@ValidRange"):
                 ValidType = "@ValidRange"
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index d6e943d2f1d4..c3bfecf8cc66 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -649,7 +649,7 @@ class ModuleReport(object):
                 cmd = ["GenFw", "--rebase", str(0), "-o", Tempfile, DefaultEFIfile]
                 try:
                     PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-                except Exception, X:
+                except Exception as X:
                     EdkLogger.error("GenFw", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
                 EndOfProcedure = threading.Event()
                 EndOfProcedure.clear()
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index de19756d99cb..0379fd8baf1e 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -546,7 +546,7 @@ class BuildTask:
                 EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join([Th.getName() for Th in threading.enumerate()]))
                 # avoid tense loop
                 time.sleep(0.1)
-        except BaseException, X:
+        except BaseException as X:
             #
             # TRICK: hide the output of threads left runing, so that the user can
             #        catch the error message easily
@@ -1316,7 +1316,7 @@ class Build():
             try:
                 #os.rmdir(AutoGenObject.BuildDir)
                 RemoveDirectory(AutoGenObject.BuildDir, True)
-            except WindowsError, X:
+            except WindowsError as X:
                 EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
         return True
 
@@ -1406,7 +1406,7 @@ class Build():
             try:
                 #os.rmdir(AutoGenObject.BuildDir)
                 RemoveDirectory(AutoGenObject.BuildDir, True)
-            except WindowsError, X:
+            except WindowsError as X:
                 EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
         return True
 
@@ -2488,14 +2488,14 @@ def Main():
         # All job done, no error found and no exception raised
         #
         BuildError = False
-    except FatalError, X:
+    except FatalError as X:
         if MyBuild != None:
             # for multi-thread build exits safely
             MyBuild.Relinquish()
         if Option != None and Option.debug != None:
             EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
         ReturnCode = X.args[0]
-    except Warning, X:
+    except Warning as X:
         # error from Fdf parser
         if MyBuild != None:
             # for multi-thread build exits safely
diff --git a/BaseTools/Tests/CheckPythonSyntax.py b/BaseTools/Tests/CheckPythonSyntax.py
index 61a048ad5d05..a55b29de4713 100644
--- a/BaseTools/Tests/CheckPythonSyntax.py
+++ b/BaseTools/Tests/CheckPythonSyntax.py
@@ -29,7 +29,7 @@ class Tests(TestTools.BaseToolsTest):
     def SingleFileTest(self, filename):
         try:
             py_compile.compile(filename, doraise=True)
-        except Exception, e:
+        except Exception as e:
             self.fail('syntax error: %s, Error is %s' % (filename, str(e)))
 
 def MakePythonSyntaxCheckTests():
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 420b3dea80f7..858b4020ef9f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -337,7 +337,7 @@ class SourceFiles:
                     print '[KeyboardInterrupt]'
                     return False
 
-                except Exception, e:
+                except Exception as e:
                     print e
 
             if not completed: return False
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 02/15] BaseTools: Refactor python print statements
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
  2018-01-19  4:43 ` [PATCH 01/15] BaseTools: Refactor python except statements Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 03/15] BaseTools: Remove the old python "not-equal" Gary Lin
                   ` (13 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Refactor print statements to be compatible with python 3.
Based on "futurize -f libfuturize.fixes.fix_print_with_import"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                      |   3 +-
 BaseTools/Scripts/BinToPcd.py                                          |  37 +++---
 BaseTools/Scripts/MemoryProfileSymbolGen.py                            |  14 +--
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                        |  20 +--
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |   5 +-
 BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  31 ++---
 BaseTools/Source/Python/AutoGen/UniClassObject.py                      |   7 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py             |   5 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                   |   3 +-
 BaseTools/Source/Python/Common/DecClassObject.py                       |  39 +++---
 BaseTools/Source/Python/Common/Dictionary.py                           |   7 +-
 BaseTools/Source/Python/Common/DscClassObject.py                       |  67 +++++-----
 BaseTools/Source/Python/Common/EdkIIWorkspace.py                       |  23 ++--
 BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py                  | 133 ++++++++++----------
 BaseTools/Source/Python/Common/Expression.py                           |  11 +-
 BaseTools/Source/Python/Common/FdfParserLite.py                        |  29 ++---
 BaseTools/Source/Python/Common/InfClassObject.py                       | 113 ++++++++---------
 BaseTools/Source/Python/Common/RangeExpression.py                      |   5 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                 |  13 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                          |   3 +-
 BaseTools/Source/Python/Ecc/CParser.py                                 |   3 +-
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                   |  69 +++++-----
 BaseTools/Source/Python/Ecc/Configuration.py                           |   5 +-
 BaseTools/Source/Python/Ecc/Exception.py                               |   3 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py         |   3 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                         |   5 +-
 BaseTools/Source/Python/Ecc/c.py                                       |  13 +-
 BaseTools/Source/Python/Eot/CParser.py                                 |   3 +-
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                   |  61 ++++-----
 BaseTools/Source/Python/Eot/FvImage.py                                 |  13 +-
 BaseTools/Source/Python/Eot/InfParserLite.py                           |   7 +-
 BaseTools/Source/Python/Eot/c.py                                       |   3 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                            |   7 +-
 BaseTools/Source/Python/GenFds/GenFds.py                               |   3 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |   3 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py           |   7 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  23 ++--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |  15 +--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |  17 +--
 BaseTools/Source/Python/TargetTool/TargetTool.py                       |  23 ++--
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py              |   3 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                  |   9 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py              |  51 ++++----
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                  |   5 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py           |   9 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                      |  19 +--
 BaseTools/Source/Python/Workspace/MetaFileParser.py                    |   3 +-
 BaseTools/Source/Python/build/build.py                                 |   3 +-
 BaseTools/Tests/TestTools.py                                           |   5 +-
 BaseTools/Tests/TianoCompress.py                                       |   5 +-
 BaseTools/gcc/mingw-gcc-build.py                                       |  99 +++++++--------
 51 files changed, 557 insertions(+), 508 deletions(-)

diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index 69fd2d54413e..dd66c7111ac0 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -23,6 +23,7 @@
 #
 # ExceptionList if a tool takes an argument with a / add it to the exception list
 #
+from __future__ import print_function
 import sys
 import os
 import subprocess
@@ -86,7 +87,7 @@ if __name__ == "__main__":
      ret = main(sys.argv[2:])
 
   except:
-    print "exiting: exception from " + sys.argv[0]
+    print("exiting: exception from " + sys.argv[0])
     ret = 2
 
   sys.exit(ret)
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 68a7ac652d70..c4e7b8a5c2e2 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -14,6 +14,7 @@
 '''
 BinToPcd
 '''
+from __future__ import print_function
 
 import sys
 import argparse
@@ -98,7 +99,7 @@ if __name__ == '__main__':
     Buffer = args.InputFile.read()
     args.InputFile.close()
   except:
-    print 'BinToPcd: error: can not read binary input file'
+    print('BinToPcd: error: can not read binary input file')
     sys.exit()
 
   #
@@ -109,7 +110,7 @@ if __name__ == '__main__':
     # If PcdName is None, then only a PCD value is being requested.
     Pcd = ByteArray (Buffer)
     if args.Verbose:
-      print 'PcdToBin: Convert binary file to PCD Value'
+      print('PcdToBin: Convert binary file to PCD Value')
   elif args.PcdType is None:
     #
     # If --type is neither VPD nor HII, then use PCD statement syntax that is
@@ -123,18 +124,18 @@ if __name__ == '__main__':
       #
       Pcd = '  %s|%s' % (args.PcdName, ByteArray (Buffer))
     elif args.MaxSize < len(Buffer):
-      print 'BinToPcd: error: argument --max-size is smaller than input file.'
+      print('BinToPcd: error: argument --max-size is smaller than input file.')
       sys.exit()
     else:
       Pcd = '  %s|%s|VOID*|%d' % (args.PcdName, ByteArray (Buffer), args.MaxSize)
       args.MaxSize = len(Buffer)
     
     if args.Verbose:
-      print 'PcdToBin: Convert binary file to PCD statement compatible with PCD sections:'
-      print '    [PcdsFixedAtBuild]'
-      print '    [PcdsPatchableInModule]'
-      print '    [PcdsDynamicDefault]'
-      print '    [PcdsDynamicExDefault]'
+      print('PcdToBin: Convert binary file to PCD statement compatible with PCD sections:')
+      print('    [PcdsFixedAtBuild]')
+      print('    [PcdsPatchableInModule]')
+      print('    [PcdsDynamicDefault]')
+      print('    [PcdsDynamicExDefault]')
   elif args.PcdType == 'VPD':
     if args.MaxSize is None:
       #
@@ -143,7 +144,7 @@ if __name__ == '__main__':
       #
       args.MaxSize = len(Buffer)
     if args.MaxSize < len(Buffer):
-      print 'BinToPcd: error: argument --max-size is smaller than input file.'
+      print('BinToPcd: error: argument --max-size is smaller than input file.')
       sys.exit()
     if args.Offset is None:
       #
@@ -157,15 +158,15 @@ if __name__ == '__main__':
       #
       Pcd = '  %s|%d|%d|%s' % (args.PcdName, args.Offset, args.MaxSize, ByteArray (Buffer))
     if args.Verbose:
-      print 'PcdToBin: Convert binary file to PCD statement compatible with PCD sections'
-      print '    [PcdsDynamicVpd]'
-      print '    [PcdsDynamicExVpd]'
+      print('PcdToBin: Convert binary file to PCD statement compatible with PCD sections')
+      print('    [PcdsDynamicVpd]')
+      print('    [PcdsDynamicExVpd]')
   elif args.PcdType == 'HII':
     if args.VariableGuid is None:
-      print 'BinToPcd: error: argument --variable-guid is required for --type HII.'
+      print('BinToPcd: error: argument --variable-guid is required for --type HII.')
       sys.exit()
     if args.VariableName is None:
-      print 'BinToPcd: error: argument --variable-name is required for --type HII.'
+      print('BinToPcd: error: argument --variable-name is required for --type HII.')
       sys.exit()
     if args.Offset is None:
       #
@@ -174,9 +175,9 @@ if __name__ == '__main__':
       args.Offset = 0
     Pcd = '  %s|L"%s"|%s|%d|%s' % (args.PcdName, args.VariableName, args.VariableGuid, args.Offset, ByteArray (Buffer))
     if args.Verbose:
-      print 'PcdToBin: Convert binary file to PCD statement compatible with PCD sections'
-      print '    [PcdsDynamicHii]'
-      print '    [PcdsDynamicExHii]'
+      print('PcdToBin: Convert binary file to PCD statement compatible with PCD sections')
+      print('    [PcdsDynamicHii]')
+      print('    [PcdsDynamicExHii]')
 
   #
   # Write PCD value or PCD statement to the output file
@@ -189,4 +190,4 @@ if __name__ == '__main__':
     # If output file is not specified or it can not be written, then write the
     # PCD value or PCD statement to the console
     #
-    print Pcd
+    print(Pcd)
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 5709ad4641cb..3bc6a8897bcc 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -13,7 +13,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 ##
-
+from __future__ import print_function
 import os
 import re
 import sys
@@ -58,10 +58,10 @@ class Symbols:
         try:
             nmCommand = "nm"
             nmLineOption = "-l"
-            print "parsing (debug) - " + pdbName
+            print("parsing (debug) - " + pdbName)
             os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
         except :
-            print 'ERROR: nm command not available.  Please verify PATH'
+            print('ERROR: nm command not available.  Please verify PATH')
             return
 
         #
@@ -111,11 +111,11 @@ class Symbols:
             DIA2DumpCommand = "Dia2Dump.exe"
             #DIA2SymbolOption = "-p"
             DIA2LinesOption = "-l"
-            print "parsing (pdb) - " + pdbName
+            print("parsing (pdb) - " + pdbName)
             #os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
             os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
         except :
-            print 'ERROR: DIA2Dump command not available.  Please verify PATH'
+            print('ERROR: DIA2Dump command not available.  Please verify PATH')
             return
 
         #
@@ -254,12 +254,12 @@ def main():
     try :
         file = open(Options.inputfilename)
     except Exception:
-        print "fail to open " + Options.inputfilename
+        print("fail to open " + Options.inputfilename)
         return 1
     try :
         newfile = open(Options.outputfilename, "w")
     except Exception:
-        print "fail to open " + Options.outputfilename
+        print("fail to open " + Options.outputfilename)
         return 1
 
     try:
diff --git a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
index f03278b64f8f..d0963a17e870 100644
--- a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
+++ b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
@@ -13,7 +13,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 ##
-
+from __future__ import print_function
 import os
 import re
 import sys
@@ -61,10 +61,10 @@ class Symbols:
         try:
             nmCommand = "nm"
             nmLineOption = "-l"
-            print "parsing (debug) - " + pdbName
+            print("parsing (debug) - " + pdbName)
             os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
         except :
-            print 'ERROR: nm command not available.  Please verify PATH'
+            print('ERROR: nm command not available.  Please verify PATH')
             return
 
         #
@@ -103,11 +103,11 @@ class Symbols:
             DIA2DumpCommand = "Dia2Dump.exe"
             #DIA2SymbolOption = "-p"
             DIA2LinesOption = "-l"
-            print "parsing (pdb) - " + pdbName
+            print("parsing (pdb) - " + pdbName)
             #os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
             os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
         except :
-            print 'ERROR: DIA2Dump command not available.  Please verify PATH'
+            print('ERROR: DIA2Dump command not available.  Please verify PATH')
             return
 
         #
@@ -235,14 +235,14 @@ def main():
     try :
         DOMTree = xml.dom.minidom.parse(Options.inputfilename)
     except Exception:
-        print "fail to open input " + Options.inputfilename
+        print("fail to open input " + Options.inputfilename)
         return 1
 
     if Options.guidreffilename is not None:
         try :
             guidreffile = open(Options.guidreffilename)
         except Exception:
-            print "fail to open guidref" + Options.guidreffilename
+            print("fail to open guidref" + Options.guidreffilename)
             return 1
         genGuidString(guidreffile)
         guidreffile.close()
@@ -277,7 +277,7 @@ def main():
 
                     Handler = smiHandler.getElementsByTagName("Handler")
                     RVA = Handler[0].getElementsByTagName("RVA")
-                    print "    Handler RVA: %s" % RVA[0].childNodes[0].data
+                    print("    Handler RVA: %s" % RVA[0].childNodes[0].data)
 
                     if (len(RVA)) >= 1:
                         rvaName = RVA[0].childNodes[0].data
@@ -289,7 +289,7 @@ def main():
 
                     Caller = smiHandler.getElementsByTagName("Caller")
                     RVA = Caller[0].getElementsByTagName("RVA")
-                    print "    Caller RVA: %s" % RVA[0].childNodes[0].data
+                    print("    Caller RVA: %s" % RVA[0].childNodes[0].data)
 
                     if (len(RVA)) >= 1:
                         rvaName = RVA[0].childNodes[0].data
@@ -302,7 +302,7 @@ def main():
     try :
         newfile = open(Options.outputfilename, "w")
     except Exception:
-        print "fail to open output" + Options.outputfilename
+        print("fail to open output" + Options.outputfilename)
         return 1
 
     newfile.write(DOMTree.toprettyxml(indent = "\t", newl = "\n", encoding = "utf-8"))
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index faec5506a0e6..5e55d5d655e3 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -13,6 +13,7 @@
 
 ## Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import os.path as path
@@ -757,7 +758,7 @@ class WorkspaceAutoGen(AutoGen):
             os.makedirs(self.BuildDir)
         with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
             for f in AllWorkSpaceMetaFiles:
-                print >> file, f
+                print(f, file=file)
         return True
 
     def _GenPkgLevelHash(self, Pkg):
@@ -4631,7 +4632,7 @@ class ModuleAutoGen(AutoGen):
             os.remove (self.GetTimeStampPath())
         with open(self.GetTimeStampPath(), 'w+') as file:
             for f in FileSet:
-                print >> file, f
+                print(f, file=file)
 
     Module          = property(_GetModule)
     Name            = property(_GetBaseName)
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 63ed47d94bcb..46685967d1ee 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import copy
@@ -625,19 +626,19 @@ if __name__ == '__main__':
     EdkLogger.Initialize()
     if len(sys.argv) > 1:
         Br = BuildRule(sys.argv[1])
-        print str(Br[".c", "DXE_DRIVER", "IA32", "MSFT"][1])
-        print
-        print str(Br[".c", "DXE_DRIVER", "IA32", "INTEL"][1])
-        print
-        print str(Br[".c", "DXE_DRIVER", "IA32", "GCC"][1])
-        print
-        print str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1])
-        print
-        print str(Br[".h", "ACPI_TABLE", "IA32", "INTEL"][1])
-        print
-        print str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1])
-        print
-        print str(Br[".s", "SEC", "IPF", "COMMON"][1])
-        print
-        print str(Br[".s", "SEC"][1])
+        print(str(Br[".c", "DXE_DRIVER", "IA32", "MSFT"][1]))
+        print()
+        print(str(Br[".c", "DXE_DRIVER", "IA32", "INTEL"][1]))
+        print()
+        print(str(Br[".c", "DXE_DRIVER", "IA32", "GCC"][1]))
+        print()
+        print(str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1]))
+        print()
+        print(str(Br[".h", "ACPI_TABLE", "IA32", "INTEL"][1]))
+        print()
+        print(str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1]))
+        print()
+        print(str(Br[".s", "SEC", "IPF", "COMMON"][1]))
+        print()
+        print(str(Br[".s", "SEC"][1]))
 
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 2711fc104f52..264cf1546566 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -16,6 +16,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os, codecs, re
 import distutils.util
 import Common.EdkLogger as EdkLogger
@@ -684,12 +685,12 @@ class UniFileClassObject(object):
     # Show the instance itself
     #
     def ShowMe(self):
-        print self.LanguageDef
+        print(self.LanguageDef)
         #print self.OrderedStringList
         for Item in self.OrderedStringList:
-            print Item
+            print(Item)
             for Member in self.OrderedStringList[Item]:
-                print str(Member)
+                print(str(Member))
 
 # This acts like the main() function for the script, unless it is 'import'ed into another
 # script.
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 92ede7a82324..53da9b881f25 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -14,6 +14,7 @@
 # #
 # Import Modules
 #
+from __future__ import print_function
 import os
 from Common.RangeExpression import RangeExpression
 from Common.Misc import *
@@ -345,6 +346,6 @@ if __name__ == "__main__":
     test2 = TestObj(2)
     
     testarr = [test1, test2]
-    print TestObj(2) in testarr
-    print TestObj(2) == test2
+    print(TestObj(2) in testarr)
+    print(TestObj(2) == test2)
     
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index b1e328ff3f11..9ab13a39e8bf 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -20,6 +20,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import sys
 import encodings.ascii
@@ -132,7 +133,7 @@ def MyOptionParser():
 #
 def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
     if os.path.exists(VpdFileName) and not Force:
-        print "\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName
+        print("\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName)
         choice = sys.stdin.readline()
         if choice.strip().lower() not in ['y', 'yes', '']:
             return
diff --git a/BaseTools/Source/Python/Common/DecClassObject.py b/BaseTools/Source/Python/Common/DecClassObject.py
index d7c70a7336a0..970e644318d0 100644
--- a/BaseTools/Source/Python/Common/DecClassObject.py
+++ b/BaseTools/Source/Python/Common/DecClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 from String import *
 from DataType import *
@@ -517,31 +518,31 @@ class Dec(DecObject):
     def ShowPackage(self):
         M = self.Package
         for Arch in M.Header.keys():
-            print '\nArch =', Arch
-            print 'Filename =', M.Header[Arch].FileName
-            print 'FullPath =', M.Header[Arch].FullPath
-            print 'BaseName =', M.Header[Arch].Name
-            print 'Guid =', M.Header[Arch].Guid
-            print 'Version =', M.Header[Arch].Version
-            print 'DecSpecification =', M.Header[Arch].DecSpecification
-        print '\nIncludes =', M.Includes
+            print('\nArch =', Arch)
+            print('Filename =', M.Header[Arch].FileName)
+            print('FullPath =', M.Header[Arch].FullPath)
+            print('BaseName =', M.Header[Arch].Name)
+            print('Guid =', M.Header[Arch].Guid)
+            print('Version =', M.Header[Arch].Version)
+            print('DecSpecification =', M.Header[Arch].DecSpecification)
+        print('\nIncludes =', M.Includes)
         for Item in M.Includes:
-            print Item.FilePath, Item.SupArchList
-        print '\nGuids =', M.GuidDeclarations
+            print(Item.FilePath, Item.SupArchList)
+        print('\nGuids =', M.GuidDeclarations)
         for Item in M.GuidDeclarations:
-            print Item.CName, Item.Guid, Item.SupArchList
-        print '\nProtocols =', M.ProtocolDeclarations
+            print(Item.CName, Item.Guid, Item.SupArchList)
+        print('\nProtocols =', M.ProtocolDeclarations)
         for Item in M.ProtocolDeclarations:
-            print Item.CName, Item.Guid, Item.SupArchList
-        print '\nPpis =', M.PpiDeclarations
+            print(Item.CName, Item.Guid, Item.SupArchList)
+        print('\nPpis =', M.PpiDeclarations)
         for Item in M.PpiDeclarations:
-            print Item.CName, Item.Guid, Item.SupArchList
-        print '\nLibraryClasses =', M.LibraryClassDeclarations
+            print(Item.CName, Item.Guid, Item.SupArchList)
+        print('\nLibraryClasses =', M.LibraryClassDeclarations)
         for Item in M.LibraryClassDeclarations:
-            print Item.LibraryClass, Item.RecommendedInstance, Item.SupModuleList, Item.SupArchList
-        print '\nPcds =', M.PcdDeclarations
+            print(Item.LibraryClass, Item.RecommendedInstance, Item.SupModuleList, Item.SupArchList)
+        print('\nPcds =', M.PcdDeclarations)
         for Item in M.PcdDeclarations:
-            print 'CName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, 'Token=', Item.Token, 'DatumType=', Item.DatumType, Item.SupArchList
+            print('CName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, 'Token=', Item.Token, 'DatumType=', Item.DatumType, Item.SupArchList)
 
 ##
 #
diff --git a/BaseTools/Source/Python/Common/Dictionary.py b/BaseTools/Source/Python/Common/Dictionary.py
index 1c33fefabf98..5f2cc8f31ffa 100644
--- a/BaseTools/Source/Python/Common/Dictionary.py
+++ b/BaseTools/Source/Python/Common/Dictionary.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import EdkLogger
 from DataType import *
 from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -58,7 +59,7 @@ def printDict(Dict):
         KeyList = Dict.keys()
         for Key in KeyList:
             if Dict[Key] != '':
-                print Key + ' = ' + str(Dict[Key])
+                print(Key + ' = ' + str(Dict[Key]))
 
 ## Print the dictionary
 #
@@ -71,6 +72,6 @@ def printList(Key, List):
     if type(List) == type([]):
         if len(List) > 0:
             if Key.find(TAB_SPLIT) != -1:
-                print "\n" + Key
+                print("\n" + Key)
                 for Item in List:
-                    print Item
+                    print(Item)
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index c2fa1c275a2d..3a27fbffc934 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import EdkLogger as EdkLogger
 import Database
@@ -1365,7 +1366,7 @@ class Dsc(DscObject):
     # Print all members and their values of Dsc class
     #
     def ShowDsc(self):
-        print TAB_SECTION_START + TAB_INF_DEFINES + TAB_SECTION_END
+        print(TAB_SECTION_START + TAB_INF_DEFINES + TAB_SECTION_END)
         printDict(self.Defines.DefinesDictionary)
 
         for Key in self.KeyList:
@@ -1382,47 +1383,47 @@ class Dsc(DscObject):
     def ShowPlatform(self):
         M = self.Platform
         for Arch in M.Header.keys():
-            print '\nArch =', Arch
-            print 'Filename =', M.Header[Arch].FileName
-            print 'FullPath =', M.Header[Arch].FullPath
-            print 'BaseName =', M.Header[Arch].Name
-            print 'Guid =', M.Header[Arch].Guid
-            print 'Version =', M.Header[Arch].Version
-            print 'DscSpecification =', M.Header[Arch].DscSpecification
-            print 'SkuId =', M.Header[Arch].SkuIdName
-            print 'SupArchList =', M.Header[Arch].SupArchList
-            print 'BuildTargets =', M.Header[Arch].BuildTargets
-            print 'OutputDirectory =', M.Header[Arch].OutputDirectory
-            print 'BuildNumber =', M.Header[Arch].BuildNumber
-            print 'MakefileName =', M.Header[Arch].MakefileName
-            print 'BsBaseAddress =', M.Header[Arch].BsBaseAddress
-            print 'RtBaseAddress =', M.Header[Arch].RtBaseAddress
-            print 'Define =', M.Header[Arch].Define
-        print 'Fdf =', M.FlashDefinitionFile.FilePath
-        print '\nBuildOptions =', M.BuildOptions, M.BuildOptions.IncludeFiles
+            print('\nArch =', Arch)
+            print('Filename =', M.Header[Arch].FileName)
+            print('FullPath =', M.Header[Arch].FullPath)
+            print('BaseName =', M.Header[Arch].Name)
+            print('Guid =', M.Header[Arch].Guid)
+            print('Version =', M.Header[Arch].Version)
+            print('DscSpecification =', M.Header[Arch].DscSpecification)
+            print('SkuId =', M.Header[Arch].SkuIdName)
+            print('SupArchList =', M.Header[Arch].SupArchList)
+            print('BuildTargets =', M.Header[Arch].BuildTargets)
+            print('OutputDirectory =', M.Header[Arch].OutputDirectory)
+            print('BuildNumber =', M.Header[Arch].BuildNumber)
+            print('MakefileName =', M.Header[Arch].MakefileName)
+            print('BsBaseAddress =', M.Header[Arch].BsBaseAddress)
+            print('RtBaseAddress =', M.Header[Arch].RtBaseAddress)
+            print('Define =', M.Header[Arch].Define)
+        print('Fdf =', M.FlashDefinitionFile.FilePath)
+        print('\nBuildOptions =', M.BuildOptions, M.BuildOptions.IncludeFiles)
         for Item in M.BuildOptions.BuildOptionList:
-            print '\t', 'ToolChainFamily =', Item.ToolChainFamily, 'ToolChain =', Item.ToolChain, 'Option =', Item.Option, 'Arch =', Item.SupArchList
-        print '\nSkuIds =', M.SkuInfos.SkuInfoList, M.SkuInfos.IncludeFiles
-        print '\nLibraries =', M.Libraries, M.Libraries.IncludeFiles
+            print('\t', 'ToolChainFamily =', Item.ToolChainFamily, 'ToolChain =', Item.ToolChain, 'Option =', Item.Option, 'Arch =', Item.SupArchList)
+        print('\nSkuIds =', M.SkuInfos.SkuInfoList, M.SkuInfos.IncludeFiles)
+        print('\nLibraries =', M.Libraries, M.Libraries.IncludeFiles)
         for Item in M.Libraries.LibraryList:
-            print '\t', Item.FilePath, Item.SupArchList, Item.Define
-        print '\nLibraryClasses =', M.LibraryClasses, M.LibraryClasses.IncludeFiles
+            print('\t', Item.FilePath, Item.SupArchList, Item.Define)
+        print('\nLibraryClasses =', M.LibraryClasses, M.LibraryClasses.IncludeFiles)
         for Item in M.LibraryClasses.LibraryList:
-            print '\t', Item.Name, Item.FilePath, Item.SupModuleList, Item.SupArchList, Item.Define
-        print '\nPcds =', M.DynamicPcdBuildDefinitions
+            print('\t', Item.Name, Item.FilePath, Item.SupModuleList, Item.SupArchList, Item.Define)
+        print('\nPcds =', M.DynamicPcdBuildDefinitions)
         for Item in M.DynamicPcdBuildDefinitions:
-            print '\tCname=', Item.CName, 'TSG=', Item.TokenSpaceGuidCName, 'Value=', Item.DefaultValue, 'Token=', Item.Token, 'Type=', Item.ItemType, 'Datum=', Item.DatumType, 'Size=', Item.MaxDatumSize, 'Arch=', Item.SupArchList, Item.SkuInfoList
+            print('\tCname=', Item.CName, 'TSG=', Item.TokenSpaceGuidCName, 'Value=', Item.DefaultValue, 'Token=', Item.Token, 'Type=', Item.ItemType, 'Datum=', Item.DatumType, 'Size=', Item.MaxDatumSize, 'Arch=', Item.SupArchList, Item.SkuInfoList)
             for Sku in Item.SkuInfoList.values():
-                print '\t\t', str(Sku)
-        print '\nComponents =', M.Modules.ModuleList, M.Modules.IncludeFiles
+                print('\t\t', str(Sku))
+        print('\nComponents =', M.Modules.ModuleList, M.Modules.IncludeFiles)
         for Item in M.Modules.ModuleList:
-            print '\t', Item.FilePath, Item.ExecFilePath, Item.SupArchList
+            print('\t', Item.FilePath, Item.ExecFilePath, Item.SupArchList)
             for Lib in Item.LibraryClasses.LibraryList:
-                print '\t\tLib:', Lib.Name, Lib.FilePath
+                print('\t\tLib:', Lib.Name, Lib.FilePath)
             for Bo in Item.ModuleSaBuildOption.BuildOptionList:
-                print '\t\tBuildOption:', Bo.ToolChainFamily, Bo.ToolChain, Bo.Option
+                print('\t\tBuildOption:', Bo.ToolChainFamily, Bo.ToolChain, Bo.Option)
             for Pcd in Item.PcdBuildDefinitions:
-                print '\t\tPcd:', Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.MaxDatumSize, Pcd.DefaultValue, Pcd.ItemType
+                print('\t\tPcd:', Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.MaxDatumSize, Pcd.DefaultValue, Pcd.ItemType)
 
 ##
 #
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspace.py b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
index f22a545b77ce..ed85e4ee0b06 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspace.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os, sys, time
 from DataType import *
 from Common.LongFilePathSupport import OpenLongFilePath as open
@@ -39,7 +40,7 @@ class EdkIIWorkspace:
         # Check environment valiable 'WORKSPACE'
         #
         if os.environ.get('WORKSPACE') == None:
-            print 'ERROR: WORKSPACE not defined.    Please run EdkSetup from the EDK II install directory.'
+            print('ERROR: WORKSPACE not defined.    Please run EdkSetup from the EDK II install directory.')
             return False
 
         self.CurrentWorkingDir = os.getcwd()
@@ -76,18 +77,18 @@ class EdkIIWorkspace:
         if self.PrintRunTime:
             Seconds = int(time.time() - self.StartTime)
             if Seconds < 60:
-                print 'Run Time: %d seconds' % (Seconds)
+                print('Run Time: %d seconds' % (Seconds))
             else:
                 Minutes = Seconds / 60
                 Seconds = Seconds % 60
                 if Minutes < 60:
-                    print 'Run Time: %d minutes %d seconds' % (Minutes, Seconds)
+                    print('Run Time: %d minutes %d seconds' % (Minutes, Seconds))
                 else:
                     Hours = Minutes / 60
                     Minutes = Minutes % 60
-                    print 'Run Time: %d hours %d minutes %d seconds' % (Hours, Minutes, Seconds)
+                    print('Run Time: %d hours %d minutes %d seconds' % (Hours, Minutes, Seconds))
         if self.RunStatus != '':
-            print self.RunStatus
+            print(self.RunStatus)
 
     ## Convert to a workspace relative filename
     #
@@ -136,7 +137,7 @@ class EdkIIWorkspace:
     #
     def XmlParseFile (self, FileName):
         if self.Verbose:
-            print FileName
+            print(FileName)
         return XmlParseFile (self.WorkspaceFile(FileName))
 
     ## Convert a XML section
@@ -150,7 +151,7 @@ class EdkIIWorkspace:
     #
     def XmlParseFileSection (self, FileName, SectionTag):
         if self.Verbose:
-            print FileName
+            print(FileName)
         return XmlParseFileSection (self.WorkspaceFile(FileName), SectionTag)
 
     ## Save a XML file
@@ -164,7 +165,7 @@ class EdkIIWorkspace:
     #
     def XmlSaveFile (self, Dom, FileName):
         if self.Verbose:
-            print FileName
+            print(FileName)
         return XmlSaveFile (Dom, self.WorkspaceFile(FileName))
 
     ## Convert Text File To Dictionary
@@ -182,7 +183,7 @@ class EdkIIWorkspace:
     #
     def ConvertTextFileToDictionary(self, FileName, Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter):
         if self.Verbose:
-            print FileName
+            print(FileName)
         return ConvertTextFileToDictionary(self.WorkspaceFile(FileName), Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter)
 
     ## Convert Dictionary To Text File
@@ -200,7 +201,7 @@ class EdkIIWorkspace:
     #
     def ConvertDictionaryToTextFile(self, FileName, Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter):
         if self.Verbose:
-            print FileName
+            print(FileName)
         return ConvertDictionaryToTextFile(self.WorkspaceFile(FileName), Dictionary, CommentCharacter, KeySplitCharacter, ValueSplitFlag, ValueSplitCharacter)
 
 ## Convert Text File To Dictionary
@@ -317,4 +318,4 @@ def CreateFile(Directory, FileName, Mode='w'):
 #
 if __name__ == '__main__':
     # Nothing to do here. Could do some unit tests
-    pass
\ No newline at end of file
+    pass
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py b/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
index d6df01d4ce06..a2f7c94c1ca7 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os, string, copy, pdb, copy
 import EdkLogger
 import DataType
@@ -1568,89 +1569,89 @@ class WorkspaceBuild(object):
     # Print each item of the workspacebuild with (Key = Value) pair
     #
     def ShowWorkspaceBuild(self):
-        print self.DscDatabase
-        print self.InfDatabase
-        print self.DecDatabase
-        print 'SupArchList', self.SupArchList
-        print 'BuildTarget', self.BuildTarget
-        print 'SkuId', self.SkuId
+        print(self.DscDatabase)
+        print(self.InfDatabase)
+        print(self.DecDatabase)
+        print('SupArchList', self.SupArchList)
+        print('BuildTarget', self.BuildTarget)
+        print('SkuId', self.SkuId)
 
         for Arch in self.SupArchList:
-            print Arch
-            print 'Platform'
+            print(Arch)
+            print('Platform')
             for Platform in self.Build[Arch].PlatformDatabase.keys():
                 P = self.Build[Arch].PlatformDatabase[Platform]
-                print 'DescFilePath = ', P.DescFilePath
-                print 'PlatformName = ', P.PlatformName
-                print 'Guid = ', P.Guid
-                print 'Version = ', P.Version
-                print 'OutputDirectory = ', P.OutputDirectory
-                print 'FlashDefinition = ', P.FlashDefinition
-                print 'SkuIds = ', P.SkuIds
-                print 'Modules = ', P.Modules
-                print 'LibraryClasses = ', P.LibraryClasses
-                print 'Pcds = ', P.Pcds
+                print('DescFilePath = ', P.DescFilePath)
+                print('PlatformName = ', P.PlatformName)
+                print('Guid = ', P.Guid)
+                print('Version = ', P.Version)
+                print('OutputDirectory = ', P.OutputDirectory)
+                print('FlashDefinition = ', P.FlashDefinition)
+                print('SkuIds = ', P.SkuIds)
+                print('Modules = ', P.Modules)
+                print('LibraryClasses = ', P.LibraryClasses)
+                print('Pcds = ', P.Pcds)
                 for item in P.Pcds.keys():
-                    print P.Pcds[item]
-                print 'BuildOptions = ', P.BuildOptions
-                print ''
+                    print(P.Pcds[item])
+                print('BuildOptions = ', P.BuildOptions)
+                print('')
             # End of Platform
 
-            print 'package'
+            print('package')
             for Package in self.Build[Arch].PackageDatabase.keys():
                 P = self.Build[Arch].PackageDatabase[Package]
-                print 'DescFilePath = ', P.DescFilePath
-                print 'PackageName = ', P.PackageName
-                print 'Guid = ', P.Guid
-                print 'Version = ', P.Version
-                print 'Protocols = ', P.Protocols
-                print 'Ppis = ', P.Ppis
-                print 'Guids = ', P.Guids
-                print 'Includes = ', P.Includes
-                print 'LibraryClasses = ', P.LibraryClasses
-                print 'Pcds = ', P.Pcds
+                print('DescFilePath = ', P.DescFilePath)
+                print('PackageName = ', P.PackageName)
+                print('Guid = ', P.Guid)
+                print('Version = ', P.Version)
+                print('Protocols = ', P.Protocols)
+                print('Ppis = ', P.Ppis)
+                print('Guids = ', P.Guids)
+                print('Includes = ', P.Includes)
+                print('LibraryClasses = ', P.LibraryClasses)
+                print('Pcds = ', P.Pcds)
                 for item in P.Pcds.keys():
-                    print P.Pcds[item]
-                print ''
+                    print(P.Pcds[item])
+                print('')
             # End of Package
 
-            print 'module'
+            print('module')
             for Module in self.Build[Arch].ModuleDatabase.keys():
                 P = self.Build[Arch].ModuleDatabase[Module]
-                print 'DescFilePath = ', P.DescFilePath
-                print 'BaseName = ', P.BaseName
-                print 'ModuleType = ', P.ModuleType
-                print 'Guid = ', P.Guid
-                print 'Version = ', P.Version
-                print 'CustomMakefile = ', P.CustomMakefile
-                print 'Specification = ', P.Specification
-                print 'Shadow = ', P.Shadow
-                print 'PcdIsDriver = ', P.PcdIsDriver
+                print('DescFilePath = ', P.DescFilePath)
+                print('BaseName = ', P.BaseName)
+                print('ModuleType = ', P.ModuleType)
+                print('Guid = ', P.Guid)
+                print('Version = ', P.Version)
+                print('CustomMakefile = ', P.CustomMakefile)
+                print('Specification = ', P.Specification)
+                print('Shadow = ', P.Shadow)
+                print('PcdIsDriver = ', P.PcdIsDriver)
                 for Lib in P.LibraryClass:
-                    print 'LibraryClassDefinition = ', Lib.LibraryClass, 'SupModList = ', Lib.SupModList
-                print 'ModuleEntryPointList = ', P.ModuleEntryPointList
-                print 'ModuleUnloadImageList = ', P.ModuleUnloadImageList
-                print 'ConstructorList = ', P.ConstructorList
-                print 'DestructorList = ', P.DestructorList
+                    print('LibraryClassDefinition = ', Lib.LibraryClass, 'SupModList = ', Lib.SupModList)
+                print('ModuleEntryPointList = ', P.ModuleEntryPointList)
+                print('ModuleUnloadImageList = ', P.ModuleUnloadImageList)
+                print('ConstructorList = ', P.ConstructorList)
+                print('DestructorList = ', P.DestructorList)
 
-                print 'Binaries = '
+                print('Binaries = ')
                 for item in P.Binaries:
-                    print item.BinaryFile, item.FeatureFlag, item.SupArchList
-                print 'Sources = '
+                    print(item.BinaryFile, item.FeatureFlag, item.SupArchList)
+                print('Sources = ')
                 for item in P.Sources:
-                    print item.SourceFile
-                print 'LibraryClasses = ', P.LibraryClasses
-                print 'Protocols = ', P.Protocols
-                print 'Ppis = ', P.Ppis
-                print 'Guids = ', P.Guids
-                print 'Includes = ', P.Includes
-                print 'Packages = ', P.Packages
-                print 'Pcds = ', P.Pcds
+                    print(item.SourceFile)
+                print('LibraryClasses = ', P.LibraryClasses)
+                print('Protocols = ', P.Protocols)
+                print('Ppis = ', P.Ppis)
+                print('Guids = ', P.Guids)
+                print('Includes = ', P.Includes)
+                print('Packages = ', P.Packages)
+                print('Pcds = ', P.Pcds)
                 for item in P.Pcds.keys():
-                    print P.Pcds[item]
-                print 'BuildOptions = ', P.BuildOptions
-                print 'Depex = ', P.Depex
-                print ''
+                    print(P.Pcds[item])
+                print('BuildOptions = ', P.BuildOptions)
+                print('Depex = ', P.Depex)
+                print('')
             # End of Module
 
 ##
@@ -1659,12 +1660,12 @@ class WorkspaceBuild(object):
 # script.
 #
 if __name__ == '__main__':
-    print 'Start!', time.strftime('%H:%M:%S', time.localtime())
+    print('Start!', time.strftime('%H:%M:%S', time.localtime()))
     EdkLogger.Initialize()
     EdkLogger.SetLevel(EdkLogger.QUIET)
     
     W = os.getenv('WORKSPACE')
     Ewb = WorkspaceBuild('Nt32Pkg/Nt32Pkg.dsc', W)
     Ewb.GenBuildDatabase({('PcdDevicePathSupportDevicePathFromText', 'gEfiMdeModulePkgTokenSpaceGuid') : 'KKKKKKKKKKKKKKKKKKKKK'}, ['Test.Inf'])
-    print 'Done!', time.strftime('%H:%M:%S', time.localtime())
+    print('Done!', time.strftime('%H:%M:%S', time.localtime()))
     Ewb.ShowWorkspaceBuild()
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 216694325f96..80e527dd3688 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -12,6 +12,7 @@
 
 ## Import Modules
 #
+from __future__ import print_function
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
@@ -883,10 +884,10 @@ if __name__ == '__main__':
         if input in 'qQ':
             break
         try:
-            print ValueExpression(input)(True)
-            print ValueExpression(input)(False)
+            print(ValueExpression(input)(True))
+            print(ValueExpression(input)(False))
         except WrnExpression as Ex:
-            print Ex.result
-            print str(Ex)
+            print(Ex.result)
+            print(str(Ex))
         except Exception as Ex:
-            print str(Ex)
+            print(str(Ex))
diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/Source/Python/Common/FdfParserLite.py
index ac03c3fef5bb..f2741616c46f 100644
--- a/BaseTools/Source/Python/Common/FdfParserLite.py
+++ b/BaseTools/Source/Python/Common/FdfParserLite.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import re
 import Common.LongFilePathOs as os
 
@@ -1269,8 +1270,8 @@ class FdfParser(object):
         self.__UndoToken()
         if not self.__IsToken("[FD.", True):
             FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
-                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+            print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
             raise Warning("expected [FD.] At Line ", self.FileName, self.CurrentLineNumber)
         
         FdName = self.__GetUiName()
@@ -1837,8 +1838,8 @@ class FdfParser(object):
         self.__UndoToken()
         if not self.__IsToken("[FV.", True):
             FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
-                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+            print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
             raise Warning("Unknown Keyword At Line ", self.FileName, self.CurrentLineNumber)
         
         FvName = self.__GetUiName()
@@ -2643,8 +2644,8 @@ class FdfParser(object):
         self.__UndoToken()
         if not self.__IsToken("[CAPSULE.", True):
             FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
-                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+            print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
             raise Warning("expected [Capsule.] At Line ", self.FileName, self.CurrentLineNumber)        
             
         CapsuleObj = CommonDataClass.FdfClass.CapsuleClassObject()
@@ -2766,8 +2767,8 @@ class FdfParser(object):
         self.__UndoToken()
         if not self.__IsToken("[Rule.", True):
             FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
-                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+            print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
             raise Warning("expected [Rule.] At Line ", self.FileName, self.CurrentLineNumber)
 
         if not self.__SkipToToken("."):
@@ -3357,8 +3358,8 @@ class FdfParser(object):
         self.__UndoToken()
         if not self.__IsToken("[VTF.", True):
             FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
-                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
+            print('Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+                    % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine :], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine))
             raise Warning("expected [VTF.] At Line ", self.FileName, self.CurrentLineNumber)
 
         if not self.__SkipToToken("."):
@@ -3650,7 +3651,7 @@ class FdfParser(object):
                             raise Warning(LogStr)
         
         except Warning:
-            print LogStr
+            print(LogStr)
         
         finally:
             return CycleRefExists
@@ -3660,7 +3661,7 @@ if __name__ == "__main__":
     try:
         test_file = sys.argv[1]
     except IndexError as v:
-        print "Usage: %s filename" % sys.argv[0]
+        print("Usage: %s filename" % sys.argv[0])
         sys.exit(1)
 
     parser = FdfParser(test_file)
@@ -3668,7 +3669,7 @@ if __name__ == "__main__":
         parser.ParseFile()
         parser.CycleReferenceCheck()
     except Warning as X:
-        print X.message
+        print(X.message)
     else:
-        print "Success!"
+        print("Success!")
 
diff --git a/BaseTools/Source/Python/Common/InfClassObject.py b/BaseTools/Source/Python/Common/InfClassObject.py
index f24e4e41a0c1..fe82ffd8eb4e 100644
--- a/BaseTools/Source/Python/Common/InfClassObject.py
+++ b/BaseTools/Source/Python/Common/InfClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import EdkLogger
@@ -447,79 +448,79 @@ class Inf(InfObject):
     def ShowModule(self):
         M = self.Module
         for Arch in M.Header.keys():
-            print '\nArch =', Arch
-            print 'Filename =', M.Header[Arch].FileName
-            print 'FullPath =', M.Header[Arch].FullPath
-            print 'BaseName =', M.Header[Arch].Name
-            print 'Guid =', M.Header[Arch].Guid
-            print 'Version =', M.Header[Arch].Version
-            print 'InfVersion =', M.Header[Arch].InfVersion
-            print 'UefiSpecificationVersion =', M.Header[Arch].UefiSpecificationVersion
-            print 'EdkReleaseVersion =', M.Header[Arch].EdkReleaseVersion
-            print 'ModuleType =', M.Header[Arch].ModuleType
-            print 'BinaryModule =', M.Header[Arch].BinaryModule
-            print 'ComponentType =', M.Header[Arch].ComponentType
-            print 'MakefileName =', M.Header[Arch].MakefileName
-            print 'BuildNumber =', M.Header[Arch].BuildNumber
-            print 'BuildType =', M.Header[Arch].BuildType
-            print 'FfsExt =', M.Header[Arch].FfsExt
-            print 'FvExt =', M.Header[Arch].FvExt
-            print 'SourceFv =', M.Header[Arch].SourceFv
-            print 'PcdIsDriver =', M.Header[Arch].PcdIsDriver
-            print 'TianoEdkFlashMap_h =', M.Header[Arch].TianoEdkFlashMap_h
-            print 'Shadow =', M.Header[Arch].Shadow
-            print 'LibraryClass =', M.Header[Arch].LibraryClass
+            print('\nArch =', Arch)
+            print('Filename =', M.Header[Arch].FileName)
+            print('FullPath =', M.Header[Arch].FullPath)
+            print('BaseName =', M.Header[Arch].Name)
+            print('Guid =', M.Header[Arch].Guid)
+            print('Version =', M.Header[Arch].Version)
+            print('InfVersion =', M.Header[Arch].InfVersion)
+            print('UefiSpecificationVersion =', M.Header[Arch].UefiSpecificationVersion)
+            print('EdkReleaseVersion =', M.Header[Arch].EdkReleaseVersion)
+            print('ModuleType =', M.Header[Arch].ModuleType)
+            print('BinaryModule =', M.Header[Arch].BinaryModule)
+            print('ComponentType =', M.Header[Arch].ComponentType)
+            print('MakefileName =', M.Header[Arch].MakefileName)
+            print('BuildNumber =', M.Header[Arch].BuildNumber)
+            print('BuildType =', M.Header[Arch].BuildType)
+            print('FfsExt =', M.Header[Arch].FfsExt)
+            print('FvExt =', M.Header[Arch].FvExt)
+            print('SourceFv =', M.Header[Arch].SourceFv)
+            print('PcdIsDriver =', M.Header[Arch].PcdIsDriver)
+            print('TianoEdkFlashMap_h =', M.Header[Arch].TianoEdkFlashMap_h)
+            print('Shadow =', M.Header[Arch].Shadow)
+            print('LibraryClass =', M.Header[Arch].LibraryClass)
             for Item in M.Header[Arch].LibraryClass:
-                print Item.LibraryClass, DataType.TAB_VALUE_SPLIT.join(Item.SupModuleList)
-            print 'CustomMakefile =', M.Header[Arch].CustomMakefile
-            print 'Define =', M.Header[Arch].Define
-            print 'Specification =', M.Header[Arch].Specification
+                print(Item.LibraryClass, DataType.TAB_VALUE_SPLIT.join(Item.SupModuleList))
+            print('CustomMakefile =', M.Header[Arch].CustomMakefile)
+            print('Define =', M.Header[Arch].Define)
+            print('Specification =', M.Header[Arch].Specification)
         for Item in self.Module.ExternImages:
-            print '\nEntry_Point = %s, UnloadImage = %s' % (Item.ModuleEntryPoint, Item.ModuleUnloadImage)
+            print('\nEntry_Point = %s, UnloadImage = %s' % (Item.ModuleEntryPoint, Item.ModuleUnloadImage))
         for Item in self.Module.ExternLibraries:
-            print 'Constructor = %s, Destructor = %s' % (Item.Constructor, Item.Destructor)
-        print '\nBuildOptions =', M.BuildOptions
+            print('Constructor = %s, Destructor = %s' % (Item.Constructor, Item.Destructor))
+        print('\nBuildOptions =', M.BuildOptions)
         for Item in M.BuildOptions:
-            print Item.ToolChainFamily, Item.ToolChain, Item.Option, Item.SupArchList
-        print '\nIncludes =', M.Includes
+            print(Item.ToolChainFamily, Item.ToolChain, Item.Option, Item.SupArchList)
+        print('\nIncludes =', M.Includes)
         for Item in M.Includes:
-            print Item.FilePath, Item.SupArchList
-        print '\nLibraries =', M.Libraries
+            print(Item.FilePath, Item.SupArchList)
+        print('\nLibraries =', M.Libraries)
         for Item in M.Libraries:
-            print Item.Library, Item.SupArchList
-        print '\nLibraryClasses =', M.LibraryClasses
+            print(Item.Library, Item.SupArchList)
+        print('\nLibraryClasses =', M.LibraryClasses)
         for Item in M.LibraryClasses:
-            print Item.LibraryClass, Item.RecommendedInstance, Item.FeatureFlag, Item.SupModuleList, Item.SupArchList, Item.Define
-        print '\nPackageDependencies =', M.PackageDependencies
+            print(Item.LibraryClass, Item.RecommendedInstance, Item.FeatureFlag, Item.SupModuleList, Item.SupArchList, Item.Define)
+        print('\nPackageDependencies =', M.PackageDependencies)
         for Item in M.PackageDependencies:
-            print Item.FilePath, Item.SupArchList, Item.FeatureFlag
-        print '\nNmake =', M.Nmake
+            print(Item.FilePath, Item.SupArchList, Item.FeatureFlag)
+        print('\nNmake =', M.Nmake)
         for Item in M.Nmake:
-            print Item.Name, Item.Value, Item.SupArchList
-        print '\nPcds =', M.PcdCodes
+            print(Item.Name, Item.Value, Item.SupArchList)
+        print('\nPcds =', M.PcdCodes)
         for Item in M.PcdCodes:
-            print '\tCName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, Item.SupArchList
-        print '\nSources =', M.Sources
+            print('\tCName=', Item.CName, 'TokenSpaceGuidCName=', Item.TokenSpaceGuidCName, 'DefaultValue=', Item.DefaultValue, 'ItemType=', Item.ItemType, Item.SupArchList)
+        print('\nSources =', M.Sources)
         for Source in M.Sources:
-            print Source.SourceFile, 'Fam=', Source.ToolChainFamily, 'Pcd=', Source.FeatureFlag, 'Tag=', Source.TagName, 'ToolCode=', Source.ToolCode, Source.SupArchList
-        print '\nUserExtensions =', M.UserExtensions
+            print(Source.SourceFile, 'Fam=', Source.ToolChainFamily, 'Pcd=', Source.FeatureFlag, 'Tag=', Source.TagName, 'ToolCode=', Source.ToolCode, Source.SupArchList)
+        print('\nUserExtensions =', M.UserExtensions)
         for UserExtension in M.UserExtensions:
-            print UserExtension.UserID, UserExtension.Identifier, UserExtension.Content
-        print '\nGuids =', M.Guids
+            print(UserExtension.UserID, UserExtension.Identifier, UserExtension.Content)
+        print('\nGuids =', M.Guids)
         for Item in M.Guids:
-            print Item.CName, Item.SupArchList, Item.FeatureFlag
-        print '\nProtocols =', M.Protocols
+            print(Item.CName, Item.SupArchList, Item.FeatureFlag)
+        print('\nProtocols =', M.Protocols)
         for Item in M.Protocols:
-            print Item.CName, Item.SupArchList, Item.FeatureFlag
-        print '\nPpis =', M.Ppis
+            print(Item.CName, Item.SupArchList, Item.FeatureFlag)
+        print('\nPpis =', M.Ppis)
         for Item in M.Ppis:
-            print Item.CName, Item.SupArchList, Item.FeatureFlag
-        print '\nDepex =', M.Depex
+            print(Item.CName, Item.SupArchList, Item.FeatureFlag)
+        print('\nDepex =', M.Depex)
         for Item in M.Depex:
-            print Item.Depex, Item.SupArchList, Item.Define
-        print '\nBinaries =', M.Binaries
+            print(Item.Depex, Item.SupArchList, Item.Define)
+        print('\nBinaries =', M.Binaries)
         for Binary in M.Binaries:
-            print 'Type=', Binary.FileType, 'Target=', Binary.Target, 'Name=', Binary.BinaryFile, 'FeatureFlag=', Binary.FeatureFlag, 'SupArchList=', Binary.SupArchList
+            print('Type=', Binary.FileType, 'Target=', Binary.Target, 'Name=', Binary.BinaryFile, 'FeatureFlag=', Binary.FeatureFlag, 'SupArchList=', Binary.SupArchList)
 
     ## Convert [Defines] section content to ModuleHeaderClass
     #
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 10b6ac55242b..ee33ae3d3266 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -12,6 +12,7 @@
 
 # # Import Modules
 #
+from __future__ import print_function
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
@@ -93,11 +94,11 @@ class RangeContainer(object):
         self.__clean__()
         
     def dump(self):
-        print "----------------------"
+        print("----------------------")
         rangelist = ""
         for object in self.rangelist:
             rangelist = rangelist + "[%d , %d]" % (object.start, object.end)
-        print rangelist
+        print(rangelist)
         
         
 class XOROperatorObject(object):   
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 387e51523097..3408cff8d75e 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import EdkLogger
 import DataType
@@ -148,7 +149,7 @@ class TargetTxtClassObject(object):
             KeyList = Dict.keys()
             for Key in KeyList:
                 if Dict[Key] != '':
-                    print Key + ' = ' + str(Dict[Key])
+                    print(Key + ' = ' + str(Dict[Key]))
 
     ## Print the dictionary
     #
@@ -161,9 +162,9 @@ class TargetTxtClassObject(object):
         if type(List) == type([]):
             if len(List) > 0:
                 if Key.find(TAB_SPLIT) != -1:
-                    print "\n" + Key
+                    print("\n" + Key)
                     for Item in List:
-                        print Item
+                        print(Item)
 ## TargetTxtDict
 #
 # Load target.txt in input Conf dir
@@ -185,6 +186,6 @@ def TargetTxtDict(ConfDir):
 if __name__ == '__main__':
     pass
     Target = TargetTxtDict(os.getenv("WORKSPACE"))
-    print Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
-    print Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET]
-    print Target.TargetTxtDictionary
+    print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER])
+    print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET])
+    print(Target.TargetTxtDictionary)
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 14ccabe833db..a6c1fb70bd7d 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -15,6 +15,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import Common.EdkLogger as EdkLogger
@@ -249,7 +250,7 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
     except Exception as X:
         EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData="%s" % (str(X)))
     (out, error) = PopenObject.communicate()
-    print out
+    print(out)
     while PopenObject.returncode == None :
         PopenObject.wait()
     
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index 39883aca07c4..d1b6aed71087 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -1,3 +1,4 @@
+from __future__ import print_function
 # $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
 
 from antlr3 import *
@@ -109,7 +110,7 @@ class CParser(Parser):
               
             
     def printTokenInfo(self, line, offset, tokenText):
-    	print str(line)+ ',' + str(offset) + ':' + str(tokenText)
+    	print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
         
     def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
     	PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index 171600feebf9..7bdb3cc3aea5 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 
+from __future__ import print_function
 import re
 import Common.LongFilePathOs as os
 import sys
@@ -567,58 +568,58 @@ class CodeFragmentCollector:
         
     def PrintFragments(self):
         
-        print '################# ' + self.FileName + '#####################'
+        print('################# ' + self.FileName + '#####################')
         
-        print '/****************************************/'
-        print '/*************** COMMENTS ***************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*************** COMMENTS ***************/')
+        print('/****************************************/')
         for comment in FileProfile.CommentList:
-            print str(comment.StartPos) + comment.Content
+            print(str(comment.StartPos) + comment.Content)
         
-        print '/****************************************/'
-        print '/********* PREPROCESS DIRECTIVES ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* PREPROCESS DIRECTIVES ********/')
+        print('/****************************************/')
         for pp in FileProfile.PPDirectiveList:
-            print str(pp.StartPos) + pp.Content
+            print(str(pp.StartPos) + pp.Content)
         
-        print '/****************************************/'
-        print '/********* VARIABLE DECLARATIONS ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* VARIABLE DECLARATIONS ********/')
+        print('/****************************************/')
         for var in FileProfile.VariableDeclarationList:
-            print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
+            print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
             
-        print '/****************************************/'
-        print '/********* FUNCTION DEFINITIONS *********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* FUNCTION DEFINITIONS *********/')
+        print('/****************************************/')
         for func in FileProfile.FunctionDefinitionList:
-            print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
+            print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
             
-        print '/****************************************/'
-        print '/************ ENUMERATIONS **************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************ ENUMERATIONS **************/')
+        print('/****************************************/')
         for enum in FileProfile.EnumerationDefinitionList:
-            print str(enum.StartPos) + enum.Content
+            print(str(enum.StartPos) + enum.Content)
         
-        print '/****************************************/'
-        print '/*********** STRUCTS/UNIONS *************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*********** STRUCTS/UNIONS *************/')
+        print('/****************************************/')
         for su in FileProfile.StructUnionDefinitionList:
-            print str(su.StartPos) + su.Content
+            print(str(su.StartPos) + su.Content)
             
-        print '/****************************************/'
-        print '/********* PREDICATE EXPRESSIONS ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* PREDICATE EXPRESSIONS ********/')
+        print('/****************************************/')
         for predexp in FileProfile.PredicateExpressionList:
-            print str(predexp.StartPos) + predexp.Content
+            print(str(predexp.StartPos) + predexp.Content)
         
-        print '/****************************************/'    
-        print '/************** TYPEDEFS ****************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************** TYPEDEFS ****************/')
+        print('/****************************************/')
         for typedef in FileProfile.TypedefDefinitionList:
-            print str(typedef.StartPos) + typedef.ToType
+            print(str(typedef.StartPos) + typedef.ToType)
         
 if __name__ == "__main__":
     
     collector = CodeFragmentCollector(sys.argv[1])
     collector.PreprocessFile()
-    print "For Test."
+    print("For Test.")
diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index b523858e1b1f..c3bbba09b744 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import Common.EdkLogger as EdkLogger
 from Common.DataType import *
@@ -315,6 +316,6 @@ class Configuration(object):
                 self.__dict__[List[0]] = List[1]
 
     def ShowMe(self):
-        print self.Filename
+        print(self.Filename)
         for Key in self.__dict__.keys():
-            print Key, '=', self.__dict__[Key]
+            print(Key, '=', self.__dict__[Key])
diff --git a/BaseTools/Source/Python/Ecc/Exception.py b/BaseTools/Source/Python/Ecc/Exception.py
index b0882afa6289..bde41c3a4b57 100644
--- a/BaseTools/Source/Python/Ecc/Exception.py
+++ b/BaseTools/Source/Python/Ecc/Exception.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 from Xml.XmlRoutines import *
 import Common.LongFilePathOs as os
 
@@ -84,4 +85,4 @@ class ExceptionCheck(object):
 #
 if __name__ == '__main__':
     El = ExceptionCheck('C:\\Hess\\Project\\BuildTool\\src\\Ecc\\exception.xml')
-    print El.ExceptionList
+    print(El.ExceptionList)
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index a4057ceb1775..5bb7759e2120 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 
 import Common.EdkLogger as EdkLogger
@@ -99,7 +100,7 @@ class Table(object):
         try:
             self.Cur.execute(SqlCommand)
         except Exception as e:
-            print "An error occurred when Drop a table:", e.args[0]
+            print("An error occurred when Drop a table:", e.args[0])
 
     ## Get count
     #
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index 4ce8edf5573a..eb76f4e6d54a 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import xml.dom.minidom
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
@@ -215,7 +216,7 @@ def XmlParseFile(FileName):
         XmlFile.close()
         return Dom
     except Exception as X:
-        print X
+        print(X)
         return ""
 
 # This acts like the main() function for the script, unless it is 'import'ed
@@ -225,5 +226,5 @@ if __name__ == '__main__':
     A = CreateXmlElement('AAA', 'CCC',  [['AAA', '111'], ['BBB', '222']], [['A', '1'], ['B', '2']])
     B = CreateXmlElement('ZZZ', 'CCC',  [['XXX', '111'], ['YYY', '222']], [['A', '1'], ['B', '2']])
     C = CreateXmlList('DDD', 'EEE', [A, B], ['FFF', 'GGG'])
-    print C.toprettyxml(indent = " ")
+    print(C.toprettyxml(indent = " "))
     pass
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 8a4b10727a07..7f83387c08c8 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import sys
 import Common.LongFilePathOs as os
 import re
@@ -2279,7 +2280,7 @@ def CheckDoxygenTripleForwardSlash(FullFileName):
         for Result in ResultSet:
             CommentSet.append(Result)
     except:
-        print 'Unrecognized chars in comment of file %s', FullFileName
+        print('Unrecognized chars in comment of file %s', FullFileName)
 
 
     for Result in CommentSet:
@@ -2432,7 +2433,7 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
         for Result in ResultSet:
             CommentSet.append(Result)
     except:
-        print 'Unrecognized chars in comment of file %s', FullFileName
+        print('Unrecognized chars in comment of file %s', FullFileName)
 
     # Func Decl check
     SqlStatement = """ select Modifier, Name, StartLine, ID, Value
@@ -2463,7 +2464,7 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
         for Result in ResultSet:
             CommentSet.append(Result)
     except:
-        print 'Unrecognized chars in comment of file %s', FullFileName
+        print('Unrecognized chars in comment of file %s', FullFileName)
 
     SqlStatement = """ select Modifier, Header, StartLine, ID, Name
                        from Function
@@ -2628,9 +2629,9 @@ if __name__ == '__main__':
     try:
         test_file = sys.argv[1]
     except IndexError as v:
-        print "Usage: %s filename" % sys.argv[0]
+        print("Usage: %s filename" % sys.argv[0])
         sys.exit(1)
     MsgList = CheckFuncHeaderDoxygenComments(test_file)
     for Msg in MsgList:
-        print Msg
-    print 'Done!'
+        print(Msg)
+    print('Done!')
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index 39883aca07c4..d1b6aed71087 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -1,3 +1,4 @@
+from __future__ import print_function
 # $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
 
 from antlr3 import *
@@ -109,7 +110,7 @@ class CParser(Parser):
               
             
     def printTokenInfo(self, line, offset, tokenText):
-    	print str(line)+ ',' + str(offset) + ':' + str(tokenText)
+    	print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
         
     def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
     	PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
diff --git a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
index bb78a0f882d5..5d5336bee463 100644
--- a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import re
 import Common.LongFilePathOs as os
 import sys
@@ -413,49 +414,49 @@ class CodeFragmentCollector:
     #
     def PrintFragments(self):
 
-        print '################# ' + self.FileName + '#####################'
+        print('################# ' + self.FileName + '#####################')
 
-        print '/****************************************/'
-        print '/*************** ASSIGNMENTS ***************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*************** ASSIGNMENTS ***************/')
+        print('/****************************************/')
         for asign in FileProfile.AssignmentExpressionList:
-            print str(asign.StartPos) + asign.Name + asign.Operator + asign.Value
+            print(str(asign.StartPos) + asign.Name + asign.Operator + asign.Value)
 
-        print '/****************************************/'
-        print '/********* PREPROCESS DIRECTIVES ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* PREPROCESS DIRECTIVES ********/')
+        print('/****************************************/')
         for pp in FileProfile.PPDirectiveList:
-            print str(pp.StartPos) + pp.Content
+            print(str(pp.StartPos) + pp.Content)
 
-        print '/****************************************/'
-        print '/********* VARIABLE DECLARATIONS ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* VARIABLE DECLARATIONS ********/')
+        print('/****************************************/')
         for var in FileProfile.VariableDeclarationList:
-            print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
+            print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
 
-        print '/****************************************/'
-        print '/********* FUNCTION DEFINITIONS *********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* FUNCTION DEFINITIONS *********/')
+        print('/****************************************/')
         for func in FileProfile.FunctionDefinitionList:
-            print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
+            print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
 
-        print '/****************************************/'
-        print '/************ ENUMERATIONS **************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************ ENUMERATIONS **************/')
+        print('/****************************************/')
         for enum in FileProfile.EnumerationDefinitionList:
-            print str(enum.StartPos) + enum.Content
+            print(str(enum.StartPos) + enum.Content)
 
-        print '/****************************************/'
-        print '/*********** STRUCTS/UNIONS *************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*********** STRUCTS/UNIONS *************/')
+        print('/****************************************/')
         for su in FileProfile.StructUnionDefinitionList:
-            print str(su.StartPos) + su.Content
+            print(str(su.StartPos) + su.Content)
 
-        print '/****************************************/'
-        print '/************** TYPEDEFS ****************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************** TYPEDEFS ****************/')
+        print('/****************************************/')
         for typedef in FileProfile.TypedefDefinitionList:
-            print str(typedef.StartPos) + typedef.ToType
+            print(str(typedef.StartPos) + typedef.ToType)
 
 ##
 #
@@ -464,4 +465,4 @@ class CodeFragmentCollector:
 #
 if __name__ == "__main__":
 
-    print "For Test."
+    print("For Test.")
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 6696623aba68..9d8f0864dc41 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -13,6 +13,7 @@
 
 ## Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import sys
@@ -1190,17 +1191,17 @@ class PeImage:
         self.Machine, self.NumberOfSections, self.SizeOfOptionalHeader = \
             self._FileHeader.unpack_from(self._PeImageBuf, self.Offset + FileHeaderOffset)
 
-        print "Machine=%x NumberOfSections=%x SizeOfOptionalHeader=%x" % (self.Machine, self.NumberOfSections, self.SizeOfOptionalHeader)
+        print("Machine=%x NumberOfSections=%x SizeOfOptionalHeader=%x" % (self.Machine, self.NumberOfSections, self.SizeOfOptionalHeader))
         # optional header follows the FILE header
         OptionalHeaderOffset = FileHeaderOffset + struct.calcsize(self._FileHeaderFormat)
         Magic, self.SizeOfImage, SizeOfHeaders, self.Checksum, NumberOfRvaAndSizes = \
             self._OptionalHeader32.unpack_from(self._PeImageBuf, self.Offset + OptionalHeaderOffset)
-        print "Magic=%x SizeOfImage=%x SizeOfHeaders=%x, Checksum=%x, NumberOfRvaAndSizes=%x" % (Magic, self.SizeOfImage, SizeOfHeaders, self.Checksum, NumberOfRvaAndSizes)
+        print("Magic=%x SizeOfImage=%x SizeOfHeaders=%x, Checksum=%x, NumberOfRvaAndSizes=%x" % (Magic, self.SizeOfImage, SizeOfHeaders, self.Checksum, NumberOfRvaAndSizes))
 
         PeImageSectionTableOffset = OptionalHeaderOffset + self.SizeOfOptionalHeader
         PeSections = PeSectionTable(self._PeImageBuf, self.Offset + PeImageSectionTableOffset, self.NumberOfSections)
 
-        print "%x" % PeSections.GetFileAddress(0x3920)
+        print("%x" % PeSections.GetFileAddress(0x3920))
 
 ## PeSectionTable() class
 #
@@ -1215,7 +1216,7 @@ class PeSectionTable:
             SectionHeader = PeSectionHeader(Buf, SectionHeaderOffset)
             self._SectionList.append(SectionHeader)
             SectionHeaderOffset += len(SectionHeader)
-            print SectionHeader
+            print(SectionHeader)
 
     def GetFileAddress(self, Rva):
         for PeSection in self._SectionList:
@@ -1412,7 +1413,7 @@ def Main():
         Option = GetOptions()
         build.main()
     except Exception as e:
-        print e
+        print(e)
         return 1
 
     return 0
@@ -1435,7 +1436,7 @@ if __name__ == '__main__':
             fv = FirmwareVolume("FVRECOVERY")
             fv.frombuffer(buf, 0, len(buf))
             #fv.Dispatch(None)
-            print fv
+            print(fv)
         elif FilePath.endswith(".efi"):
             fd = open(FilePath, 'rb')
             buf = array('B')
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index 6bb2c5f9f1d6..f624837f2587 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import Common.EdkLogger as EdkLogger
 from Common.DataType import *
@@ -164,8 +165,8 @@ if __name__ == '__main__':
     Db.InitDatabase()
     P = EdkInfParser(os.path.normpath("C:\Framework\Edk\Sample\Platform\Nt32\Dxe\PlatformBds\PlatformBds.inf"), Db, '', '')
     for Inf in P.Sources:
-        print Inf
+        print(Inf)
     for Item in P.Macros:
-        print Item, P.Macros[Item]
+        print(Item, P.Macros[Item])
 
-    Db.Close()
\ No newline at end of file
+    Db.Close()
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index 8199ce5ee73e..c70f62f393a9 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import sys
 import Common.LongFilePathOs as os
 import re
@@ -384,4 +385,4 @@ if __name__ == '__main__':
     EdkLogger.SetLevel(EdkLogger.QUIET)
     CollectSourceCodeDataIntoDB(sys.argv[1])
 
-    print 'Done!'
+    print('Done!')
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 15b2b792b2e1..d4ba485bcdff 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -16,6 +16,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import re
 
 import Fd
@@ -4818,7 +4819,7 @@ if __name__ == "__main__":
     try:
         test_file = sys.argv[1]
     except IndexError as v:
-        print "Usage: %s filename" % sys.argv[0]
+        print("Usage: %s filename" % sys.argv[0])
         sys.exit(1)
 
     parser = FdfParser(test_file)
@@ -4826,7 +4827,7 @@ if __name__ == "__main__":
         parser.ParseFile()
         parser.CycleReferenceCheck()
     except Warning as X:
-        print str(X)
+        print(str(X))
     else:
-        print "Success!"
+        print("Success!")
 
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 51b79397337c..b2cc25d46cbc 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 from optparse import OptionParser
 import sys
 import Common.LongFilePathOs as os
@@ -745,7 +746,7 @@ class GenFds :
         ModuleDict = BuildDb.BuildObject[DscFile, 'COMMON', GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Modules
         for Key in ModuleDict:
             ModuleObj = BuildDb.BuildObject[Key, 'COMMON', GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
-            print ModuleObj.BaseName + ' ' + ModuleObj.ModuleType
+            print(ModuleObj.BaseName + ' ' + ModuleObj.ModuleType)
 
     def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
         GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index da955fe1a4f7..969f9f2e2137 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import sys
 import subprocess
@@ -737,7 +738,7 @@ class GenFdsGlobalVariable:
             GenFdsGlobalVariable.InfLogger (out)
             GenFdsGlobalVariable.InfLogger (error)
             if PopenObject.returncode != 0:
-                print "###", cmd
+                print("###", cmd)
                 EdkLogger.error("GenFds", COMMAND_FAILURE, errorMess)
 
     def VerboseLogger (msg):
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index fdad5a44dc3d..127385228fcf 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -17,6 +17,7 @@
 #
 
 #======================================  External Libraries ========================================
+from __future__ import print_function
 import optparse
 import Common.LongFilePathOs as os
 import re
@@ -215,7 +216,7 @@ if __name__ == '__main__':
     (options, args) = parser.parse_args()
 
     if options.mapfile == None or options.efifile == None:
-        print parser.get_usage()
+        print(parser.get_usage())
     elif os.path.exists(options.mapfile) and os.path.exists(options.efifile):
         list = parsePcdInfoFromMapFile(options.mapfile, options.efifile)
         if list != None:
@@ -224,6 +225,6 @@ if __name__ == '__main__':
             else:
                 generatePcdTable(list, options.mapfile.replace('.map', '.BinaryPcdTable.txt'))
         else:
-            print 'Fail to generate Patch PCD Table based on map file and efi file'
+            print('Fail to generate Patch PCD Table based on map file and efi file')
     else:
-        print 'Fail to generate Patch PCD Table for fail to find map file or efi file!'
+        print('Fail to generate Patch PCD Table for fail to find map file or efi file!')
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index de8575676cac..4f79d0f82967 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -19,6 +19,7 @@
 '''
 Pkcs7Sign
 '''
+from __future__ import print_function
 
 import os
 import sys
@@ -113,14 +114,14 @@ if __name__ == '__main__':
   try:
     Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
   except:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(1)
 
   Version = Process.communicate()
   if Process.returncode <> 0:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print Version[0]
+  print(Version[0])
 
   #
   # Read input file into a buffer and save input filename
@@ -134,7 +135,7 @@ if __name__ == '__main__':
   #
   OutputDir = os.path.dirname(args.OutputFile)
   if not os.path.exists(OutputDir):
-    print 'ERROR: The output path does not exist: %s' % OutputDir
+    print('ERROR: The output path does not exist: %s' % OutputDir)
     sys.exit(1)
   args.OutputFileName = args.OutputFile
 
@@ -170,7 +171,7 @@ if __name__ == '__main__':
         args.SignerPrivateCertFile = open(args.SignerPrivateCertFileName, 'rb')
         args.SignerPrivateCertFile.close()
       except:
-        print 'ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName)
+        print('ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName))
         sys.exit(1)
 
     #
@@ -196,7 +197,7 @@ if __name__ == '__main__':
         args.OtherPublicCertFile = open(args.OtherPublicCertFileName, 'rb')
         args.OtherPublicCertFile.close()
       except:
-        print 'ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName)
+        print('ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName))
         sys.exit(1)
 
     format = "%dsQ" % len(args.InputFileBuffer)
@@ -242,11 +243,11 @@ if __name__ == '__main__':
         args.TrustedPublicCertFile = open(args.TrustedPublicCertFileName, 'rb')
         args.TrustedPublicCertFile.close()
       except:
-        print 'ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName)
+        print('ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName))
         sys.exit(1)
 
     if not args.SignatureSizeStr:
-      print "ERROR: please use the option --signature-size to specify the size of the signature data!"
+      print("ERROR: please use the option --signature-size to specify the size of the signature data!")
       sys.exit(1)
     else:
       if args.SignatureSizeStr.upper().startswith('0X'):
@@ -254,10 +255,10 @@ if __name__ == '__main__':
       else:
         SignatureSize = (long)(args.SignatureSizeStr)
     if SignatureSize < 0:
-        print "ERROR: The value of option --signature-size can't be set to negative value!"
+        print("ERROR: The value of option --signature-size can't be set to negative value!")
         sys.exit(1)
     elif SignatureSize > len(args.InputFileBuffer):
-        print "ERROR: The value of option --signature-size is exceed the size of the input file !"
+        print("ERROR: The value of option --signature-size is exceed the size of the input file !")
         sys.exit(1)
 
     args.SignatureBuffer = args.InputFileBuffer[0:SignatureSize]
@@ -277,7 +278,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=args.SignatureBuffer)[0]
     if Process.returncode <> 0:
-      print 'ERROR: Verification failed'
+      print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
 
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 95a636966c59..06ed2610271f 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -22,6 +22,7 @@
 '''
 Rsa2048Sha256GenerateKeys
 '''
+from __future__ import print_function
 
 import os
 import sys
@@ -75,14 +76,14 @@ if __name__ == '__main__':
   try:
     Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
   except:  
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(1)
     
   Version = Process.communicate()
   if Process.returncode <> 0:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print Version[0]
+  print(Version[0])
   
   args.PemFileName = []
   
@@ -103,7 +104,7 @@ if __name__ == '__main__':
       Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
       Process.communicate()
       if Process.returncode <> 0:
-        print 'ERROR: RSA 2048 key generation failed'
+        print('ERROR: RSA 2048 key generation failed')
         sys.exit(Process.returncode)
       
   #
@@ -125,7 +126,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
     if Process.returncode <> 0:
-      print 'ERROR: Unable to extract public key from private key'
+      print('ERROR: Unable to extract public key from private key')
       sys.exit(Process.returncode)
     PublicKey = ''
     for Index in range (0, len(PublicKeyHexString), 2):
@@ -138,7 +139,7 @@ if __name__ == '__main__':
     Process.stdin.write (PublicKey)
     PublicKeyHash = PublicKeyHash + Process.communicate()[0]
     if Process.returncode <> 0:
-      print 'ERROR: Unable to extract SHA 256 hash of public key'
+      print('ERROR: Unable to extract SHA 256 hash of public key')
       sys.exit(Process.returncode)
 
   #
@@ -171,4 +172,4 @@ if __name__ == '__main__':
   # If verbose is enabled display the public key in C structure format
   #
   if args.Verbose:
-    print 'PublicKeySha256 = ' + PublicKeyHashC    
+    print('PublicKeySha256 = ' + PublicKeyHashC)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 1ae6ebb35886..99a5d8aa5a01 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -17,6 +17,7 @@
 '''
 Rsa2048Sha256Sign
 '''
+from __future__ import print_function
 
 import os
 import sys
@@ -96,14 +97,14 @@ if __name__ == '__main__':
   try:
     Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
   except:  
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(1)
     
   Version = Process.communicate()
   if Process.returncode <> 0:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print Version[0]
+  print(Version[0])
   
   #
   # Read input file into a buffer and save input filename
@@ -117,7 +118,7 @@ if __name__ == '__main__':
   #
   OutputDir = os.path.dirname(args.OutputFile)
   if not os.path.exists(OutputDir):
-    print 'ERROR: The output path does not exist: %s' % OutputDir
+    print('ERROR: The output path does not exist: %s' % OutputDir)
     sys.exit(1)
   args.OutputFileName = args.OutputFile
 
@@ -144,7 +145,7 @@ if __name__ == '__main__':
       args.PrivateKeyFile = open(args.PrivateKeyFileName, 'rb')
       args.PrivateKeyFile.close()
     except:
-      print 'ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName)
+      print('ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName))
       sys.exit(1)
 
   #
@@ -202,14 +203,14 @@ if __name__ == '__main__':
     # Verify that the Hash Type matches the expected SHA256 type
     #
     if uuid.UUID(bytes_le = Header.HashType) <> EFI_HASH_ALGORITHM_SHA256_GUID:
-      print 'ERROR: unsupport hash GUID'
+      print('ERROR: unsupport hash GUID')
       sys.exit(1)
 
     #
     # Verify the public key
     #
     if Header.PublicKey <> PublicKey:
-      print 'ERROR: Public key in input file does not match public key from private key file'
+      print('ERROR: Public key in input file does not match public key from private key file')
       sys.exit(1)
 
     FullInputFileBuffer = args.InputFileBuffer
@@ -228,7 +229,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=FullInputFileBuffer)
     if Process.returncode <> 0:
-      print 'ERROR: Verification failed'
+      print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
 
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index 882b016bf058..ebed7a0ea7b8 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -12,6 +12,7 @@
 #  WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import sys
 import traceback
@@ -32,7 +33,7 @@ class TargetTool():
         self.Arg       = args[0]
         self.FileName  = os.path.normpath(os.path.join(self.WorkSpace, 'Conf', 'target.txt'))
         if os.path.isfile(self.FileName) == False:
-            print "%s does not exist." % self.FileName
+            print("%s does not exist." % self.FileName)
             sys.exit(1)
         self.TargetTxtDictionary = {
             TAB_TAT_DEFINES_ACTIVE_PLATFORM                            : None,
@@ -84,14 +85,14 @@ class TargetTool():
         errMsg  = ''
         for Key in KeyList:
             if type(self.TargetTxtDictionary[Key]) == type([]):
-                print "%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key]))
+                print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
             elif self.TargetTxtDictionary[Key] == None:
                 errMsg += "  Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep 
             else:
-                print "%-30s = %s" % (Key, self.TargetTxtDictionary[Key])
+                print("%-30s = %s" % (Key, self.TargetTxtDictionary[Key]))
         
         if errMsg != '':
-            print os.linesep + 'Warning:' + os.linesep + errMsg
+            print(os.linesep + 'Warning:' + os.linesep + errMsg)
             
     def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
         try:
@@ -110,7 +111,7 @@ class TargetTool():
                             if Key not in existKeys:
                                 existKeys.append(Key)
                             else:
-                                print "Warning: Found duplicate key item in original configuration files!"
+                                print("Warning: Found duplicate key item in original configuration files!")
                                 
                             if Num == 0:
                                 Line = "%-30s = \n" % Key
@@ -121,7 +122,7 @@ class TargetTool():
                             fw.write(Line)
             for key in self.TargetTxtDictionary.keys():
                 if key not in existKeys:
-                    print "Warning: %s does not exist in original configuration file" % key
+                    print("Warning: %s does not exist in original configuration file" % key)
                     Line = GetConfigureKeyValue(self, key)
                     if Line == None:
                         Line = "%-30s = " % key
@@ -224,25 +225,25 @@ if __name__ == '__main__':
     EdkLogger.Initialize()
     EdkLogger.SetLevel(EdkLogger.QUIET)
     if os.getenv('WORKSPACE') == None:
-        print "ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool"
+        print("ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool")
         sys.exit(1)
         
     (opt, args) = MyOptionParser()
     if len(args) != 1 or (args[0].lower() != 'print' and args[0].lower() != 'clean' and args[0].lower() != 'set'):
-        print "The number of args isn't 1 or the value of args is invalid."
+        print("The number of args isn't 1 or the value of args is invalid.")
         sys.exit(1)
     if opt.NUM != None and opt.NUM < 1:
-        print "The MAX_CONCURRENT_THREAD_NUMBER must be larger than 0."
+        print("The MAX_CONCURRENT_THREAD_NUMBER must be larger than 0.")
         sys.exit(1)
     if opt.TARGET != None and len(opt.TARGET) > 1:
         for elem in opt.TARGET:
             if elem == '0':
-                print "0 will clear the TARGET setting in target.txt and can't combine with other value."
+                print("0 will clear the TARGET setting in target.txt and can't combine with other value.")
                 sys.exit(1)
     if opt.TARGET_ARCH != None and len(opt.TARGET_ARCH) > 1:
         for elem in opt.TARGET_ARCH:
             if elem == '0':
-                print "0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value."
+                print("0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value.")
                 sys.exit(1)
 
     try:
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index ca21e6995217..afa5b2407ec5 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -14,6 +14,7 @@
 '''
 ExpressionValidate
 '''
+from __future__ import print_function
 
 ##
 # Import Modules
@@ -566,7 +567,7 @@ def IsValidFeatureFlagExp(Token, Flag=False):
 
 if __name__ == '__main__':
 #    print IsValidRangeExpr('LT 9')
-    print _LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression()
+    print(_LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression())
 
 
     
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index b00bba1f8440..84958ae38cef 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -14,6 +14,7 @@
 """
 Collect all defined strings in multiple uni files
 """
+from __future__ import print_function
 
 ##
 # Import Modules
@@ -748,7 +749,7 @@ class UniFileClassObject(object):
                     EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
                 NewLines.append(Line)
             else:
-                print Line
+                print(Line)
                 EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
                     
         if StrName and not StrName.split()[1].startswith(u'STR_'):
@@ -1040,12 +1041,12 @@ class UniFileClassObject(object):
     # Show the instance itself
     #
     def ShowMe(self):
-        print self.LanguageDef
+        print(self.LanguageDef)
         #print self.OrderedStringList
         for Item in self.OrderedStringList:
-            print Item
+            print(Item)
             for Member in self.OrderedStringList[Item]:
-                print str(Member)
+                print(str(Member))
     
     #
     # Read content from '!include' UNI file 
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 436dc90e6dd3..074aa311f31d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -15,6 +15,7 @@
 '''
 DecPomAlignment
 '''
+from __future__ import print_function
 
 ##
 # Import Modules
@@ -902,47 +903,47 @@ class DecPomAlignment(PackageObject):
     # Print all members and their values of Package class
     #
     def ShowPackage(self):
-        print '\nName =', self.GetName()
-        print '\nBaseName =', self.GetBaseName()
-        print '\nVersion =', self.GetVersion() 
-        print '\nGuid =', self.GetGuid()
+        print('\nName =', self.GetName())
+        print('\nBaseName =', self.GetBaseName())
+        print('\nVersion =', self.GetVersion())
+        print('\nGuid =', self.GetGuid())
         
-        print '\nStandardIncludes = %d ' \
-            % len(self.GetStandardIncludeFileList()),
+        print('\nStandardIncludes = %d ' \
+            % len(self.GetStandardIncludeFileList()), end=' ')
         for Item in self.GetStandardIncludeFileList():
-            print Item.GetFilePath(), '  ', Item.GetSupArchList()
-        print '\nPackageIncludes = %d \n' \
-            % len(self.GetPackageIncludeFileList()),
+            print(Item.GetFilePath(), '  ', Item.GetSupArchList())
+        print('\nPackageIncludes = %d \n' \
+            % len(self.GetPackageIncludeFileList()), end=' ')
         for Item in self.GetPackageIncludeFileList():
-            print Item.GetFilePath(), '  ', Item.GetSupArchList()
+            print(Item.GetFilePath(), '  ', Item.GetSupArchList())
              
-        print '\nGuids =', self.GetGuidList()
+        print('\nGuids =', self.GetGuidList())
         for Item in self.GetGuidList():
-            print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
-        print '\nProtocols =', self.GetProtocolList()
+            print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+        print('\nProtocols =', self.GetProtocolList())
         for Item in self.GetProtocolList():
-            print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
-        print '\nPpis =', self.GetPpiList()
+            print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+        print('\nPpis =', self.GetPpiList())
         for Item in self.GetPpiList():
-            print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
-        print '\nLibraryClasses =', self.GetLibraryClassList()
+            print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+        print('\nLibraryClasses =', self.GetLibraryClassList())
         for Item in self.GetLibraryClassList():
-            print Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
-            Item.GetSupArchList()
-        print '\nPcds =', self.GetPcdList()
+            print(Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
+            Item.GetSupArchList())
+        print('\nPcds =', self.GetPcdList())
         for Item in self.GetPcdList():
-            print 'CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
+            print('CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
                 Item.GetTokenSpaceGuidCName(), \
                 'DefaultValue=', Item.GetDefaultValue(), \
                 'ValidUsage=', Item.GetValidUsage(), \
                 'SupArchList', Item.GetSupArchList(), \
-                'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType()
+                'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType())
  
         for Item in self.GetMiscFileList():
-            print Item.GetName()
+            print(Item.GetName())
             for FileObjectItem in Item.GetFileList():
-                print FileObjectItem.GetURI()
-        print '****************\n'
+                print(FileObjectItem.GetURI())
+        print('****************\n')
 
 ## GenPcdDeclaration
 #
diff --git a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
index 8b4ece2617a1..5f0abcafef27 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
@@ -11,6 +11,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 
+from __future__ import print_function
 import os
 import unittest
 
@@ -66,7 +67,7 @@ def TestTemplate(TestString, TestFunc):
         # Close file
         f.close()
     except:
-        print 'Can not create temporary file [%s]!' % Path
+        print('Can not create temporary file [%s]!' % Path)
         exit(-1)
 
     # Call test function to test
@@ -279,6 +280,6 @@ if __name__ == '__main__':
     unittest.FunctionTestCase(TestDecPcd).runTest()
     unittest.FunctionTestCase(TestDecUserExtension).runTest()
 
-    print 'All tests passed...'
+    print('All tests passed...')
 
 
diff --git a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
index f3b43ee0bc27..626f17426de7 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
@@ -11,6 +11,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 
+from __future__ import print_function
 import os
 #import Object.Parser.InfObject as InfObject
 from Object.Parser.InfCommonObject import CurrentLine
@@ -271,7 +272,7 @@ def PrepareTest(String):
                 TempFile  = open (FileName, "w")    
                 TempFile.close()
             except:
-                print "File Create Error"
+                print("File Create Error")
         CurrentLine = CurrentLine()
         CurrentLine.SetFileName("Test")
         CurrentLine.SetLineString(Item[0])
@@ -376,11 +377,11 @@ if __name__ == '__main__':
             try:
                 InfBinariesInstance.SetBinary(Ver = Ver, ArchList = ArchList)
             except:
-                print "Test Failed!"
+                print("Test Failed!")
                 AllPassedFlag = False
     
     if AllPassedFlag :
-        print 'All tests passed...'
+        print('All tests passed...')
     else:
-        print 'Some unit test failed!'
+        print('Some unit test failed!')
 
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 5824266dc4fe..cce7a6273313 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -17,6 +17,7 @@
 #  This class is used to retrieve information stored in database and convert them
 # into PlatformBuildClassObject form for easier use for AutoGen.
 #
+from __future__ import print_function
 from Common.String import *
 from Common.DataType import *
 from Common.Misc import *
@@ -909,9 +910,9 @@ class DscBuildData(PlatformBuildClassObject):
             for skuid in pcdobj.SkuInfoList:
                 if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]):
                     for storename in pcdobj.SkuInfoList[skuid].DefaultStoreDict:
-                        print "PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename]))
+                        print("PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename])))
                 else:
-                    print "PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuInfoList[skuid].DefaultValue))
+                    print("PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuInfoList[skuid].DefaultValue)))
     ## Retrieve [BuildOptions]
     def _GetBuildOptions(self):
         if self._BuildOptions == None:
@@ -1067,7 +1068,7 @@ class DscBuildData(PlatformBuildClassObject):
             for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Values:
                 str_pcd_obj = S_pcd_set.get((PcdName, PcdGuid))
                 if str_pcd_obj is None:
-                    print PcdName, PcdGuid
+                    print(PcdName, PcdGuid)
                     raise
                 if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
@@ -1244,10 +1245,10 @@ class DscBuildData(PlatformBuildClassObject):
         if Value[0] == '{' and Value[-1] == '}':
             return True
         if Value.startswith("L'") and Value.endswith("'") and len(list(Value[2:-1])) > 1:
-            print 'foo = ', list(Value[2:-1])
+            print('foo = ', list(Value[2:-1]))
             return True
         if Value[0] == "'" and Value[-1] == "'" and len(list(Value[1:-1])) > 1:
-            print 'bar = ', list(Value[1:-1])
+            print('bar = ', list(Value[1:-1]))
             return True
         return False
 
@@ -1255,11 +1256,11 @@ class DscBuildData(PlatformBuildClassObject):
         try:
             Process = subprocess.Popen(Command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
         except:
-            print 'ERROR: Can not execute command:', Command
+            print('ERROR: Can not execute command:', Command)
             sys.exit(1)
         Result = Process.communicate()
         if Process.returncode <> 0:
-            print 'ERROR: Can not collect output from command:', Command
+            print('ERROR: Can not collect output from command:', Command)
         return Result[0], Result[1]
 
     def IntToCString(self, Value, ValueSize):
@@ -1376,7 +1377,7 @@ class DscBuildData(PlatformBuildClassObject):
                     try:
                         Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
                     except Exception:
-                        print FieldList[FieldName][0]
+                        print(FieldList[FieldName][0])
                     if isinstance(Value, str):
                         CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     elif IsArray:
@@ -1414,7 +1415,7 @@ class DscBuildData(PlatformBuildClassObject):
                         try:
                             Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
                         except Exception:
-                            print FieldList[FieldName][0]
+                            print(FieldList[FieldName][0])
                         if isinstance(Value, str):
                             CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                         elif IsArray:
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 74fa4d31b109..afc8394efcf0 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import time
@@ -1605,7 +1606,7 @@ class DscParser(MetaFileParser):
         try:
             self._ValueList[2] = '|'.join(ValList)
         except Exception:
-            print ValList
+            print(ValList)
 
     def __ProcessComponent(self):
         self._ValueList[0] = ReplaceMacro(self._ValueList[0], self._Macros)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 0379fd8baf1e..45cdcc89f168 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -16,6 +16,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import StringIO
@@ -2184,7 +2185,7 @@ class Build():
                     toolsFile = os.path.join(FvDir, 'GuidedSectionTools.txt')
                     toolsFile = open(toolsFile, 'wt')
                     for guidedSectionTool in guidAttribs:
-                        print >> toolsFile, ' '.join(guidedSectionTool)
+                        print(' '.join(guidedSectionTool), file=toolsFile)
                     toolsFile.close()
 
     ## Returns the full path of the tool.
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 27afd79f2094..c52b8bd94234 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import base64
 import os
 import os.path
@@ -91,9 +92,9 @@ class BaseToolsTest(unittest.TestCase):
             os.remove(path)
 
     def DisplayBinaryData(self, description, data):
-        print description, '(base64 encoded):'
+        print(description, '(base64 encoded):')
         b64data = base64.b64encode(data)
-        print b64data
+        print(b64data)
 
     def DisplayFile(self, fileName):
         sys.stdout.write(self.ReadTmpFile(fileName))
diff --git a/BaseTools/Tests/TianoCompress.py b/BaseTools/Tests/TianoCompress.py
index e14136416211..f6a4a6ae9c5d 100644
--- a/BaseTools/Tests/TianoCompress.py
+++ b/BaseTools/Tests/TianoCompress.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import os
 import random
 import sys
@@ -52,8 +53,8 @@ class Tests(TestTools.BaseToolsTest):
         finish = self.ReadTmpFile('output2')
         startEqualsFinish = start == finish
         if not startEqualsFinish:
-            print
-            print 'Original data did not match decompress(compress(data))'
+            print()
+            print('Original data did not match decompress(compress(data))')
             self.DisplayBinaryData('original data', start)
             self.DisplayBinaryData('after compression', self.ReadTmpFile('output1'))
             self.DisplayBinaryData('after decomression', finish)
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 858b4020ef9f..643fec58a457 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -17,6 +17,7 @@
 #
 
 
+from __future__ import print_function
 from optparse import OptionParser
 import os
 import shutil
@@ -34,7 +35,7 @@ if sys.version_info < (2, 5):
     #
     # This script (and edk2 BaseTools) require Python 2.5 or newer
     #
-    print 'Python version 2.5 or later is required.'
+    print('Python version 2.5 or later is required.')
     sys.exit(-1)
 
 #
@@ -146,37 +147,37 @@ class Config:
         if not self.options.skip_gcc:
             building.append('gcc')
         if len(building) == 0:
-            print "Nothing will be built!"
-            print
-            print "Please try using --help and then change the configuration."
+            print("Nothing will be built!")
+            print()
+            print("Please try using --help and then change the configuration.")
             return False
 
-        print "Current directory:"
-        print "   ", self.base_dir
-        print "Sources download/extraction:", self.Relative(self.src_dir)
-        print "Build directory            :", self.Relative(self.build_dir)
-        print "Prefix (install) directory :", self.Relative(self.prefix)
-        print "Create symlinks directory  :", self.Relative(self.symlinks)
-        print "Building                   :", ', '.join(building)
-        print
+        print("Current directory:")
+        print("   ", self.base_dir)
+        print("Sources download/extraction:", self.Relative(self.src_dir))
+        print("Build directory            :", self.Relative(self.build_dir))
+        print("Prefix (install) directory :", self.Relative(self.prefix))
+        print("Create symlinks directory  :", self.Relative(self.symlinks))
+        print("Building                   :", ', '.join(building))
+        print()
         answer = raw_input("Is this configuration ok? (default = no): ")
         if (answer.lower() not in ('y', 'yes')):
-            print
-            print "Please try using --help and then change the configuration."
+            print()
+            print("Please try using --help and then change the configuration.")
             return False
 
         if self.arch.lower() == 'ipf':
-            print
-            print 'Please note that the IPF compiler built by this script has'
-            print 'not yet been validated!'
-            print
+            print()
+            print('Please note that the IPF compiler built by this script has')
+            print('not yet been validated!')
+            print()
             answer = raw_input("Are you sure you want to build it? (default = no): ")
             if (answer.lower() not in ('y', 'yes')):
-                print
-                print "Please try using --help and then change the configuration."
+                print()
+                print("Please try using --help and then change the configuration.")
                 return False
 
-        print
+        print()
         return True
 
     def Relative(self, path):
@@ -275,7 +276,7 @@ class SourceFiles:
             wDots = (100 * received * blockSize) / fileSize / 10
             if wDots > self.dots:
                 for i in range(wDots - self.dots):
-                    print '.',
+                    print('.', end=' ')
                     sys.stdout.flush()
                     self.dots += 1
 
@@ -286,18 +287,18 @@ class SourceFiles:
                     self.dots = 0
                     local_file = os.path.join(self.config.src_dir, fdata['filename'])
                     url = fdata['url']
-                    print 'Downloading %s:' % fname, url
+                    print('Downloading %s:' % fname, url)
                     if retries > 0:
-                        print '(retry)',
+                        print('(retry)', end=' ')
                     sys.stdout.flush()
 
                     completed = False
                     if os.path.exists(local_file):
                         md5_pass = self.checkHash(fdata)
                         if md5_pass:
-                            print '[md5 match]',
+                            print('[md5 match]', end=' ')
                         else:
-                            print '[md5 mismatch]',
+                            print('[md5 mismatch]', end=' ')
                         sys.stdout.flush()
                         completed = md5_pass
 
@@ -313,32 +314,32 @@ class SourceFiles:
                     if not completed and os.path.exists(local_file):
                         md5_pass = self.checkHash(fdata)
                         if md5_pass:
-                            print '[md5 match]',
+                            print('[md5 match]', end=' ')
                         else:
-                            print '[md5 mismatch]',
+                            print('[md5 mismatch]', end=' ')
                         sys.stdout.flush()
                         completed = md5_pass
 
                     if completed:
-                        print '[done]'
+                        print('[done]')
                         break
                     else:
-                        print '[failed]'
-                        print '  Tried to retrieve', url
-                        print '  to', local_file
-                        print 'Possible fixes:'
-                        print '* If you are behind a web-proxy, try setting the',
-                        print 'http_proxy environment variable'
-                        print '* You can try to download this file separately',
-                        print 'and rerun this script'
+                        print('[failed]')
+                        print('  Tried to retrieve', url)
+                        print('  to', local_file)
+                        print('Possible fixes:')
+                        print('* If you are behind a web-proxy, try setting the', end=' ')
+                        print('http_proxy environment variable')
+                        print('* You can try to download this file separately', end=' ')
+                        print('and rerun this script')
                         raise Exception()
                 
                 except KeyboardInterrupt:
-                    print '[KeyboardInterrupt]'
+                    print('[KeyboardInterrupt]')
                     return False
 
                 except Exception as e:
-                    print e
+                    print(e)
 
             if not completed: return False
 
@@ -396,7 +397,7 @@ class Extracter:
             extractedMd5 = open(extracted).read()
 
         if extractedMd5 != moduleMd5:
-            print 'Extracting %s:' % self.config.Relative(local_file)
+            print('Extracting %s:' % self.config.Relative(local_file))
             tar = tarfile.open(local_file)
             tar.extractall(extractDst)
             open(extracted, 'w').write(moduleMd5)
@@ -480,7 +481,7 @@ class Builder:
 
         os.chdir(base_dir)
 
-        print '%s module is now built and installed' % module
+        print('%s module is now built and installed' % module)
 
     def RunCommand(self, cmd, module, stage, skipable=False):
         if skipable:
@@ -495,13 +496,13 @@ class Builder:
                 stderr=subprocess.STDOUT
                 )
 
-        print '%s [%s] ...' % (module, stage),
+        print('%s [%s] ...' % (module, stage), end=' ')
         sys.stdout.flush()
         p = popen(cmd)
         output = p.stdout.read()
         p.wait()
         if p.returncode != 0:
-            print '[failed!]'
+            print('[failed!]')
             logFile = os.path.join(self.config.build_dir, 'log.txt')
             f = open(logFile, "w")
             f.write(output)
@@ -509,7 +510,7 @@ class Builder:
             raise Exception, 'Failed to %s %s\n' % (stage, module) + \
                 'See output log at %s' % self.config.Relative(logFile)
         else:
-            print '[done]'
+            print('[done]')
 
         if skipable:
             self.MarkBuildStepComplete('%s.%s' % (module, stage))
@@ -526,13 +527,13 @@ class Builder:
             linkdst = os.path.join(links_dir, link)
             if not os.path.lexists(linkdst):
                 if not startPrinted:
-                    print 'Making symlinks in %s:' % self.config.Relative(links_dir),
+                    print('Making symlinks in %s:' % self.config.Relative(links_dir), end=' ')
                     startPrinted = True
-                print link,
+                print(link, end=' ')
                 os.symlink(src, linkdst)
 
         if startPrinted:
-            print '[done]'
+            print('[done]')
 
 class App:
     """class App
@@ -551,9 +552,9 @@ class App:
         sources = SourceFiles(config)
         result = sources.GetAll()
         if result:
-            print 'All files have been downloaded & verified'
+            print('All files have been downloaded & verified')
         else:
-            print 'An error occured while downloading a file'
+            print('An error occured while downloading a file')
             return
 
         Extracter(sources, config).ExtractAll()
-- 
2.15.1




* [PATCH 03/15] BaseTools: Remove the old python "not-equal"
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
  2018-01-19  4:43 ` [PATCH 01/15] BaseTools: Refactor python except statements Gary Lin
  2018-01-19  4:43 ` [PATCH 02/15] BaseTools: Refactor python print statements Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 04/15] BaseTools: Use the python3-range functions Gary Lin
                   ` (12 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace "<>" with "!=" to be compatible with python3.

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/BinToPcd.py                                          |  4 ++--
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |  4 ++--
 BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  4 ++--
 BaseTools/Source/Python/AutoGen/GenC.py                                |  4 ++--
 BaseTools/Source/Python/AutoGen/GenMake.py                             |  2 +-
 BaseTools/Source/Python/Common/Misc.py                                 |  2 +-
 BaseTools/Source/Python/GenFds/Fv.py                                   |  2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  6 +++---
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 12 ++++++------
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         | 12 ++++++------
 BaseTools/Source/Python/Workspace/DscBuildData.py                      |  2 +-
 11 files changed, 27 insertions(+), 27 deletions(-)

diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index c4e7b8a5c2e2..1867f35e148e 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -41,13 +41,13 @@ if __name__ == '__main__':
     return Value
 
   def ValidatePcdName (Argument):
-    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) <> ['','']:
+    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
       Message = '%s is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>' % (Argument)
       raise argparse.ArgumentTypeError(Message)
     return Argument
 
   def ValidateGuidName (Argument):
-    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) <> ['','']:
+    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
       Message = '%s is not a valid GUID C name' % (Argument)
       raise argparse.ArgumentTypeError(Message)
     return Argument
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 5e55d5d655e3..7b9054b73b51 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -4033,7 +4033,7 @@ class ModuleAutoGen(AutoGen):
             return
 
         # Skip the following code for modules without any binary files
-        if self.BinaryFileList <> None and self.BinaryFileList <> []:
+        if self.BinaryFileList != None and self.BinaryFileList != []:
             return
             
         ### TODO: How to handles mixed source and binary modules
@@ -4499,7 +4499,7 @@ class ModuleAutoGen(AutoGen):
             Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
             DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
 
-            if len(Dpx.PostfixNotation) <> 0:
+            if len(Dpx.PostfixNotation) != 0:
                 self.DepexGenerated = True
 
             if Dpx.Generate(path.join(self.OutputDir, DpxFile)):
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 46685967d1ee..f0a973c9f197 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -388,8 +388,8 @@ class BuildRule:
             self.RuleContent[Index] = Line
             
             # find the build_rule_version
-            if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) <> -1:
-                if Line.find("=") <> -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
+            if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) != -1:
+                if Line.find("=") != -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
                     self._FileVersion = (Line[(Line.find("=") + 1):]).split()[0]
             # skip empty or comment line
             if Line == "" or Line[0] == "#":
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 3e98506cc807..b8ba687bcda0 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1521,7 +1521,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
     }
 
     if Info.ModuleType in ['PEI_CORE', 'DXE_CORE', 'SMM_CORE', 'MM_CORE_STANDALONE']:
-        if Info.SourceFileList <> None and Info.SourceFileList <> []:
+        if Info.SourceFileList != None and Info.SourceFileList != []:
           if NumEntryPoints != 1:
               EdkLogger.error(
                   "build",
@@ -1683,7 +1683,7 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
     AutoGenH.Append("\n// Definition of SkuId Array\n")
     AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n")
     # Add extern declarations to AutoGen.h if one or more Token Space GUIDs were found
-    if TokenSpaceList <> []:            
+    if TokenSpaceList != []:
         AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used in this module\n\n")
         if Info.ModuleType in ["USER_DEFINED", "BASE"]:
             GuidType = "GUID"
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 3f98a34d81ec..8891b1b97d23 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -562,7 +562,7 @@ cleanlib:
 
         # convert source files and binary files to build targets
         self.ResultFileList = [str(T.Target) for T in self._AutoGenObject.CodaTargetList]
-        if len(self.ResultFileList) == 0 and len(self._AutoGenObject.SourceFileList) <> 0:
+        if len(self.ResultFileList) == 0 and len(self._AutoGenObject.SourceFileList) != 0:
             EdkLogger.error("build", AUTOGEN_ERROR, "Nothing to build",
                             ExtraData="[%s]" % str(self._AutoGenObject))
 
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index f1eb4c5a7892..223dc7971b0d 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1509,7 +1509,7 @@ def ParseDevPathValue (Value):
 def ParseFieldValue (Value):
     if type(Value) == type(0):
         return Value, (Value.bit_length() + 7) / 8
-    if type(Value) <> type(''):
+    if type(Value) != type(''):
         raise BadExpression('Type %s is %s' %(Value, type(Value)))
     Value = Value.strip()
     if Value.startswith('UINT8') and Value.endswith(')'):
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index c0b869d250f1..be8b885d069e 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -341,7 +341,7 @@ class FV (FvClassObject):
             if len(self.FvExtEntryType) > 0 or self.UsedSizeEnable:
                 GenFdsGlobalVariable.ErrorLogger("FV Extension Header Entries declared for %s with no FvNameGuid declaration." % (self.UiFvName))
         
-        if self.FvNameGuid <> None and self.FvNameGuid <> '':
+        if self.FvNameGuid != None and self.FvNameGuid != '':
             TotalSize = 16 + 4
             Buffer = ''
             if self.UsedSizeEnable:
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 4f79d0f82967..11d11700ed99 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -118,7 +118,7 @@ if __name__ == '__main__':
     sys.exit(1)
 
   Version = Process.communicate()
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
   print(Version[0])
@@ -208,7 +208,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s smime -sign -binary -signer "%s" -outform DER -md sha256 -certfile "%s"' % (OpenSslCommand, args.SignerPrivateCertFileName, args.OtherPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Signature = Process.communicate(input=FullInputFileBuffer)[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       sys.exit(Process.returncode)
 
     #
@@ -277,7 +277,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=args.SignatureBuffer)[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 06ed2610271f..2aa6877c92be 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -80,7 +80,7 @@ if __name__ == '__main__':
     sys.exit(1)
     
   Version = Process.communicate()
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
   print(Version[0])
@@ -90,7 +90,7 @@ if __name__ == '__main__':
   #
   # Check for output file argument
   #
-  if args.OutputFile <> None:
+  if args.OutputFile != None:
     for Item in args.OutputFile:
       #
       # Save PEM filename and close output file
@@ -103,14 +103,14 @@ if __name__ == '__main__':
       #
       Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
       Process.communicate()
-      if Process.returncode <> 0:
+      if Process.returncode != 0:
         print('ERROR: RSA 2048 key generation failed')
         sys.exit(Process.returncode)
       
   #
   # Check for input file argument
   #
-  if args.InputFile <> None:
+  if args.InputFile != None:
     for Item in args.InputFile:
       #
       # Save PEM filename and close input file
@@ -125,7 +125,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Unable to extract public key from private key')
       sys.exit(Process.returncode)
     PublicKey = ''
@@ -138,7 +138,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s dgst -sha256 -binary' % (OpenSslCommand), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.stdin.write (PublicKey)
     PublicKeyHash = PublicKeyHash + Process.communicate()[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Unable to extract SHA 256 hash of public key')
       sys.exit(Process.returncode)
 
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 99a5d8aa5a01..8c235ae51e7e 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -101,7 +101,7 @@ if __name__ == '__main__':
     sys.exit(1)
     
   Version = Process.communicate()
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
   print(Version[0])
@@ -157,7 +157,7 @@ if __name__ == '__main__':
   while len(PublicKeyHexString) > 0:
     PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2],16))
     PublicKeyHexString=PublicKeyHexString[2:]
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     sys.exit(Process.returncode)
 
   if args.MonotonicCountStr:
@@ -179,7 +179,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s sha256 -sign "%s"' % (OpenSslCommand, args.PrivateKeyFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Signature = Process.communicate(input=FullInputFileBuffer)[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       sys.exit(Process.returncode)
       
     #
@@ -202,14 +202,14 @@ if __name__ == '__main__':
     #
     # Verify that the Hash Type matches the expected SHA256 type
     #
-    if uuid.UUID(bytes_le = Header.HashType) <> EFI_HASH_ALGORITHM_SHA256_GUID:
+    if uuid.UUID(bytes_le = Header.HashType) != EFI_HASH_ALGORITHM_SHA256_GUID:
       print('ERROR: unsupport hash GUID')
       sys.exit(1)
 
     #
     # Verify the public key
     #
-    if Header.PublicKey <> PublicKey:
+    if Header.PublicKey != PublicKey:
       print('ERROR: Public key in input file does not match public key from private key file')
       sys.exit(1)
 
@@ -228,7 +228,7 @@ if __name__ == '__main__':
     #    
     Process = subprocess.Popen('%s sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=FullInputFileBuffer)
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index cce7a6273313..3ddbc4ca0b05 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1259,7 +1259,7 @@ class DscBuildData(PlatformBuildClassObject):
             print('ERROR: Can not execute command:', Command)
             sys.exit(1)
         Result = Process.communicate()
-        if Process.returncode <> 0:
+        if Process.returncode != 0:
             print('ERROR: Can not collect output from command:', Command)
         return Result[0], Result[1]
 
-- 
2.15.1




* [PATCH 04/15] BaseTools: Use the python3-range functions
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (2 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 03/15] BaseTools: Remove the old python "not-equal" Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 05/15] BaseTools: Remove tuple parameter in python scripts Gary Lin
                   ` (11 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace both xrange() and the python2 range() with the python3-style range()
function, based on "futurize -f libfuturize.fixes.fix_xrange_with_import".
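
A minimal sketch of the resulting idiom (illustrative only; it assumes the
python-future package is installed, which provides the "builtins" shim on
python2):

    from builtins import range   # standard builtin on python3, backport on python2

    Total = 0
    for Index in range(1000000):   # lazy like python2's xrange(); no list is built
        Total += Index

Where the result may be consumed as a list (for example the argparse "choices"
arguments touched by this patch), futurize conservatively wraps the call as
list(range(...)) so the eager python2 behaviour is preserved.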

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/BinToPcd.py                                          |  3 ++-
 BaseTools/Scripts/ConvertMasmToNasm.py                                 |  1 +
 BaseTools/Scripts/PatchCheck.py                                        |  5 +++--
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |  1 +
 BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  1 +
 BaseTools/Source/Python/AutoGen/GenC.py                                |  1 +
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                            | 23 ++++++++++----------
 BaseTools/Source/Python/AutoGen/GenVar.py                              |  1 +
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                    |  1 +
 BaseTools/Source/Python/AutoGen/StrGather.py                           |  1 +
 BaseTools/Source/Python/AutoGen/UniClassObject.py                      |  1 +
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py             |  1 +
 BaseTools/Source/Python/BPDG/GenVpd.py                                 |  7 +++---
 BaseTools/Source/Python/Common/DscClassObject.py                       |  1 +
 BaseTools/Source/Python/Common/Expression.py                           |  1 +
 BaseTools/Source/Python/Common/FdfClassObject.py                       |  1 +
 BaseTools/Source/Python/Common/MigrationUtilities.py                   |  1 +
 BaseTools/Source/Python/Common/Misc.py                                 |  3 ++-
 BaseTools/Source/Python/Common/Parsing.py                              |  1 +
 BaseTools/Source/Python/Common/RangeExpression.py                      |  1 +
 BaseTools/Source/Python/Common/String.py                               |  1 +
 BaseTools/Source/Python/Common/ToolDefClassObject.py                   |  1 +
 BaseTools/Source/Python/Ecc/Check.py                                   |  1 +
 BaseTools/Source/Python/Ecc/MetaDataParser.py                          |  3 ++-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py        |  1 +
 BaseTools/Source/Python/Eot/FvImage.py                                 |  1 +
 BaseTools/Source/Python/Eot/InfParserLite.py                           |  1 +
 BaseTools/Source/Python/GenFds/AprioriSection.py                       |  1 +
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                     |  1 +
 BaseTools/Source/Python/GenFds/Fv.py                                   |  1 +
 BaseTools/Source/Python/GenFds/GenFds.py                               |  1 +
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |  1 +
 BaseTools/Source/Python/GenFds/Region.py                               |  3 ++-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                 |  1 +
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  3 ++-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |  3 ++-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |  3 ++-
 BaseTools/Source/Python/Trim/Trim.py                                   |  1 +
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                  |  5 +++--
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                  |  3 ++-
 BaseTools/Source/Python/UPT/Library/Misc.py                            |  5 +++--
 BaseTools/Source/Python/UPT/Library/Parsing.py                         |  3 ++-
 BaseTools/Source/Python/UPT/Library/String.py                          |  1 +
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                  |  3 ++-
 BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                    |  1 +
 BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                 |  3 ++-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py              |  1 +
 BaseTools/Source/Python/UPT/UPT.py                                     |  1 +
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py           |  1 +
 BaseTools/Source/Python/UPT/Xml/IniToXml.py                            |  1 +
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                           |  1 +
 BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                       |  3 ++-
 BaseTools/Source/Python/Workspace/DscBuildData.py                      |  1 +
 BaseTools/Source/Python/Workspace/InfBuildData.py                      |  1 +
 BaseTools/Source/Python/Workspace/MetaFileParser.py                    |  1 +
 BaseTools/Tests/TestTools.py                                           |  3 ++-
 BaseTools/Tests/TianoCompress.py                                       |  1 +
 BaseTools/gcc/mingw-gcc-build.py                                       |  1 +
 58 files changed, 91 insertions(+), 33 deletions(-)

diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 1867f35e148e..7d8cd0a5cc25 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -16,6 +16,7 @@ BinToPcd
 '''
 from __future__ import print_function
 
+from builtins import range
 import sys
 import argparse
 import re
@@ -84,7 +85,7 @@ if __name__ == '__main__':
                       help = "Increase output messages")
   parser.add_argument("-q", "--quiet", dest = 'Quiet', action = "store_true",
                       help = "Reduce output messages")
-  parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = range(0,10), default = 0,
+  parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0,10)), default = 0,
                       help = "Set debug level")
 
   #
diff --git a/BaseTools/Scripts/ConvertMasmToNasm.py b/BaseTools/Scripts/ConvertMasmToNasm.py
index 5b83724b3124..e7b5b096fccc 100755
--- a/BaseTools/Scripts/ConvertMasmToNasm.py
+++ b/BaseTools/Scripts/ConvertMasmToNasm.py
@@ -17,6 +17,7 @@ from __future__ import print_function
 #
 # Import Modules
 #
+from builtins import range
 import argparse
 import io
 import os.path
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 43bfc2495c6b..51d4adf08b60 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -15,6 +15,7 @@
 
 from __future__ import print_function
 
+from builtins import range
 VersionNumber = '0.1'
 __copyright__ = "Copyright (c) 2015 - 2016, Intel Corporation  All rights reserved."
 
@@ -26,7 +27,7 @@ import subprocess
 import sys
 
 class Verbose:
-    SILENT, ONELINE, NORMAL = range(3)
+    SILENT, ONELINE, NORMAL = list(range(3))
     level = NORMAL
 
 class CommitMessageCheck:
@@ -234,7 +235,7 @@ class CommitMessageCheck:
                 break
             last_sig_line = line.strip()
 
-(START, PRE_PATCH, PATCH) = range(3)
+(START, PRE_PATCH, PATCH) = list(range(3))
 
 class GitDiffCheck:
     """Checks the contents of a git diff."""
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 7b9054b73b51..35c7a10de84b 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -14,6 +14,7 @@
 ## Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import os.path as path
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index f0a973c9f197..e8f6788cdc40 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -15,6 +15,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import copy
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index b8ba687bcda0..d68160deb4a1 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -13,6 +13,7 @@
 
 ## Import Modules
 #
+from builtins import range
 import string
 import collections
 import struct
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 22283ef7fe23..875ee5895fd9 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -10,6 +10,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
+from builtins import range
 from StringIO import StringIO
 from Common.Misc import *
 from Common.String import StringToArray
@@ -297,7 +298,7 @@ class DbItemList:
             # Variable length, need to calculate one by one
             #
             assert(Index < len(self.RawDataList))
-            for ItemIndex in xrange(Index):
+            for ItemIndex in range(Index):
                 Offset += len(self.RawDataList[ItemIndex])
         else:
             for Datas in self.RawDataList:
@@ -394,7 +395,7 @@ class DbComItemList (DbItemList):
             assert(False)
         else:
             assert(Index < len(self.RawDataList))
-            for ItemIndex in xrange(Index):
+            for ItemIndex in range(Index):
                 Offset += len(self.RawDataList[ItemIndex]) * self.ItemSize         
 
         return Offset
@@ -478,7 +479,7 @@ class DbStringHeadTableItemList(DbItemList):
             # Variable length, need to calculate one by one
             #
             assert(Index < len(self.RawDataList))
-            for ItemIndex in xrange(Index):
+            for ItemIndex in range(Index):
                 Offset += len(self.RawDataList[ItemIndex])
         else:
             for innerIndex in range(Index):
@@ -568,14 +569,14 @@ class DbStringItemList (DbComItemList):
         assert(len(RawDataList) == len(LenList))
         DataList = []
         # adjust DataList according to the LenList
-        for Index in xrange(len(RawDataList)):
+        for Index in range(len(RawDataList)):
             Len = LenList[Index]
             RawDatas = RawDataList[Index]
             assert(Len >= len(RawDatas))
             ActualDatas = []
-            for i in xrange(len(RawDatas)):
+            for i in range(len(RawDatas)):
                 ActualDatas.append(RawDatas[i])
-            for i in xrange(len(RawDatas), Len):
+            for i in range(len(RawDatas), Len):
                 ActualDatas.append(0)
             DataList.append(ActualDatas)
         self.LenList = LenList
@@ -584,7 +585,7 @@ class DbStringItemList (DbComItemList):
         Offset = 0
 
         assert(Index < len(self.LenList))
-        for ItemIndex in xrange(Index):
+        for ItemIndex in range(Index):
             Offset += self.LenList[ItemIndex]
 
         return Offset
@@ -772,7 +773,7 @@ def BuildExDataBase(Dict):
 
     # Get offset of SkuId table in the database 
     SkuIdTableOffset = FixedHeaderLen
-    for DbIndex in xrange(len(DbTotal)):
+    for DbIndex in range(len(DbTotal)):
         if DbTotal[DbIndex] is SkuidValue:
             break
         SkuIdTableOffset += DbItemTotal[DbIndex].GetListSize()
@@ -784,7 +785,7 @@ def BuildExDataBase(Dict):
     for (LocalTokenNumberTableIndex, (Offset, Table)) in enumerate(LocalTokenNumberTable):
         DbIndex = 0
         DbOffset = FixedHeaderLen
-        for DbIndex in xrange(len(DbTotal)):
+        for DbIndex in range(len(DbTotal)):
             if DbTotal[DbIndex] is Table:
                 DbOffset += DbItemTotal[DbIndex].GetInterOffset(Offset)
                 break
@@ -810,7 +811,7 @@ def BuildExDataBase(Dict):
             (VariableHeadGuidIndex, VariableHeadStringIndex, SKUVariableOffset, VariableOffset, VariableRefTable, VariableAttribute) = VariableEntryPerSku[:]
             DbIndex = 0
             DbOffset = FixedHeaderLen
-            for DbIndex in xrange(len(DbTotal)):
+            for DbIndex in range(len(DbTotal)):
                 if DbTotal[DbIndex] is VariableRefTable:
                     DbOffset += DbItemTotal[DbIndex].GetInterOffset(VariableOffset)
                     break
@@ -830,7 +831,7 @@ def BuildExDataBase(Dict):
 
     # calculate various table offset now
     DbTotalLength = FixedHeaderLen
-    for DbIndex in xrange(len(DbItemTotal)):
+    for DbIndex in range(len(DbItemTotal)):
         if DbItemTotal[DbIndex] is DbLocalTokenNumberTable:
             LocalTokenNumberTableOffset = DbTotalLength
         elif DbItemTotal[DbIndex] is DbExMapTable:
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 65d0bea36c58..d668c1edadbb 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -14,6 +14,7 @@
 # #
 # Import Modules
 #
+from builtins import range
 from struct import pack,unpack
 import collections
 import copy
diff --git a/BaseTools/Source/Python/AutoGen/InfSectionParser.py b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
index cdc9e5e8a849..ee2aae3b70e0 100644
--- a/BaseTools/Source/Python/AutoGen/InfSectionParser.py
+++ b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
@@ -14,6 +14,7 @@
 ## Import Modules
 #
 
+from builtins import range
 import Common.EdkLogger as EdkLogger
 from Common.BuildToolError import *
 from Common.DataType import *
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index ed33554cd7d2..718cd60514b4 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import re
 import Common.EdkLogger as EdkLogger
 from Common.BuildToolError import *
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 264cf1546566..cab7623bc4e6 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -17,6 +17,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os, codecs, re
 import distutils.util
 import Common.EdkLogger as EdkLogger
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 53da9b881f25..ff355d05d79f 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -15,6 +15,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import os
 from Common.RangeExpression import RangeExpression
 from Common.Misc import *
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 9861e7da68f1..33b62011b9d0 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -13,6 +13,7 @@
 #  WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from builtins import range
 import Common.LongFilePathOs as os
 import StringIO
 import StringTable as st
@@ -225,7 +226,7 @@ class PcdEntry:
 
         ReturnArray = array.array('B')
 
-        for Index in xrange(len(ValueList)):
+        for Index in range(len(ValueList)):
             Value = None
             if ValueList[Index].lower().startswith('0x'):
                 # translate hex value
@@ -251,7 +252,7 @@ class PcdEntry:
 
             ReturnArray.append(Value)
 
-        for Index in xrange(len(ValueList), Size):
+        for Index in range(len(ValueList), Size):
             ReturnArray.append(0)
 
         self.PcdValue = ReturnArray.tolist()
@@ -285,7 +286,7 @@ class PcdEntry:
                                 "Invalid unicode character %s in unicode string %s(File: %s Line: %s)" % \
                                 (Value, UnicodeString, self.FileName, self.Lineno))
 
-        for Index in xrange(len(UnicodeString) * 2, Size):
+        for Index in range(len(UnicodeString) * 2, Size):
             ReturnArray.append(0)
 
         self.PcdValue = ReturnArray.tolist()
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index 3a27fbffc934..f42d247cad33 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -15,6 +15,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import EdkLogger as EdkLogger
 import Database
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 80e527dd3688..4b66307b7eb3 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -13,6 +13,7 @@
 ## Import Modules
 #
 from __future__ import print_function
+from builtins import range
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
diff --git a/BaseTools/Source/Python/Common/FdfClassObject.py b/BaseTools/Source/Python/Common/FdfClassObject.py
index 3e7d44954c88..7ec0235967b2 100644
--- a/BaseTools/Source/Python/Common/FdfClassObject.py
+++ b/BaseTools/Source/Python/Common/FdfClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 from FdfParserLite import FdfParser
 from Table.TableFdf import TableFdf
 from CommonDataClass.DataClass import MODEL_FILE_FDF, MODEL_PCD, MODEL_META_DATA_COMPONENT
diff --git a/BaseTools/Source/Python/Common/MigrationUtilities.py b/BaseTools/Source/Python/Common/MigrationUtilities.py
index e9f1cabcb794..2385988247d4 100644
--- a/BaseTools/Source/Python/Common/MigrationUtilities.py
+++ b/BaseTools/Source/Python/Common/MigrationUtilities.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import EdkLogger
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 223dc7971b0d..49522474b36f 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import sys
 import string
@@ -1883,7 +1884,7 @@ def SplitOption(OptionString):
 def CommonPath(PathList):
     P1 = min(PathList).split(os.path.sep)
     P2 = max(PathList).split(os.path.sep)
-    for Index in xrange(min(len(P1), len(P2))):
+    for Index in range(min(len(P1), len(P2))):
         if P1[Index] != P2[Index]:
             return os.path.sep.join(P1[:Index])
     return os.path.sep.join(P1)
diff --git a/BaseTools/Source/Python/Common/Parsing.py b/BaseTools/Source/Python/Common/Parsing.py
index 584fc7f3c3a0..9caa9424d8ed 100644
--- a/BaseTools/Source/Python/Common/Parsing.py
+++ b/BaseTools/Source/Python/Common/Parsing.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 from String import *
 from CommonDataClass.DataClass import *
 from DataType import *
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index ee33ae3d3266..4357f240f423 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -13,6 +13,7 @@
 # # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index 4a8c03e88e28..e6c7a3b74ee1 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import re
 import DataType
 import Common.LongFilePathOs as os
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index dc90b4783f2f..6dab179efc01 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import EdkLogger
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index 5864758950ce..92259999853c 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -10,6 +10,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 from CommonDataClass.DataClass import *
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index 82ede3eb330c..9b8b96aa4b43 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from builtins import range
 import Common.LongFilePathOs as os
 from CommonDataClass.DataClass import *
 from EccToolError import *
@@ -112,7 +113,7 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
     #
     Last = 0
     HeaderCommentStage = HEADER_COMMENT_NOT_STARTED
-    for Index in xrange(len(CommentList)-1, 0, -1):
+    for Index in range(len(CommentList)-1, 0, -1):
         Line = CommentList[Index][0]
         if _IsCopyrightLine(Line):
             Last = Index
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 2fef87c4180a..e04b67732141 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import time
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 9d8f0864dc41..64a27217e4a8 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -14,6 +14,7 @@
 ## Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import sys
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index f624837f2587..4bdd60a6f71c 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -15,6 +15,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import Common.EdkLogger as EdkLogger
 from Common.DataType import *
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 70e2e5a3baf2..27fe2619a35f 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 from struct import *
 import Common.LongFilePathOs as os
 import StringIO
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 12ec95b56501..cbfea730ef18 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Ffs
 import Rule
 import Common.LongFilePathOs as os
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index be8b885d069e..615d9e39faf1 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import subprocess
 import StringIO
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index b2cc25d46cbc..bc7ef6408509 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 from optparse import OptionParser
 import sys
 import Common.LongFilePathOs as os
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 969f9f2e2137..6807ffdd6c3a 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import sys
 import subprocess
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index c946758cf549..5b9b203cf475 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import StringIO
@@ -56,7 +57,7 @@ class Region(RegionClassObject):
                 PadByte = pack('B', 0xFF)
             else:
                 PadByte = pack('B', 0)
-            PadData = ''.join(PadByte for i in xrange(0, Size))
+            PadData = ''.join(PadByte for i in range(0, Size))
             Buffer.write(PadData)
 
     ## AddToBuffer()
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index 882da81930da..9bb4d43a969f 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 from Common.LongFilePathSupport import OpenLongFilePath as open
 import sys
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 11d11700ed99..becf3e8eb9e8 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -21,6 +21,7 @@ Pkcs7Sign
 '''
 from __future__ import print_function
 
+from builtins import range
 import os
 import sys
 import argparse
@@ -88,7 +89,7 @@ if __name__ == '__main__':
   parser.add_argument("--signature-size", dest='SignatureSizeStr', type=str, help="specify the signature size for decode process.")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
   parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
 
   #
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 2aa6877c92be..1641968ace0e 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -24,6 +24,7 @@ Rsa2048Sha256GenerateKeys
 '''
 from __future__ import print_function
 
+from builtins import range
 import os
 import sys
 import argparse 
@@ -51,7 +52,7 @@ if __name__ == '__main__':
   parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
 
   #
   # Parse command line arguments
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 8c235ae51e7e..2a19ad973b91 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -19,6 +19,7 @@ Rsa2048Sha256Sign
 '''
 from __future__ import print_function
 
+from builtins import range
 import os
 import sys
 import argparse 
@@ -71,7 +72,7 @@ if __name__ == '__main__':
   parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'), help="specify the private key filename.  If not specified, a test signing key is used.")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
   parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
 
   #
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 05ba86262133..94f6b1bc707a 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from builtins import range
 import Common.LongFilePathOs as os
 import sys
 import re
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index d7eaf3ea1d12..517f2a6cdecd 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -15,6 +15,7 @@
 '''
 GenInf
 '''
+from builtins import range
 import os
 import stat
 import codecs
@@ -409,7 +410,7 @@ def GenLibraryClasses(ModuleObject):
                 Statement += '|' + FFE
             ModuleList = LibraryClass.GetSupModuleList()
             ArchList = LibraryClass.GetSupArchList()
-            for Index in xrange(0, len(ArchList)):
+            for Index in range(0, len(ArchList)):
                 ArchList[Index] = ConvertArchForInstall(ArchList[Index])
             ArchList.sort()
             SortedArch = ' '.join(ArchList)
@@ -574,7 +575,7 @@ def GenUserExtensions(ModuleObject):
 #         if not Statement:
 #             continue
         ArchList = UserExtension.GetSupArchList()
-        for Index in xrange(0, len(ArchList)):
+        for Index in range(0, len(ArchList)):
             ArchList[Index] = ConvertArchForInstall(ArchList[Index])
         ArchList.sort()
         KeyList = []
diff --git a/BaseTools/Source/Python/UPT/Library/CommentParsing.py b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
index 9cd7b60e16ab..b97a051137e1 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentParsing.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
@@ -19,6 +19,7 @@ CommentParsing
 ##
 # Import Modules
 #
+from builtins import range
 import re
 
 from Library.String import GetSplitValueList
@@ -74,7 +75,7 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
     # first find the last copyright line
     #
     Last = 0
-    for Index in xrange(len(CommentList)-1, 0, -1):
+    for Index in range(len(CommentList)-1, 0, -1):
         Line = CommentList[Index][0]
         if _IsCopyrightLine(Line):
             Last = Index
diff --git a/BaseTools/Source/Python/UPT/Library/Misc.py b/BaseTools/Source/Python/UPT/Library/Misc.py
index 0d92cb3767c6..24e0a20daf87 100644
--- a/BaseTools/Source/Python/UPT/Library/Misc.py
+++ b/BaseTools/Source/Python/UPT/Library/Misc.py
@@ -19,6 +19,7 @@ Misc
 ##
 # Import Modules
 #
+from builtins import range
 import os.path
 from os import access
 from os import F_OK
@@ -437,7 +438,7 @@ class Sdict(IterableUserDict):
 def CommonPath(PathList):
     Path1 = min(PathList).split(os.path.sep)
     Path2 = max(PathList).split(os.path.sep)
-    for Index in xrange(min(len(Path1), len(Path2))):
+    for Index in range(min(len(Path1), len(Path2))):
         if Path1[Index] != Path2[Index]:
             return os.path.sep.join(Path1[:Index])
     return os.path.sep.join(Path1)
@@ -890,7 +891,7 @@ def ProcessEdkComment(LineList):
             if FindEdkBlockComment:
                 if FirstPos == -1:
                     FirstPos = StartPos
-                for Index in xrange(StartPos, EndPos+1):
+                for Index in range(StartPos, EndPos+1):
                     LineList[Index] = ''
                 FindEdkBlockComment = False
         elif Line.find("//") != -1 and not Line.startswith("#"):
diff --git a/BaseTools/Source/Python/UPT/Library/Parsing.py b/BaseTools/Source/Python/UPT/Library/Parsing.py
index c34e7751442a..bac664506f4d 100644
--- a/BaseTools/Source/Python/UPT/Library/Parsing.py
+++ b/BaseTools/Source/Python/UPT/Library/Parsing.py
@@ -20,6 +20,7 @@ Parsing
 ##
 # Import Modules
 #
+from builtins import range
 import os.path
 import re
 
@@ -973,7 +974,7 @@ def GenSection(SectionName, SectionDict, SplitArch=True, NeedBlankLine=False):
                     ArchList = GetSplitValueList(SectionAttrs, DataType.TAB_COMMENT_SPLIT)
                 else:
                     ArchList = [SectionAttrs]
-            for Index in xrange(0, len(ArchList)):
+            for Index in range(0, len(ArchList)):
                 ArchList[Index] = ConvertArchForInstall(ArchList[Index])
             Section = '[' + SectionName + '.' + (', ' + SectionName + '.').join(ArchList) + ']'
         else:
diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Source/Python/UPT/Library/String.py
index 278073e4a379..2f916324bd13 100644
--- a/BaseTools/Source/Python/UPT/Library/String.py
+++ b/BaseTools/Source/Python/UPT/Library/String.py
@@ -18,6 +18,7 @@ String
 ##
 # Import Modules
 #
+from builtins import range
 import re
 import os.path
 from string import strip
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 84958ae38cef..d07c26abd9c2 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -19,6 +19,7 @@ from __future__ import print_function
 ##
 # Import Modules
 #
+from builtins import range
 import os, codecs, re
 import distutils.util
 from Logger import ToolError
@@ -515,7 +516,7 @@ class UniFileClassObject(object):
                     FileIn[LineCount-1] = Line
                     FileIn[LineCount] = '\r\n'
                     LineCount -= 1
-                    for Index in xrange (LineCount + 1, len (FileIn) - 1):
+                    for Index in range (LineCount + 1, len (FileIn) - 1):
                         if (Index == len(FileIn) -1):
                             FileIn[Index] = '\r\n'
                         else:
diff --git a/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py b/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
index 22a50680fb8f..14539b0bd6c1 100644
--- a/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
@@ -17,6 +17,7 @@ DecParserMisc
 
 ## Import modules
 #
+from builtins import range
 import os
 import Logger.Log as Logger
 from Logger.ToolError import FILE_PARSE_FAILURE
diff --git a/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
index 727164c2c244..ac821deded0a 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
@@ -18,6 +18,7 @@ InfSectionParser
 ##
 # Import Modules
 #
+from builtins import range
 from copy import deepcopy
 import re
 
@@ -455,7 +456,7 @@ class InfSectionParser(InfDefinSectionParser,
                     Arch = Match.groups(1)[0].upper()
                     ArchList.append(Arch)
             CommentSoFar = ''
-            for Index in xrange(1, len(List)):
+            for Index in range(1, len(List)):
                 Result = ParseComment(List[Index], DT.ALL_USAGE_TOKENS, TokenDict, [], False)
                 Usage = Result[0]
                 Type = Result[1]
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 074aa311f31d..4c28b7f5d22a 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -20,6 +20,7 @@ from __future__ import print_function
 ##
 # Import Modules
 #
+from builtins import range
 import os.path
 from os import sep
 import platform
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 0bfcc44e3f19..3296ee3d3d8f 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -19,6 +19,7 @@ UPT
 
 ## import modules
 #
+from builtins import range
 import locale
 import sys
 encoding = locale.getdefaultlocale()[1]
diff --git a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
index 626f17426de7..2c21823194e2 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
@@ -12,6 +12,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 
 from __future__ import print_function
+from builtins import range
 import os
 #import Object.Parser.InfObject as InfObject
 from Object.Parser.InfCommonObject import CurrentLine
diff --git a/BaseTools/Source/Python/UPT/Xml/IniToXml.py b/BaseTools/Source/Python/UPT/Xml/IniToXml.py
index 037471056d81..79db9a31a28b 100644
--- a/BaseTools/Source/Python/UPT/Xml/IniToXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/IniToXml.py
@@ -16,6 +16,7 @@
 IniToXml
 '''
 
+from builtins import range
 import os.path
 import re
 from time import strftime
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
index 58959081d0ab..b4d52f7bdc1f 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
@@ -19,6 +19,7 @@ XmlParser
 ##
 # Import Modules
 #
+from builtins import range
 import re
 
 from Library.Xml.XmlRoutines import XmlNode
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py b/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
index 7e3dc94edf64..28b146ff9183 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
@@ -15,6 +15,7 @@
 '''
 XmlParserMisc
 '''
+from builtins import range
 from Object.POM.CommonObject import TextObject
 from Logger.StringTable import ERR_XML_PARSER_REQUIRED_ITEM_MISSING
 from Logger.ToolError import PARSER_ERROR
@@ -53,7 +54,7 @@ def ConvertVariableName(VariableName):
         if SecondByte != 0:
             return None
   
-        if FirstByte not in xrange(0x20, 0x7F):
+        if FirstByte not in range(0x20, 0x7F):
             return None
         TransferedStr += ('%c')%FirstByte
         Index = Index + 2
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 3ddbc4ca0b05..cabad879b8d2 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -18,6 +18,7 @@
 # into PlatformBuildClassObject form for easier use for AutoGen.
 #
 from __future__ import print_function
+from builtins import range
 from Common.String import *
 from Common.DataType import *
 from Common.Misc import *
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 67c08ee47841..9fc2e681b73d 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -12,6 +12,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from builtins import range
 from Common.String import *
 from Common.DataType import *
 from Common.Misc import *
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index afc8394efcf0..5128dc2a6d2f 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import Common.LongFilePathOs as os
 import re
 import time
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index c52b8bd94234..1202289616ee 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import base64
 import os
 import os.path
@@ -162,7 +163,7 @@ class BaseToolsTest(unittest.TestCase):
         if maxlen is None: maxlen = minlen
         return ''.join(
             [chr(random.randint(0,255))
-             for x in xrange(random.randint(minlen, maxlen))
+             for x in range(random.randint(minlen, maxlen))
             ])
 
     def setUp(self):
diff --git a/BaseTools/Tests/TianoCompress.py b/BaseTools/Tests/TianoCompress.py
index f6a4a6ae9c5d..65f783d1be9e 100644
--- a/BaseTools/Tests/TianoCompress.py
+++ b/BaseTools/Tests/TianoCompress.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 from __future__ import print_function
+from builtins import range
 import os
 import random
 import sys
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 643fec58a457..f7d0308bd9fa 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -18,6 +18,7 @@
 
 
 from __future__ import print_function
+from builtins import range
 from optparse import OptionParser
 import os
 import shutil
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 05/15] BaseTools: Remove tuple parameter in python scripts
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (3 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 04/15] BaseTools: Use the python3-range functions Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 06/15] BaseTools: Remove the deprecated hash_key() Gary Lin
                   ` (10 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

According to PEP3113, tuple parameters are removed in python 3.
(PEP3113: https://www.python.org/dev/peps/pep-3113/)
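
For illustration, a minimal sketch of the conversion (the names below are
hypothetical, not taken verbatim from BaseTools):

    # python2-only: the tuple is unpacked directly in the signature
    #   def GetInfo(self, (PcdTokenName, TokenSpaceName)):
    #       return self._Info.get((TokenSpaceName, PcdTokenName))
    # python2/python3-compatible: accept one argument, unpack it in the body
    def GetInfo(self, arg):
        (PcdTokenName, TokenSpaceName) = arg
        return self._Info.get((TokenSpaceName, PcdTokenName))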

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/Common/VpdInfoFile.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index a6c1fb70bd7d..280cdfb536a6 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -219,7 +219,8 @@ class VpdInfoFile:
             return None
         
         return self._VpdArray[vpd]
-    def GetVpdInfo(self,(PcdTokenName,TokenSpaceName)):
+    def GetVpdInfo(self, arg):
+        (PcdTokenName, TokenSpaceName) = arg
         return self._VpdInfo.get((TokenSpaceName, PcdTokenName))
     
 ## Call external BPDG tool to process VPD file
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 06/15] BaseTools: Remove the deprecated hash_key()
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (4 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 05/15] BaseTools: Remove tuple parameter in python scripts Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 07/15] BaseTools: Import reduce() from functools Gary Lin
                   ` (9 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace "has_key()" with "in" to be compatible with python3.
Based on "futurize -f lib2to3.fixes.fix_has_key"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                           |  4 ++--
 BaseTools/Source/Python/Common/VpdInfoFile.py                        |  2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                | 16 ++++++++--------
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py         |  6 +++---
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py         |  2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py           |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py |  2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                 |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py       |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py            |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py            |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py       |  2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py         |  3 +--
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py  |  4 ++--
 BaseTools/Source/Python/build/build.py                               |  2 +-
 15 files changed, 31 insertions(+), 32 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 35c7a10de84b..ab429f3e6d46 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1895,8 +1895,8 @@ class PlatformAutoGen(AutoGen):
             # retrieve BPDG tool's path from tool_def.txt according to VPD_TOOL_GUID defined in DSC file.
             BPDGToolName = None
             for ToolDef in self.ToolDefinition.values():
-                if ToolDef.has_key("GUID") and ToolDef["GUID"] == self.Platform.VpdToolGuid:
-                    if not ToolDef.has_key("PATH"):
+                if "GUID" in ToolDef and ToolDef["GUID"] == self.Platform.VpdToolGuid:
+                    if "PATH" not in ToolDef:
                         EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
                     BPDGToolName = ToolDef["PATH"]
                     break
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 280cdfb536a6..84dd7ac563dd 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -212,7 +212,7 @@ class VpdInfoFile:
     #
     #  @param vpd    A given VPD PCD 
     def GetOffset(self, vpd):
-        if not self._VpdArray.has_key(vpd):
+        if vpd not in self._VpdArray:
             return None
         
         if len(self._VpdArray[vpd]) == 0:
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index 517f2a6cdecd..4a9528b500f2 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -439,14 +439,14 @@ def GenLibraryClasses(ModuleObject):
                 Statement = '# Guid: ' + LibraryItem.Guid + ' Version: ' + LibraryItem.Version
 
                 if len(BinaryFile.SupArchList) == 0:
-                    if LibraryClassDict.has_key('COMMON') and Statement not in LibraryClassDict['COMMON']:
+                    if 'COMMON' in LibraryClassDict and Statement not in LibraryClassDict['COMMON']:
                         LibraryClassDict['COMMON'].append(Statement)
                     else:
                         LibraryClassDict['COMMON'] = ['## @LIB_INSTANCES']
                         LibraryClassDict['COMMON'].append(Statement)
                 else:
                     for Arch in BinaryFile.SupArchList:
-                        if LibraryClassDict.has_key(Arch):
+                        if Arch in LibraryClassDict:
                             if Statement not in LibraryClassDict[Arch]:
                                 LibraryClassDict[Arch].append(Statement)
                             else:
@@ -918,14 +918,14 @@ def GenAsBuiltPacthPcdSections(ModuleObject):
             if FileNameObjList:
                 ArchList = FileNameObjList[0].GetSupArchList()
             if len(ArchList) == 0:
-                if PatchPcdDict.has_key(DT.TAB_ARCH_COMMON):
+                if DT.TAB_ARCH_COMMON in PatchPcdDict:
                     if Statement not in PatchPcdDict[DT.TAB_ARCH_COMMON]:
                         PatchPcdDict[DT.TAB_ARCH_COMMON].append(Statement)
                 else:
                     PatchPcdDict[DT.TAB_ARCH_COMMON] = [Statement]
             else:
                 for Arch in ArchList:
-                    if PatchPcdDict.has_key(Arch):
+                    if Arch in PatchPcdDict:
                         if Statement not in PatchPcdDict[Arch]:
                             PatchPcdDict[Arch].append(Statement)
                     else:
@@ -968,13 +968,13 @@ def GenAsBuiltPcdExSections(ModuleObject):
                 ArchList = FileNameObjList[0].GetSupArchList()
 
             if len(ArchList) == 0:
-                if PcdExDict.has_key('COMMON'):
+                if 'COMMON' in PcdExDict:
                     PcdExDict['COMMON'].append(Statement)
                 else:
                     PcdExDict['COMMON'] = [Statement]
             else:
                 for Arch in ArchList:
-                    if PcdExDict.has_key(Arch):
+                    if Arch in PcdExDict:
                         if Statement not in PcdExDict[Arch]:
                             PcdExDict[Arch].append(Statement)
                     else:
@@ -1072,7 +1072,7 @@ def GenBuildOptions(ModuleObject):
             for BuilOptionItem in BinaryFile.AsBuiltList[0].BinaryBuildFlagList:
                 Statement = '#' + BuilOptionItem.AsBuiltOptionFlags
                 if len(BinaryFile.SupArchList) == 0:
-                    if BuildOptionDict.has_key('COMMON'):
+                    if 'COMMON' in BuildOptionDict:
                         if Statement not in BuildOptionDict['COMMON']:
                             BuildOptionDict['COMMON'].append(Statement)
                     else:
@@ -1080,7 +1080,7 @@ def GenBuildOptions(ModuleObject):
                         BuildOptionDict['COMMON'].append(Statement)
                 else:
                     for Arch in BinaryFile.SupArchList:
-                        if BuildOptionDict.has_key(Arch):
+                        if Arch in BuildOptionDict:
                             if Statement not in BuildOptionDict[Arch]:
                                 BuildOptionDict[Arch].append(Statement)
                         else:
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
index f968beee6081..a829c0cfe34c 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
@@ -272,7 +272,7 @@ class InfBinariesObject(InfSectionCommonDef):
                                 pass
 
             if InfBianryVerItemObj != None:
-                if self.Binaries.has_key((InfBianryVerItemObj)):
+                if (InfBianryVerItemObj) in self.Binaries:
                     BinariesList = self.Binaries[InfBianryVerItemObj]
                     BinariesList.append((InfBianryVerItemObj, VerComment))
                     self.Binaries[InfBianryVerItemObj] = BinariesList
@@ -522,7 +522,7 @@ class InfBinariesObject(InfSectionCommonDef):
 #                                pass
 
             if InfBianryCommonItemObj != None:
-                if self.Binaries.has_key((InfBianryCommonItemObj)):
+                if (InfBianryCommonItemObj) in self.Binaries:
                     BinariesList = self.Binaries[InfBianryCommonItemObj]
                     BinariesList.append((InfBianryCommonItemObj, ItemComment))
                     self.Binaries[InfBianryCommonItemObj] = BinariesList
@@ -673,7 +673,7 @@ class InfBinariesObject(InfSectionCommonDef):
 #                                        pass
 
                     if InfBianryUiItemObj != None:
-                        if self.Binaries.has_key((InfBianryUiItemObj)):
+                        if (InfBianryUiItemObj) in self.Binaries:
                             BinariesList = self.Binaries[InfBianryUiItemObj]
                             BinariesList.append((InfBianryUiItemObj, UiComment))
                             self.Binaries[InfBianryUiItemObj] = BinariesList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
index 1d074ee638fd..2c9ea6ccd2cc 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
@@ -957,7 +957,7 @@ class InfDefObject(InfSectionCommonDef):
                     SpecValue = Name[Name.find("SPEC") + len("SPEC"):].strip()
                     Name = "SPEC"
                     Value = SpecValue + " = " + Value
-                if self.Defines.has_key(ArchListString):
+                if ArchListString in self.Defines:
                     DefineList = self.Defines[ArchListString]                 
                     LineInfo[0] = InfDefMemberObj.CurrentLine.GetFileName()
                     LineInfo[1] = InfDefMemberObj.CurrentLine.GetLineNo()
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
index 23125552e06d..e546127bd3e6 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
@@ -338,7 +338,7 @@ class InfGuidObject():
                                 #
                                 pass
                                                 
-            if self.Guids.has_key((InfGuidItemObj)):           
+            if (InfGuidItemObj) in self.Guids:
                 GuidList = self.Guids[InfGuidItemObj]                 
                 GuidList.append(InfGuidItemObj)
                 self.Guids[InfGuidItemObj] = GuidList
@@ -350,4 +350,4 @@ class InfGuidObject():
         return True
     
     def GetGuid(self):
-        return self.Guids
\ No newline at end of file
+        return self.Guids
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
index b18c4c381bc0..4c3233b73552 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
@@ -238,7 +238,7 @@ class InfLibraryClassObject():
                 LibItemObj.SetVersion(LibItem[1])
                 LibItemObj.SetSupArchList(__SupArchList)
 
-            if self.LibraryClasses.has_key((LibItemObj)):
+            if (LibItemObj) in self.LibraryClasses:
                 LibraryList = self.LibraryClasses[LibItemObj]
                 LibraryList.append(LibItemObj)
                 self.LibraryClasses[LibItemObj] = LibraryList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
index 74099e208860..081e69db5feb 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
@@ -114,7 +114,7 @@ class InfSpecialCommentObject(InfSectionCommonDef):
            Type == DT.TYPE_EVENT_SECTION or \
            Type == DT.TYPE_BOOTMODE_SECTION:
             for Item in SepcialSectionList:
-                if self.SpecialComments.has_key(Type):           
+                if Type in self.SpecialComments:
                     ObjList = self.SpecialComments[Type]
                     ObjList.append(Item)
                     self.SpecialComments[Type] = ObjList
@@ -145,4 +145,4 @@ def ErrorInInf(Message=None, ErrorCode=None, LineInfo=None, RaiseError=True):
                  File=LineInfo[0], 
                  Line=LineInfo[1],
                  ExtraData=LineInfo[2], 
-                 RaiseError=RaiseError)
\ No newline at end of file
+                 RaiseError=RaiseError)
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
index 37399134dbf3..164260ffbfef 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
@@ -171,7 +171,7 @@ class InfPackageObject():
                                 #
                                 pass
                                             
-            if self.Packages.has_key((PackageItemObj)):   
+            if (PackageItemObj) in self.Packages:
                 PackageList = self.Packages[PackageItemObj]
                 PackageList.append(PackageItemObj)
                 self.Packages[PackageItemObj] = PackageList
@@ -184,4 +184,4 @@ class InfPackageObject():
     
     def GetPackages(self, Arch = None):
         if Arch == None:
-            return self.Packages
\ No newline at end of file
+            return self.Packages
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
index 7b07036f91c2..b5ca01f148d1 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
@@ -411,7 +411,7 @@ class InfPcdObject():
                 else:
                     PcdItemObj.SetSupportArchList(SupArchList)
 
-                if self.Pcds.has_key((PcdTypeItem, PcdItemObj)):
+                if (PcdTypeItem, PcdItemObj) in self.Pcds:
                     PcdsList = self.Pcds[PcdTypeItem, PcdItemObj]
                     PcdsList.append(PcdItemObj)
                     self.Pcds[PcdTypeItem, PcdItemObj] = PcdsList
@@ -456,7 +456,7 @@ class InfPcdObject():
                                                       PackageInfo)
 
             PcdTypeItem = KeysList[0][0]
-            if self.Pcds.has_key((PcdTypeItem, PcdItemObj)):
+            if (PcdTypeItem, PcdItemObj) in self.Pcds:
                 PcdsList = self.Pcds[PcdTypeItem, PcdItemObj]
                 PcdsList.append(PcdItemObj)
                 self.Pcds[PcdTypeItem, PcdItemObj] = PcdsList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
index 4df62bb459ff..53e1f342cac5 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
@@ -327,7 +327,7 @@ class InfPpiObject():
                                 # 
                                 pass
             
-            if self.Ppis.has_key((InfPpiItemObj)):           
+            if (InfPpiItemObj) in self.Ppis:
                 PpiList = self.Ppis[InfPpiItemObj]
                 PpiList.append(InfPpiItemObj)
                 self.Ppis[InfPpiItemObj] = PpiList
@@ -340,4 +340,4 @@ class InfPpiObject():
         
     
     def GetPpi(self):
-        return self.Ppis
\ No newline at end of file
+        return self.Ppis
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
index c94e53c98f87..e552cb627b5e 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
@@ -296,7 +296,7 @@ class InfProtocolObject():
                                 #
                                 pass      
                                       
-            if self.Protocols.has_key((InfProtocolItemObj)):           
+            if (InfProtocolItemObj) in self.Protocols:
                 ProcotolList = self.Protocols[InfProtocolItemObj]
                 ProcotolList.append(InfProtocolItemObj)
                 self.Protocols[InfProtocolItemObj] = ProcotolList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
index 9988f8ecfeed..93ae21e16b76 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
@@ -224,7 +224,7 @@ class InfSourcesObject(InfSectionCommonDef):
                 
             ItemObj.SetSupArchList(__SupArchList) 
                                                                                                       
-            if self.Sources.has_key((ItemObj)):           
+            if (ItemObj) in self.Sources:
                 SourceContent = self.Sources[ItemObj]
                 SourceContent.append(ItemObj)
                 self.Sources[ItemObj] = SourceContent
@@ -237,4 +237,3 @@ class InfSourcesObject(InfSectionCommonDef):
      
     def GetSources(self):
         return self.Sources
-    
\ No newline at end of file
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
index 27a1c6ad25a0..f9db2944a495 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
@@ -103,7 +103,7 @@ class InfUserExtensionObject():
 #                            Line=LineNo,
 #                            ExtraData=None)
             
-            if self.UserExtension.has_key(IdContentItem):           
+            if IdContentItem in self.UserExtension:
                 #
                 # Each UserExtensions section header must have a unique set 
                 # of UserId, IdString and Arch values.
@@ -130,4 +130,4 @@ class InfUserExtensionObject():
         return True
         
     def GetUserExtension(self):
-        return self.UserExtension
\ No newline at end of file
+        return self.UserExtension
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 45cdcc89f168..216a25446f23 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -75,7 +75,7 @@ TmpTableDict = {}
 #   Otherwise, False is returned
 #
 def IsToolInPath(tool):
-    if os.environ.has_key('PATHEXT'):
+    if 'PATHEXT' in os.environ:
         extns = os.environ['PATHEXT'].split(os.path.pathsep)
     else:
         extns = ('',)
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 07/15] BaseTools: Import reduce() from functools
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (5 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 06/15] BaseTools: Remove the deprecated hash_key() Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 08/15] BaseTools: Replace StandardError with Expression Gary Lin
                   ` (8 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

In python3, reduce() is not a built-in function anymore.
Import it from "functools" to be compatible with python 3.

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/AutoGen/GenPcdDb.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 875ee5895fd9..891ec5fd95c7 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -21,6 +21,7 @@ from ValidCheckingInfoObject import VAR_VALID_OBJECT_FACTORY
 from Common.VariableAttributes import VariableAttributes
 import copy
 from struct import unpack
+from functools import reduce
 
 DATABASE_VERSION = 7
 
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 08/15] BaseTools: Replace StandardError with Expression
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (6 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 07/15] BaseTools: Import reduce() from functools Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 09/15] BaseTools: Remove types.TypeType Gary Lin
                   ` (7 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

StandardError has been removed from python 3.
Replace it with Exception.
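
The pattern looks like this (the helper below is hypothetical):

    def Commit():
        pass                       # hypothetical stand-in for the real work

    try:
        Commit()
    # python2-only:   except StandardError:
    except Exception:              # accepted by both python 2 and python 3
        print("recovery failed")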

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/UPT/UPT.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 3296ee3d3d8f..84b3c353201a 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -310,7 +310,7 @@ def Main():
             else:
                 GlobalData.gDB.Commit()
                 Mgr.commit()
-        except StandardError:
+        except Exception:
             Logger.Quiet(ST.MSG_RECOVER_FAIL)
         GlobalData.gDB.CloseDb()
 
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 09/15] BaseTools: Remove types.TypeType
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (7 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 08/15] BaseTools: Replace StandardError with Expression Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 10/15] BaseTools: Refactor python raise statement Gary Lin
                   ` (6 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

"types.TypeType" is now an alias of the built-in "type" and is not
compatible with python 3.
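
A minimal illustration of the replacement:

    class Foo(object):
        pass

    # python2-only:   isinstance(Foo, types.TypeType)
    # python2/python3-compatible:
    assert isinstance(Foo, type)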

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Tests/TestTools.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 1202289616ee..1cf2ce13be2b 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -24,7 +24,6 @@ import random
 import shutil
 import subprocess
 import sys
-import types
 import unittest
 
 TestsDir = os.path.realpath(os.path.split(sys.argv[0])[0])
@@ -43,7 +42,7 @@ if PythonSourceDir not in sys.path:
 def MakeTheTestSuite(localItems):
     tests = []
     for name, item in localItems.iteritems():
-        if isinstance(item, types.TypeType):
+        if isinstance(item, type):
             if issubclass(item, unittest.TestCase):
                 tests.append(unittest.TestLoader().loadTestsFromTestCase(item))
             elif issubclass(item, unittest.TestSuite):
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 10/15] BaseTools: Refactor python raise statement
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (8 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 09/15] BaseTools: Remove types.TypeType Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 11/15] BaseTools: Adjust the spaces around commas and colons Gary Lin
                   ` (5 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Make "raise" to be compatible with python3.

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/gcc/mingw-gcc-build.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index f7d0308bd9fa..49ff656c066f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -508,8 +508,8 @@ class Builder:
             f = open(logFile, "w")
             f.write(output)
             f.close()
-            raise Exception, 'Failed to %s %s\n' % (stage, module) + \
-                'See output log at %s' % self.config.Relative(logFile)
+            raise Exception('Failed to %s %s\n' % (stage, module) + \
+                'See output log at %s' % self.config.Relative(logFile))
         else:
             print('[done]')
 
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 11/15] BaseTools: Adjust the spaces around commas and colons
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (9 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 10/15] BaseTools: Refactor python raise statement Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 12/15] BaseTools: Migrate to the new octal literal Gary Lin
                   ` (4 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Based on "futurize -f lib2to3.fixes.fix_ws_comma"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                      |   2 +-
 BaseTools/Scripts/BinToPcd.py                                          |   8 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py                            |   6 +-
 BaseTools/Scripts/PatchCheck.py                                        |   2 +-
 BaseTools/Scripts/RunMakefile.py                                       |   2 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |  52 +++----
 BaseTools/Source/Python/AutoGen/GenMake.py                             |   4 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                            | 114 +++++++-------
 BaseTools/Source/Python/AutoGen/GenVar.py                              | 164 ++++++++++----------
 BaseTools/Source/Python/BPDG/GenVpd.py                                 |  12 +-
 BaseTools/Source/Python/Common/DataType.py                             |   4 +-
 BaseTools/Source/Python/Common/DscClassObject.py                       |   2 +-
 BaseTools/Source/Python/Common/EdkIIWorkspace.py                       |   2 +-
 BaseTools/Source/Python/Common/Expression.py                           |   6 +-
 BaseTools/Source/Python/Common/FdfParserLite.py                        |  12 +-
 BaseTools/Source/Python/Common/Misc.py                                 |  46 +++---
 BaseTools/Source/Python/Common/RangeExpression.py                      |   4 +-
 BaseTools/Source/Python/Common/String.py                               |   2 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                          |  10 +-
 BaseTools/Source/Python/Ecc/CParser.py                                 |  28 ++--
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py        |  14 +-
 BaseTools/Source/Python/Eot/CParser.py                                 |  28 ++--
 BaseTools/Source/Python/Eot/c.py                                       |  20 +--
 BaseTools/Source/Python/GenFds/AprioriSection.py                       |   2 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                          |   2 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                           |   6 +-
 BaseTools/Source/Python/GenFds/Fd.py                                   |   6 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                            |  26 ++--
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                      |  12 +-
 BaseTools/Source/Python/GenFds/Fv.py                                   |   4 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                       |   4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |   4 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py           |   2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |   2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |   2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |   6 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                       |  12 +-
 BaseTools/Source/Python/Trim/Trim.py                                   |  14 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                    |   8 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                              |   4 +-
 BaseTools/Source/Python/UPT/Library/String.py                          |   2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py              |   2 +-
 BaseTools/Source/Python/UPT/UPT.py                                     |   2 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                           |   2 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                           |  24 +--
 BaseTools/Source/Python/Workspace/DecBuildData.py                      |  14 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                      | 110 ++++++-------
 BaseTools/Source/Python/Workspace/MetaFileParser.py                    |  36 ++---
 BaseTools/Source/Python/Workspace/MetaFileTable.py                     |   6 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                   |   2 +-
 BaseTools/Source/Python/build/BuildReport.py                           |   8 +-
 BaseTools/Source/Python/build/build.py                                 |   8 +-
 BaseTools/Tests/TestTools.py                                           |   2 +-
 BaseTools/gcc/mingw-gcc-build.py                                       |   2 +-
 54 files changed, 440 insertions(+), 440 deletions(-)

diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index dd66c7111ac0..b226499e8450 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -48,7 +48,7 @@ def ConvertCygPathToDos(CygPath):
     DosPath = CygPath
   
   # pipes.quote will add the extra \\ for us.
-  return DosPath.replace('/','\\')
+  return DosPath.replace('/', '\\')
 
 
 # we receive our options as a list, but we will be passing them to the shell as a line
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 7d8cd0a5cc25..0997ee408c05 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -42,13 +42,13 @@ if __name__ == '__main__':
     return Value
 
   def ValidatePcdName (Argument):
-    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
       Message = '%s is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>' % (Argument)
       raise argparse.ArgumentTypeError(Message)
     return Argument
 
   def ValidateGuidName (Argument):
-    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+    if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
       Message = '%s is not a valid GUID C name' % (Argument)
       raise argparse.ArgumentTypeError(Message)
     return Argument
@@ -71,7 +71,7 @@ if __name__ == '__main__':
                       help = "Output filename for PCD value or PCD statement")
   parser.add_argument("-p", "--pcd", dest = 'PcdName', type = ValidatePcdName,
                       help = "Name of the PCD in the form <PcdTokenSpaceGuidCName>.<PcdCName>")
-  parser.add_argument("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD','HII'],
+  parser.add_argument("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD', 'HII'],
                       help = "PCD statement type (HII or VPD).  Default is standard.")
   parser.add_argument("-m", "--max-size", dest = 'MaxSize', type = ValidateUnsignedInteger,
                       help = "Maximum size of the PCD.  Ignored with --type HII.")
@@ -85,7 +85,7 @@ if __name__ == '__main__':
                       help = "Increase output messages")
   parser.add_argument("-q", "--quiet", dest = 'Quiet', action = "store_true",
                       help = "Reduce output messages")
-  parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0,10)), default = 0,
+  parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0, 10)), default = 0,
                       help = "Set debug level")
 
   #
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 3bc6a8897bcc..c9158800668d 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -190,7 +190,7 @@ def processLine(newline):
 
     driverPrefixLen = len("Driver - ")
     # get driver name
-    if cmp(newline[0:driverPrefixLen],"Driver - ") == 0 :
+    if cmp(newline[0:driverPrefixLen], "Driver - ") == 0 :
         driverlineList = newline.split(" ")
         driverName = driverlineList[2]
         #print "Checking : ", driverName
@@ -213,7 +213,7 @@ def processLine(newline):
         else :
             symbolsFile.symbolsTable[driverName].parse_debug_file (driverName, pdbName)
 
-    elif cmp(newline,"") == 0 :
+    elif cmp(newline, "") == 0 :
         driverName = ""
 
     # check entry line
@@ -226,7 +226,7 @@ def processLine(newline):
         rvaName = ""
         symbolName = ""
 
-    if cmp(rvaName,"") == 0 :
+    if cmp(rvaName, "") == 0 :
         return newline
     else :
         return newline + symbolName
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 51d4adf08b60..211db566cb25 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -286,7 +286,7 @@ class GitDiffCheck:
         if self.state == START:
             if line.startswith('diff --git'):
                 self.state = PRE_PATCH
-                self.filename = line[13:].split(' ',1)[0]
+                self.filename = line[13:].split(' ', 1)[0]
                 self.is_newfile = False
                 self.force_crlf = not self.filename.endswith('.sh')
             elif len(line.rstrip()) != 0:
diff --git a/BaseTools/Scripts/RunMakefile.py b/BaseTools/Scripts/RunMakefile.py
index 48bc198c7671..6d0c4553c9eb 100644
--- a/BaseTools/Scripts/RunMakefile.py
+++ b/BaseTools/Scripts/RunMakefile.py
@@ -149,7 +149,7 @@ if __name__ == '__main__':
     for Item in gArgs.Define:
       if '=' not in Item[0]:
         continue
-      Item = Item[0].split('=',1)
+      Item = Item[0].split('=', 1)
       CommandLine.append('%s="%s"' % (Item[0], Item[1]))
   CommandLine.append('EXTRA_FLAGS="%s"' % (gArgs.Remaining))
   CommandLine.append(gArgs.BuildType)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index ab429f3e6d46..e8914df7310c 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -46,7 +46,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
 import InfSectionParser
 import datetime
 import hashlib
-from GenVar import VariableMgr,var_info
+from GenVar import VariableMgr, var_info
 
 ## Regular expression for splitting Dependency Expression string into tokens
 gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
@@ -1361,7 +1361,7 @@ class PlatformAutoGen(AutoGen):
             ShareFixedAtBuildPcdsSameValue = {} 
             for Module in LibAuto._ReferenceModules:                
                 for Pcd in Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds:
-                    key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))  
+                    key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
                     if key not in FixedAtBuildPcds:
                         ShareFixedAtBuildPcdsSameValue[key] = True
                         FixedAtBuildPcds[key] = Pcd.DefaultValue
@@ -1369,11 +1369,11 @@ class PlatformAutoGen(AutoGen):
                         if FixedAtBuildPcds[key] != Pcd.DefaultValue:
                             ShareFixedAtBuildPcdsSameValue[key] = False      
             for Pcd in LibAuto.FixedAtBuildPcds:
-                key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
-                if (Pcd.TokenCName,Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
+                key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+                if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
                     continue
                 else:
-                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)]
+                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                     if DscPcd.Type != "FixedAtBuild":
                         continue
                 if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:                    
@@ -1393,12 +1393,12 @@ class PlatformAutoGen(AutoGen):
                         break
 
 
-        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(),self.DscBuildDataObj._GetSkuIds())
+        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj._GetSkuIds())
         VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
         VariableInfo.SetVpdRegionOffset(VpdRegionBase)
         Index = 0
         for Pcd in DynamicPcdSet:
-            pcdname = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
+            pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
             for SkuName in Pcd.SkuInfoList:
                 Sku = Pcd.SkuInfoList[SkuName]
                 SkuId = Sku.SkuId
@@ -1408,11 +1408,11 @@ class PlatformAutoGen(AutoGen):
                     VariableGuidStructure = Sku.VariableGuidValue
                     VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
                     for StorageName in Sku.DefaultStoreDict:
-                        VariableInfo.append_variable(var_info(Index,pcdname,StorageName,SkuName, StringToArray(Sku.VariableName),VariableGuid, Sku.VariableAttribute , Sku.HiiDefaultValue,Sku.DefaultStoreDict[StorageName],Pcd.DatumType))
+                        VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName], Pcd.DatumType))
             Index += 1
         return VariableInfo
 
-    def UpdateNVStoreMaxSize(self,OrgVpdFile):
+    def UpdateNVStoreMaxSize(self, OrgVpdFile):
         if self.VariableInfo:
             VpdMapFilePath = os.path.join(self.BuildDir, "FV", "%s.map" % self.Platform.VpdToolGuid)
             PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
@@ -1425,7 +1425,7 @@ class PlatformAutoGen(AutoGen):
                 else:
                     EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
 
-                NvStoreOffset = int(NvStoreOffset,16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
+                NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
                 default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get("DEFAULT")
                 maxsize = self.VariableInfo.VpdRegionSize  - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
                 var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
@@ -1663,7 +1663,7 @@ class PlatformAutoGen(AutoGen):
                     VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
 
             #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
-            PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer","gEfiMdeModulePkgTokenSpaceGuid"))
+            PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
             if PcdNvStoreDfBuffer:
                 self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
                 vardump = self.VariableInfo.dump()
@@ -1690,10 +1690,10 @@ class PlatformAutoGen(AutoGen):
                         PcdValue = DefaultSku.DefaultValue
                         if PcdValue not in SkuValueMap:
                             SkuValueMap[PcdValue] = []
-                            VpdFile.Add(Pcd, 'DEFAULT',DefaultSku.VpdOffset)
+                            VpdFile.Add(Pcd, 'DEFAULT', DefaultSku.VpdOffset)
                         SkuValueMap[PcdValue].append(DefaultSku)
 
-                    for (SkuName,Sku) in Pcd.SkuInfoList.items():
+                    for (SkuName, Sku) in Pcd.SkuInfoList.items():
                         Sku.VpdOffset = Sku.VpdOffset.strip()
                         PcdValue = Sku.DefaultValue
                         if PcdValue == "":
@@ -1719,7 +1719,7 @@ class PlatformAutoGen(AutoGen):
                                     EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
                         if PcdValue not in SkuValueMap:
                             SkuValueMap[PcdValue] = []
-                            VpdFile.Add(Pcd, SkuName,Sku.VpdOffset)
+                            VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
                         SkuValueMap[PcdValue].append(Sku)
                         # if the offset of a VPD is *, then it need to be fixed up by third party tool.
                         if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
@@ -1753,9 +1753,9 @@ class PlatformAutoGen(AutoGen):
                                 PcdValue = DefaultSku.DefaultValue
                                 if PcdValue not in SkuValueMap:
                                     SkuValueMap[PcdValue] = []
-                                    VpdFile.Add(DscPcdEntry, 'DEFAULT',Sku.VpdOffset)
+                                    VpdFile.Add(DscPcdEntry, 'DEFAULT', Sku.VpdOffset)
                                 SkuValueMap[PcdValue].append(Sku)
-                            for (SkuName,Sku) in DscPcdEntry.SkuInfoList.items():
+                            for (SkuName, Sku) in DscPcdEntry.SkuInfoList.items():
                                 Sku.VpdOffset = Sku.VpdOffset.strip() 
                                 
                                 # Need to iterate DEC pcd information to get the value & datumtype
@@ -1805,7 +1805,7 @@ class PlatformAutoGen(AutoGen):
                                             EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
                                 if PcdValue not in SkuValueMap:
                                     SkuValueMap[PcdValue] = []
-                                    VpdFile.Add(DscPcdEntry, SkuName,Sku.VpdOffset)
+                                    VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
                                 SkuValueMap[PcdValue].append(Sku)
                                 if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
                                     NeedProcessVpdMapFile = True 
@@ -1871,17 +1871,17 @@ class PlatformAutoGen(AutoGen):
         self._DynamicPcdList.extend(list(UnicodePcdArray))
         self._DynamicPcdList.extend(list(HiiPcdArray))
         self._DynamicPcdList.extend(list(OtherPcdArray))
-        allskuset = [(SkuName,Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName,Sku) in pcd.SkuInfoList.items()]
+        allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
         for pcd in self._DynamicPcdList:
             if len(pcd.SkuInfoList) == 1:
-                for (SkuName,SkuId) in allskuset:
-                    if type(SkuId) in (str,unicode) and eval(SkuId) == 0 or SkuId == 0:
+                for (SkuName, SkuId) in allskuset:
+                    if type(SkuId) in (str, unicode) and eval(SkuId) == 0 or SkuId == 0:
                         continue
                     pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList['DEFAULT'])
                     pcd.SkuInfoList[SkuName].SkuId = SkuId
         self.AllPcdList = self._NonDynamicPcdList + self._DynamicPcdList
 
-    def FixVpdOffset(self,VpdFile ):
+    def FixVpdOffset(self, VpdFile):
         FvPath = os.path.join(self.BuildDir, "FV")
         if not os.path.exists(FvPath):
             try:
@@ -2143,7 +2143,7 @@ class PlatformAutoGen(AutoGen):
         if self._NonDynamicPcdDict:
             return self._NonDynamicPcdDict
         for Pcd in self.NonDynamicPcdList:
-            self._NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)] = Pcd
+            self._NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
         return self._NonDynamicPcdDict
 
     ## Get list of non-dynamic PCDs
@@ -3967,7 +3967,7 @@ class ModuleAutoGen(AutoGen):
         try:
             fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
         except:
-            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
+            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
 
         # Use a instance of StringIO to cache data
         fStringIO = StringIO('')  
@@ -4003,7 +4003,7 @@ class ModuleAutoGen(AutoGen):
             fInputfile.write (fStringIO.getvalue())
         except:
             EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
-                            "file been locked or using by other applications." %UniVfrOffsetFileName,None)
+                            "file been locked or using by other applications." %UniVfrOffsetFileName, None)
 
         fStringIO.close ()
         fInputfile.close ()
@@ -4448,7 +4448,7 @@ class ModuleAutoGen(AutoGen):
     def CopyBinaryFiles(self):
         for File in self.Module.Binaries:
             SrcPath = File.Path
-            DstPath = os.path.join(self.OutputDir , os.path.basename(SrcPath))
+            DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
             CopyLongFilePath(SrcPath, DstPath)
     ## Create autogen code for the module and its dependent libraries
     #
@@ -4599,7 +4599,7 @@ class ModuleAutoGen(AutoGen):
         if SrcTimeStamp > DstTimeStamp:
             return False
 
-        with open(self.GetTimeStampPath(),'r') as f:
+        with open(self.GetTimeStampPath(), 'r') as f:
             for source in f:
                 source = source.rstrip('\n')
                 if not os.path.exists(source):
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 8891b1b97d23..eb56d0e7c5a3 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -746,7 +746,7 @@ cleanlib:
                         if CmdName == 'Trim':
                             SecDepsFileList.append(os.path.join('$(DEBUG_DIR)', os.path.basename(OutputFile).replace('offset', 'efi')))
                         if OutputFile.endswith('.ui') or OutputFile.endswith('.ver'):
-                            SecDepsFileList.append(os.path.join('$(MODULE_DIR)','$(MODULE_FILE)'))
+                            SecDepsFileList.append(os.path.join('$(MODULE_DIR)', '$(MODULE_FILE)'))
                         self.FfsOutputFileList.append((OutputFile, ' '.join(SecDepsFileList), SecCmdStr))
                         if len(SecDepsFileList) > 0:
                             self.ParseSecCmd(SecDepsFileList, CmdTuple)
@@ -864,7 +864,7 @@ cleanlib:
                         for Target in BuildTargets:
                             for i, SingleCommand in enumerate(BuildTargets[Target].Commands):
                                 if FlagDict[Flag]['Macro'] in SingleCommand:
-                                    BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)','').replace(FlagDict[Flag]['Macro'], RespMacro)
+                                    BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)', '').replace(FlagDict[Flag]['Macro'], RespMacro)
         return RespDict
 
     def ProcessBuildTargetList(self):
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 891ec5fd95c7..f158b999d89b 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -270,7 +270,7 @@ def toHex(s):
             hv = '0'+hv
         lst.append(hv)
     if lst:
-        return reduce(lambda x,y:x+y, lst)
+        return reduce(lambda x, y:x+y, lst)
     else:
         return 'empty'
 ## DbItemList
@@ -650,22 +650,22 @@ def StringArrayToList(StringArray):
 #
 def GetTokenTypeValue(TokenType):
     TokenTypeDict = {
-        "PCD_TYPE_SHIFT":28,
-        "PCD_TYPE_DATA":(0x0 << 28),
-        "PCD_TYPE_HII":(0x8 << 28),
-        "PCD_TYPE_VPD":(0x4 << 28),
+        "PCD_TYPE_SHIFT": 28,
+        "PCD_TYPE_DATA": (0x0 << 28),
+        "PCD_TYPE_HII": (0x8 << 28),
+        "PCD_TYPE_VPD": (0x4 << 28),
 #        "PCD_TYPE_SKU_ENABLED":(0x2 << 28),
-        "PCD_TYPE_STRING":(0x1 << 28),
+        "PCD_TYPE_STRING": (0x1 << 28),
 
-        "PCD_DATUM_TYPE_SHIFT":24,
-        "PCD_DATUM_TYPE_POINTER":(0x0 << 24),
-        "PCD_DATUM_TYPE_UINT8":(0x1 << 24),
-        "PCD_DATUM_TYPE_UINT16":(0x2 << 24),
-        "PCD_DATUM_TYPE_UINT32":(0x4 << 24),
-        "PCD_DATUM_TYPE_UINT64":(0x8 << 24),
+        "PCD_DATUM_TYPE_SHIFT": 24,
+        "PCD_DATUM_TYPE_POINTER": (0x0 << 24),
+        "PCD_DATUM_TYPE_UINT8": (0x1 << 24),
+        "PCD_DATUM_TYPE_UINT16": (0x2 << 24),
+        "PCD_DATUM_TYPE_UINT32": (0x4 << 24),
+        "PCD_DATUM_TYPE_UINT64": (0x8 << 24),
 
-        "PCD_DATUM_TYPE_SHIFT2":20,
-        "PCD_DATUM_TYPE_UINT8_BOOLEAN":(0x1 << 20 | 0x1 << 24),
+        "PCD_DATUM_TYPE_SHIFT2": 20,
+        "PCD_DATUM_TYPE_UINT8_BOOLEAN": (0x1 << 20 | 0x1 << 24),
         }
     return eval(TokenType, TokenTypeDict)
 
@@ -719,7 +719,7 @@ def BuildExDataBase(Dict):
     DbPcdCNameTable = DbStringItemList(0, RawDataList = PcdCNameTableValue, LenList = PcdCNameLen)
     
     PcdNameOffsetTable = Dict['PCD_NAME_OFFSET']
-    DbPcdNameOffsetTable = DbItemList(4,RawDataList = PcdNameOffsetTable)
+    DbPcdNameOffsetTable = DbItemList(4, RawDataList = PcdNameOffsetTable)
     
     SizeTableValue = zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_TABLE_CURRENT_LENGTH'])
     DbSizeTableValue = DbSizeTableItemList(2, RawDataList = SizeTableValue)
@@ -754,16 +754,16 @@ def BuildExDataBase(Dict):
     PcdTokenNumberMap = Dict['PCD_ORDER_TOKEN_NUMBER_MAP']
  
     DbNameTotle = ["SkuidValue",  "InitValueUint64", "VardefValueUint64", "InitValueUint32", "VardefValueUint32", "VpdHeadValue", "ExMapTable",
-               "LocalTokenNumberTable", "GuidTable", "StringHeadValue",  "PcdNameOffsetTable","VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
+               "LocalTokenNumberTable", "GuidTable", "StringHeadValue",  "PcdNameOffsetTable", "VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
                "SizeTableValue", "InitValueUint16", "VardefValueUint16", "InitValueUint8", "VardefValueUint8", "InitValueBoolean",
                "VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"]
  
     DbTotal = [SkuidValue,  InitValueUint64, VardefValueUint64, InitValueUint32, VardefValueUint32, VpdHeadValue, ExMapTable,
-               LocalTokenNumberTable, GuidTable, StringHeadValue,  PcdNameOffsetTable,VariableTable, StringTableLen, PcdTokenTable,PcdCNameTable,
+               LocalTokenNumberTable, GuidTable, StringHeadValue,  PcdNameOffsetTable, VariableTable, StringTableLen, PcdTokenTable, PcdCNameTable,
                SizeTableValue, InitValueUint16, VardefValueUint16, InitValueUint8, VardefValueUint8, InitValueBoolean,
                VardefValueBoolean, UnInitValueUint64, UnInitValueUint32, UnInitValueUint16, UnInitValueUint8, UnInitValueBoolean]
     DbItemTotal = [DbSkuidValue,  DbInitValueUint64, DbVardefValueUint64, DbInitValueUint32, DbVardefValueUint32, DbVpdHeadValue, DbExMapTable,
-               DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue,  DbPcdNameOffsetTable,DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
+               DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue,  DbPcdNameOffsetTable, DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
                DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, DbInitValueUint8, DbVardefValueUint8, DbInitValueBoolean,
                DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean]
     
@@ -822,7 +822,7 @@ def BuildExDataBase(Dict):
                         DbOffset += (8 - DbOffset % 8)
             else:
                 assert(False)
-            if isinstance(VariableRefTable[0],list):
+            if isinstance(VariableRefTable[0], list):
                 DbOffset += skuindex * 4   
             skuindex += 1
             if DbIndex >= InitTableNum:
@@ -984,72 +984,72 @@ def CreatePcdDataBase(PcdDBData):
     basedata = {}
     if not PcdDBData:
         return ""
-    for skuname,skuid in PcdDBData:
-        if len(PcdDBData[(skuname,skuid)][1]) != len(PcdDBData[("DEFAULT","0")][1]):
+    for skuname, skuid in PcdDBData:
+        if len(PcdDBData[(skuname, skuid)][1]) != len(PcdDBData[("DEFAULT", "0")][1]):
             EdkLogger.ERROR("The size of each sku in one pcd are not same")
-    for skuname,skuid in PcdDBData:
+    for skuname, skuid in PcdDBData:
         if skuname == "DEFAULT":
             continue
-        delta[(skuname,skuid)] = [(index,data,hex(data)) for index,data in enumerate(PcdDBData[(skuname,skuid)][1]) if PcdDBData[(skuname,skuid)][1][index] != PcdDBData[("DEFAULT","0")][1][index]]
-        basedata[(skuname,skuid)] = [(index,PcdDBData[("DEFAULT","0")][1][index],hex(PcdDBData[("DEFAULT","0")][1][index])) for index,data in enumerate(PcdDBData[(skuname,skuid)][1]) if PcdDBData[(skuname,skuid)][1][index] != PcdDBData[("DEFAULT","0")][1][index]]
-    databasebuff = PcdDBData[("DEFAULT","0")][0]
+        delta[(skuname, skuid)] = [(index, data, hex(data)) for index, data in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[("DEFAULT", "0")][1][index]]
+        basedata[(skuname, skuid)] = [(index, PcdDBData[("DEFAULT", "0")][1][index], hex(PcdDBData[("DEFAULT", "0")][1][index])) for index, data in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[("DEFAULT", "0")][1][index]]
+    databasebuff = PcdDBData[("DEFAULT", "0")][0]
 
-    for skuname,skuid in delta:
+    for skuname, skuid in delta:
         # 8 byte align
         if len(databasebuff) % 8 > 0:
             for i in range(8 - (len(databasebuff) % 8)):
-                databasebuff += pack("=B",0)
+                databasebuff += pack("=B", 0)
         databasebuff += pack('=Q', int(skuid))
         databasebuff += pack('=Q', 0)
-        databasebuff += pack('=L', 8+8+4+4*len(delta[(skuname,skuid)]))
-        for item in delta[(skuname,skuid)]:
-            databasebuff += pack("=L",item[0])
-            databasebuff = databasebuff[:-1] + pack("=B",item[1])
+        databasebuff += pack('=L', 8+8+4+4*len(delta[(skuname, skuid)]))
+        for item in delta[(skuname, skuid)]:
+            databasebuff += pack("=L", item[0])
+            databasebuff = databasebuff[:-1] + pack("=B", item[1])
     totallen = len(databasebuff)
-    totallenbuff = pack("=L",totallen)
+    totallenbuff = pack("=L", totallen)
     newbuffer = databasebuff[:32]
     for i in range(4):
         newbuffer += totallenbuff[i]
-    for i in range(36,totallen):
+    for i in range(36, totallen):
         newbuffer += databasebuff[i]
 
     return newbuffer
 def CreateVarCheckBin(VarCheckTab):
-    return VarCheckTab[('DEFAULT',"0")]
+    return VarCheckTab[('DEFAULT', "0")]
 def CreateAutoGen(PcdDriverAutoGenData):
     autogenC = TemplateString()
-    for skuname,skuid in PcdDriverAutoGenData:
+    for skuname, skuid in PcdDriverAutoGenData:
         autogenC.Append("//SKUID: %s" % skuname)
-        autogenC.Append(PcdDriverAutoGenData[(skuname,skuid)][1].String)
-    return (PcdDriverAutoGenData[(skuname,skuid)][0],autogenC)
-def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform,Phase):
-    def prune_sku(pcd,skuname):
+        autogenC.Append(PcdDriverAutoGenData[(skuname, skuid)][1].String)
+    return (PcdDriverAutoGenData[(skuname, skuid)][0], autogenC)
+def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
+    def prune_sku(pcd, skuname):
         new_pcd = copy.deepcopy(pcd)
         new_pcd.SkuInfoList = {skuname:pcd.SkuInfoList[skuname]}
         return new_pcd
     DynamicPcds = Platform.DynamicPcdList
-    DynamicPcdSet_Sku = {(SkuName,skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName,skuobj) in pcd.SkuInfoList.items() }
-    for skuname,skuid in DynamicPcdSet_Sku:
-        DynamicPcdSet_Sku[(skuname,skuid)] = [prune_sku(pcd,skuname) for pcd in DynamicPcds]
+    DynamicPcdSet_Sku = {(SkuName, skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName, skuobj) in pcd.SkuInfoList.items() }
+    for skuname, skuid in DynamicPcdSet_Sku:
+        DynamicPcdSet_Sku[(skuname, skuid)] = [prune_sku(pcd, skuname) for pcd in DynamicPcds]
     PcdDBData = {}
     PcdDriverAutoGenData = {}
     VarCheckTableData = {}
     if DynamicPcdSet_Sku:
-        for skuname,skuid in DynamicPcdSet_Sku:
-            AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform,DynamicPcdSet_Sku[(skuname,skuid)], Phase)
+        for skuname, skuid in DynamicPcdSet_Sku:
+            AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdSet_Sku[(skuname, skuid)], Phase)
             final_data = ()
             for item in PcdDbBuffer:
-                final_data += unpack("B",item)
-            PcdDBData[(skuname,skuid)] = (PcdDbBuffer, final_data)
-            PcdDriverAutoGenData[(skuname,skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
-            VarCheckTableData[(skuname,skuid)] = VarCheckTab
+                final_data += unpack("B", item)
+            PcdDBData[(skuname, skuid)] = (PcdDbBuffer, final_data)
+            PcdDriverAutoGenData[(skuname, skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
+            VarCheckTableData[(skuname, skuid)] = VarCheckTab
         if Platform.Platform.VarCheckFlag:
             dest = os.path.join(Platform.BuildDir, 'FV')
             VarCheckTable = CreateVarCheckBin(VarCheckTableData)
             VarCheckTable.dump(dest, Phase)
         AdditionalAutoGenH, AdditionalAutoGenC =  CreateAutoGen(PcdDriverAutoGenData)
     else:
-        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform,{}, Phase)
+        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, {}, Phase)
 
     PcdDbBuffer = CreatePcdDataBase(PcdDBData)
     return AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer
@@ -1090,20 +1090,20 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
 
     Dict['PCD_INFO_FLAG'] = Platform.Platform.PcdInfoFlag
 
-    for DatumType in ['UINT64','UINT32','UINT16','UINT8','BOOLEAN', "VOID*"]:
+    for DatumType in ['UINT64', 'UINT32', 'UINT16', 'UINT8', 'BOOLEAN', "VOID*"]:
         Dict['VARDEF_CNAME_' + DatumType] = []
         Dict['VARDEF_GUID_' + DatumType]  = []
         Dict['VARDEF_SKUID_' + DatumType] = []
         Dict['VARDEF_VALUE_' + DatumType] = []
         Dict['VARDEF_DB_VALUE_' + DatumType] = []
-        for Init in ['INIT','UNINIT']:
+        for Init in ['INIT', 'UNINIT']:
             Dict[Init+'_CNAME_DECL_' + DatumType]   = []
             Dict[Init+'_GUID_DECL_' + DatumType]    = []
             Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
             Dict[Init+'_VALUE_' + DatumType]        = []
             Dict[Init+'_DB_VALUE_'+DatumType] = []
             
-    for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']:
+    for Type in ['STRING_HEAD', 'VPD_HEAD', 'VARIABLE_HEAD']:
         Dict[Type + '_CNAME_DECL']   = []
         Dict[Type + '_GUID_DECL']    = []
         Dict[Type + '_NUMSKUS_DECL'] = []
@@ -1271,7 +1271,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                         Dict['STRING_TABLE_INDEX'].append('')
                     else:
                         Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
-                    VarNameSize = len(VariableNameStructure.replace(',',' ').split())
+                    VarNameSize = len(VariableNameStructure.replace(',', ' ').split())
                     Dict['STRING_TABLE_LENGTH'].append(VarNameSize )
                     Dict['STRING_TABLE_VALUE'].append(VariableNameStructure)
                     StringHeadOffsetList.append(str(StringTableSize) + 'U')
@@ -1279,7 +1279,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     VarStringDbOffsetList.append(StringTableSize)
                     Dict['STRING_DB_VALUE'].append(VarStringDbOffsetList)
                     StringTableIndex += 1
-                    StringTableSize += len(VariableNameStructure.replace(',',' ').split())
+                    StringTableSize += len(VariableNameStructure.replace(',', ' ').split())
                 VariableHeadStringIndex = 0
                 for Index in range(Dict['STRING_TABLE_VALUE'].index(VariableNameStructure)):
                     VariableHeadStringIndex += Dict['STRING_TABLE_LENGTH'][Index]
@@ -1318,7 +1318,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     elif Pcd.DatumType in ("UINT32", "UINT16", "UINT8"):
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
                     elif Pcd.DatumType == "BOOLEAN":
-                        if eval(Sku.HiiDefaultValue) in [1,0]:
+                        if eval(Sku.HiiDefaultValue) in [1, 0]:
                             Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str(eval(Sku.HiiDefaultValue)) + "U")
                     else:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
@@ -1368,7 +1368,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                         Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
                     if Sku.DefaultValue[0] == 'L':
                         DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
-                        Size = len(DefaultValueBinStructure.replace(',',' ').split())
+                        Size = len(DefaultValueBinStructure.replace(',', ' ').split())
                         Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
                     elif Sku.DefaultValue[0] == '"':
                         DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
@@ -1684,7 +1684,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
 
 #     print Phase
     Buffer = BuildExDataBase(Dict)
-    return AutoGenH, AutoGenC, Buffer,VarCheckTab
+    return AutoGenH, AutoGenC, Buffer, VarCheckTab
 
 def GetOrderedDynamicPcdList(DynamicPcdList, PcdTokenNumberList):
     ReorderedDyPcdList = [None for i in range(len(DynamicPcdList))]
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index d668c1edadbb..5a914eb7ffc0 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -15,7 +15,7 @@
 # Import Modules
 #
 from builtins import range
-from struct import pack,unpack
+from struct import pack, unpack
 import collections
 import copy
 from Common.VariableAttributes import VariableAttributes
@@ -48,7 +48,7 @@ def PackGUID(Guid):
     return GuidBuffer
 
 class VariableMgr(object):
-    def __init__(self, DefaultStoreMap,SkuIdMap):
+    def __init__(self, DefaultStoreMap, SkuIdMap):
         self.VarInfo = []
         self.DefaultStoreMap = DefaultStoreMap
         self.SkuIdMap = SkuIdMap
@@ -58,19 +58,19 @@ class VariableMgr(object):
         self.VarDefaultBuff = None
         self.VarDeltaBuff = None
 
-    def append_variable(self,uefi_var):
+    def append_variable(self, uefi_var):
         self.VarInfo.append(uefi_var)
 
-    def SetVpdRegionMaxSize(self,maxsize):
+    def SetVpdRegionMaxSize(self, maxsize):
         self.VpdRegionSize = maxsize
 
-    def SetVpdRegionOffset(self,vpdoffset):
+    def SetVpdRegionOffset(self, vpdoffset):
         self.VpdRegionOffset = vpdoffset
 
-    def PatchNVStoreDefaultMaxSize(self,maxsize):
+    def PatchNVStoreDefaultMaxSize(self, maxsize):
         if not self.NVHeaderBuff:
             return ""
-        self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q",maxsize)
+        self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q", maxsize)
         default_var_bin = self.format_data(self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
         value_str = "{"
         default_var_bin_strip = [ data.strip("""'""") for data in default_var_bin]
@@ -86,7 +86,7 @@ class VariableMgr(object):
         for item in self.VarInfo:
             if item.pcdindex not in indexedvarinfo:
                 indexedvarinfo[item.pcdindex] = dict()
-            indexedvarinfo[item.pcdindex][(item.skuname,item.defaultstoragename)] = item
+            indexedvarinfo[item.pcdindex][(item.skuname, item.defaultstoragename)] = item
 
         for index in indexedvarinfo:
             sku_var_info = indexedvarinfo[index]
@@ -94,44 +94,44 @@ class VariableMgr(object):
             default_data_buffer = ""
             others_data_buffer = ""
             tail = None
-            default_sku_default = indexedvarinfo.get(index).get(("DEFAULT","STANDARD"))
+            default_sku_default = indexedvarinfo.get(index).get(("DEFAULT", "STANDARD"))
 
-            if default_sku_default.data_type not in ["UINT8","UINT16","UINT32","UINT64","BOOLEAN"]:
+            if default_sku_default.data_type not in ["UINT8", "UINT16", "UINT32", "UINT64", "BOOLEAN"]:
                 var_max_len = max([len(var_item.default_value.split(",")) for var_item in sku_var_info.values()])
                 if len(default_sku_default.default_value.split(",")) < var_max_len:
                     tail = ",".join([ "0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(",")))])
 
-            default_data_buffer = self.PACK_VARIABLES_DATA(default_sku_default.default_value,default_sku_default.data_type,tail)
+            default_data_buffer = self.PACK_VARIABLES_DATA(default_sku_default.default_value, default_sku_default.data_type, tail)
 
             default_data_array = ()
             for item in default_data_buffer:
-                default_data_array += unpack("B",item)
+                default_data_array += unpack("B", item)
 
-            if ("DEFAULT","STANDARD") not in var_data:
-                var_data[("DEFAULT","STANDARD")] = collections.OrderedDict()
-            var_data[("DEFAULT","STANDARD")][index] = (default_data_buffer,sku_var_info[("DEFAULT","STANDARD")])
+            if ("DEFAULT", "STANDARD") not in var_data:
+                var_data[("DEFAULT", "STANDARD")] = collections.OrderedDict()
+            var_data[("DEFAULT", "STANDARD")][index] = (default_data_buffer, sku_var_info[("DEFAULT", "STANDARD")])
 
-            for (skuid,defaultstoragename) in indexedvarinfo.get(index):
+            for (skuid, defaultstoragename) in indexedvarinfo.get(index):
                 tail = None
-                if (skuid,defaultstoragename) == ("DEFAULT","STANDARD"):
+                if (skuid, defaultstoragename) == ("DEFAULT", "STANDARD"):
                     continue
-                other_sku_other = indexedvarinfo.get(index).get((skuid,defaultstoragename))
+                other_sku_other = indexedvarinfo.get(index).get((skuid, defaultstoragename))
 
-                if default_sku_default.data_type not in ["UINT8","UINT16","UINT32","UINT64","BOOLEAN"]:
+                if default_sku_default.data_type not in ["UINT8", "UINT16", "UINT32", "UINT64", "BOOLEAN"]:
                     if len(other_sku_other.default_value.split(",")) < var_max_len:
                         tail = ",".join([ "0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(",")))])
 
-                others_data_buffer = self.PACK_VARIABLES_DATA(other_sku_other.default_value,other_sku_other.data_type,tail)
+                others_data_buffer = self.PACK_VARIABLES_DATA(other_sku_other.default_value, other_sku_other.data_type, tail)
 
                 others_data_array = ()
                 for item in others_data_buffer:
-                    others_data_array += unpack("B",item)
+                    others_data_array += unpack("B", item)
 
                 data_delta = self.calculate_delta(default_data_array, others_data_array)
 
-                if (skuid,defaultstoragename) not in var_data:
-                    var_data[(skuid,defaultstoragename)] = collections.OrderedDict()
-                var_data[(skuid,defaultstoragename)][index] = (data_delta,sku_var_info[(skuid,defaultstoragename)])
+                if (skuid, defaultstoragename) not in var_data:
+                    var_data[(skuid, defaultstoragename)] = collections.OrderedDict()
+                var_data[(skuid, defaultstoragename)][index] = (data_delta, sku_var_info[(skuid, defaultstoragename)])
         return var_data
 
     def new_process_varinfo(self):
@@ -141,17 +141,17 @@ class VariableMgr(object):
         if not var_data:
             return []
 
-        pcds_default_data = var_data.get(("DEFAULT","STANDARD"),{})
+        pcds_default_data = var_data.get(("DEFAULT", "STANDARD"), {})
         NvStoreDataBuffer = ""
         var_data_offset = collections.OrderedDict()
         offset = NvStorageHeaderSize
-        for default_data,default_info in pcds_default_data.values():
+        for default_data, default_info in pcds_default_data.values():
             var_name_buffer = self.PACK_VARIABLE_NAME(default_info.var_name)
 
             vendorguid = default_info.var_guid.split('-')
 
             if default_info.var_attribute:
-                var_attr_value,_ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
+                var_attr_value, _ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
             else:
                 var_attr_value = 0x07
 
@@ -170,22 +170,22 @@ class VariableMgr(object):
         nv_default_part = self.AlignData(self.PACK_DEFAULT_DATA(0, 0, self.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
 
         data_delta_structure_buffer = ""
-        for skuname,defaultstore in var_data:
-            if (skuname,defaultstore) == ("DEFAULT","STANDARD"):
+        for skuname, defaultstore in var_data:
+            if (skuname, defaultstore) == ("DEFAULT", "STANDARD"):
                 continue
-            pcds_sku_data = var_data.get((skuname,defaultstore))
+            pcds_sku_data = var_data.get((skuname, defaultstore))
             delta_data_set = []
             for pcdindex in pcds_sku_data:
                 offset = var_data_offset[pcdindex]
-                delta_data,_ = pcds_sku_data[pcdindex]
+                delta_data, _ = pcds_sku_data[pcdindex]
                 delta_data = [(item[0] + offset, item[1]) for item in delta_data]
                 delta_data_set.extend(delta_data)
 
-            data_delta_structure_buffer += self.AlignData(self.PACK_DELTA_DATA(skuname,defaultstore,delta_data_set), 8)
+            data_delta_structure_buffer += self.AlignData(self.PACK_DELTA_DATA(skuname, defaultstore, delta_data_set), 8)
 
         size = len(nv_default_part + data_delta_structure_buffer) + 16
         maxsize = self.VpdRegionSize if self.VpdRegionSize else size
-        NV_Store_Default_Header = self.PACK_NV_STORE_DEFAULT_HEADER(size,maxsize)
+        NV_Store_Default_Header = self.PACK_NV_STORE_DEFAULT_HEADER(size, maxsize)
 
         self.NVHeaderBuff =  NV_Store_Default_Header
         self.VarDefaultBuff =nv_default_part
@@ -193,14 +193,14 @@ class VariableMgr(object):
         return self.format_data(NV_Store_Default_Header + nv_default_part + data_delta_structure_buffer)
 
 
-    def format_data(self,data):
+    def format_data(self, data):
 
         return  [hex(item) for item in self.unpack_data(data)]
 
-    def unpack_data(self,data):
+    def unpack_data(self, data):
         final_data = ()
         for item in data:
-            final_data += unpack("B",item)
+            final_data += unpack("B", item)
         return final_data
 
     def calculate_delta(self, default, theother):
@@ -209,7 +209,7 @@ class VariableMgr(object):
         data_delta = []
         for i in range(len(default)):
             if default[i] != theother[i]:
-                data_delta.append((i,theother[i]))
+                data_delta.append((i, theother[i]))
         return data_delta
 
     def dump(self):
@@ -223,40 +223,40 @@ class VariableMgr(object):
             return value_str
         return ""
 
-    def PACK_VARIABLE_STORE_HEADER(self,size):
+    def PACK_VARIABLE_STORE_HEADER(self, size):
         #Signature: gEfiVariableGuid
         Guid = "{ 0xddcf3616, 0x3275, 0x4164, { 0x98, 0xb6, 0xfe, 0x85, 0x70, 0x7f, 0xfe, 0x7d }}"
         Guid = GuidStructureStringToGuidString(Guid)
         GuidBuffer = PackGUID(Guid.split('-'))
 
-        SizeBuffer = pack('=L',size)
-        FormatBuffer = pack('=B',0x5A)
-        StateBuffer = pack('=B',0xFE)
-        reservedBuffer = pack('=H',0)
-        reservedBuffer += pack('=L',0)
+        SizeBuffer = pack('=L', size)
+        FormatBuffer = pack('=B', 0x5A)
+        StateBuffer = pack('=B', 0xFE)
+        reservedBuffer = pack('=H', 0)
+        reservedBuffer += pack('=L', 0)
 
         return GuidBuffer + SizeBuffer + FormatBuffer + StateBuffer + reservedBuffer
 
-    def PACK_NV_STORE_DEFAULT_HEADER(self,size,maxsize):
-        Signature = pack('=B',ord('N'))
-        Signature += pack("=B",ord('S'))
-        Signature += pack("=B",ord('D'))
-        Signature += pack("=B",ord('B'))
+    def PACK_NV_STORE_DEFAULT_HEADER(self, size, maxsize):
+        Signature = pack('=B', ord('N'))
+        Signature += pack("=B", ord('S'))
+        Signature += pack("=B", ord('D'))
+        Signature += pack("=B", ord('B'))
 
-        SizeBuffer = pack("=L",size)
-        MaxSizeBuffer = pack("=Q",maxsize)
+        SizeBuffer = pack("=L", size)
+        MaxSizeBuffer = pack("=Q", maxsize)
 
         return Signature + SizeBuffer + MaxSizeBuffer
 
-    def PACK_VARIABLE_HEADER(self,attribute,namesize,datasize,vendorguid):
+    def PACK_VARIABLE_HEADER(self, attribute, namesize, datasize, vendorguid):
 
-        Buffer = pack('=H',0x55AA) # pack StartID
-        Buffer += pack('=B',0x3F)  # pack State
-        Buffer += pack('=B',0)     # pack reserved
+        Buffer = pack('=H', 0x55AA) # pack StartID
+        Buffer += pack('=B', 0x3F)  # pack State
+        Buffer += pack('=B', 0)     # pack reserved
 
-        Buffer += pack('=L',attribute)
-        Buffer += pack('=L',namesize)
-        Buffer += pack('=L',datasize)
+        Buffer += pack('=L', attribute)
+        Buffer += pack('=L', namesize)
+        Buffer += pack('=L', datasize)
 
         Buffer += PackGUID(vendorguid)
 
@@ -267,63 +267,63 @@ class VariableMgr(object):
         data_len = 0
         if data_type == "VOID*":
             for value_char in var_value.strip("{").strip("}").split(","):
-                Buffer += pack("=B",int(value_char,16))
+                Buffer += pack("=B", int(value_char, 16))
             data_len += len(var_value.split(","))
             if tail:
                 for value_char in tail.split(","):
-                    Buffer += pack("=B",int(value_char,16))
+                    Buffer += pack("=B", int(value_char, 16))
                 data_len += len(tail.split(","))
         elif data_type == "BOOLEAN":
-            Buffer += pack("=B",True) if var_value.upper() == "TRUE" else pack("=B",False)
+            Buffer += pack("=B", True) if var_value.upper() == "TRUE" else pack("=B", False)
             data_len += 1
         elif data_type  == "UINT8":
-            Buffer += pack("=B",GetIntegerValue(var_value))
+            Buffer += pack("=B", GetIntegerValue(var_value))
             data_len += 1
         elif data_type == "UINT16":
-            Buffer += pack("=H",GetIntegerValue(var_value))
+            Buffer += pack("=H", GetIntegerValue(var_value))
             data_len += 2
         elif data_type == "UINT32":
-            Buffer += pack("=L",GetIntegerValue(var_value))
+            Buffer += pack("=L", GetIntegerValue(var_value))
             data_len += 4
         elif data_type == "UINT64":
-            Buffer += pack("=Q",GetIntegerValue(var_value))
+            Buffer += pack("=Q", GetIntegerValue(var_value))
             data_len += 8
 
         return Buffer
 
-    def PACK_DEFAULT_DATA(self, defaultstoragename,skuid,var_value):
+    def PACK_DEFAULT_DATA(self, defaultstoragename, skuid, var_value):
         Buffer = ""
-        Buffer += pack("=L",4+8+8)
-        Buffer += pack("=Q",int(skuid))
-        Buffer += pack("=Q",int(defaultstoragename))
+        Buffer += pack("=L", 4+8+8)
+        Buffer += pack("=Q", int(skuid))
+        Buffer += pack("=Q", int(defaultstoragename))
 
         for item in var_value:
-            Buffer += pack("=B",item)
+            Buffer += pack("=B", item)
 
-        Buffer = pack("=L",len(Buffer)+4) + Buffer
+        Buffer = pack("=L", len(Buffer)+4) + Buffer
 
         return Buffer
 
-    def GetSkuId(self,skuname):
+    def GetSkuId(self, skuname):
         if skuname not in self.SkuIdMap:
             return None
         return self.SkuIdMap.get(skuname)[0]
-    def GetDefaultStoreId(self,dname):
+    def GetDefaultStoreId(self, dname):
         if dname not in self.DefaultStoreMap:
             return None
         return self.DefaultStoreMap.get(dname)[0]
-    def PACK_DELTA_DATA(self,skuname,defaultstoragename,delta_list):
+    def PACK_DELTA_DATA(self, skuname, defaultstoragename, delta_list):
         skuid = self.GetSkuId(skuname)
         defaultstorageid = self.GetDefaultStoreId(defaultstoragename)
         Buffer = ""
-        Buffer += pack("=L",4+8+8)
-        Buffer += pack("=Q",int(skuid))
-        Buffer += pack("=Q",int(defaultstorageid))
-        for (delta_offset,value) in delta_list:
-            Buffer += pack("=L",delta_offset)
-            Buffer = Buffer[:-1] + pack("=B",value)
+        Buffer += pack("=L", 4+8+8)
+        Buffer += pack("=Q", int(skuid))
+        Buffer += pack("=Q", int(defaultstorageid))
+        for (delta_offset, value) in delta_list:
+            Buffer += pack("=L", delta_offset)
+            Buffer = Buffer[:-1] + pack("=B", value)
 
-        Buffer = pack("=L",len(Buffer) + 4) + Buffer
+        Buffer = pack("=L", len(Buffer) + 4) + Buffer
 
         return Buffer
 
@@ -331,13 +331,13 @@ class VariableMgr(object):
         mybuffer = data
         if (len(data) % align) > 0:
             for i in range(align - (len(data) % align)):
-                mybuffer += pack("=B",0)
+                mybuffer += pack("=B", 0)
 
         return mybuffer
 
     def PACK_VARIABLE_NAME(self, var_name):
         Buffer = ""
         for name_char in var_name.strip("{").strip("}").split(","):
-            Buffer += pack("=B",int(name_char,16))
+            Buffer += pack("=B", int(name_char, 16))
 
         return Buffer
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 33b62011b9d0..17ca9e411061 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -345,7 +345,7 @@ class GenVPD :
                 #
                 # Enhanced for support "|" character in the string.
                 #
-                ValueList = ['', '', '', '','']
+                ValueList = ['', '', '', '', '']
 
                 ValueRe = re.compile(r'\s*L?\".*\|.*\"\s*$')
                 PtrValue = ValueRe.findall(line)
@@ -395,7 +395,7 @@ class GenVPD :
         count = 0
         for line in self.FileLinesList:
             if line != None :
-                PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4],line[5], self.InputFileName)   
+                PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4], line[5], self.InputFileName)
                 # Strip the space char
                 PCD.PcdCName     = PCD.PcdCName.strip(' ')
                 PCD.SkuId        = PCD.SkuId.strip(' ')
@@ -507,10 +507,10 @@ class GenVPD :
         index =0
         for pcd in self.PcdUnknownOffsetList:
             index += 1
-            if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid","PcdNvStoreDefaultValueBuffer")):
+            if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid", "PcdNvStoreDefaultValueBuffer")):
                 if index != len(self.PcdUnknownOffsetList):
                     for i in range(len(self.PcdUnknownOffsetList) - index):
-                        self.PcdUnknownOffsetList[index+i -1 ] , self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i] , self.PcdUnknownOffsetList[index+i -1]
+                        self.PcdUnknownOffsetList[index+i -1 ], self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i], self.PcdUnknownOffsetList[index+i -1]
 
         #
         # Process all Offset value are "*"
@@ -587,7 +587,7 @@ class GenVPD :
                                 eachUnfixedPcd.PcdOffset    = str(hex(LastOffset))
                                 eachUnfixedPcd.PcdBinOffset = LastOffset
                                 # Insert this pcd into fixed offset pcd list.
-                                self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount,eachUnfixedPcd)
+                                self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount, eachUnfixedPcd)
                                 
                                 # Delete the item's offset that has been fixed and added into fixed offset list
                                 self.PcdUnknownOffsetList.pop(countOfUnfixedList)
@@ -672,7 +672,7 @@ class GenVPD :
         for eachPcd in self.PcdFixedOffsetSizeList  :
             # write map file
             try :
-                fMapFile.write("%s | %s | %s | %s | %s  \n" % (eachPcd.PcdCName, eachPcd.SkuId,eachPcd.PcdOffset, eachPcd.PcdSize,eachPcd.PcdUnpackValue))
+                fMapFile.write("%s | %s | %s | %s | %s  \n" % (eachPcd.PcdCName, eachPcd.SkuId, eachPcd.PcdOffset, eachPcd.PcdSize, eachPcd.PcdUnpackValue))
             except:
                 EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
 
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 0bc2306ea61a..d69908dabfec 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -497,8 +497,8 @@ PCDS_DYNAMICEX_DEFAULT = "PcdsDynamicExDefault"
 PCDS_DYNAMICEX_VPD = "PcdsDynamicExVpd"
 PCDS_DYNAMICEX_HII = "PcdsDynamicExHii"
 
-SECTIONS_HAVE_ITEM_PCD = [PCDS_DYNAMIC_DEFAULT.upper(),PCDS_DYNAMIC_VPD.upper(),PCDS_DYNAMIC_HII.upper(), \
-                          PCDS_DYNAMICEX_DEFAULT.upper(),PCDS_DYNAMICEX_VPD.upper(),PCDS_DYNAMICEX_HII.upper()]
+SECTIONS_HAVE_ITEM_PCD = [PCDS_DYNAMIC_DEFAULT.upper(), PCDS_DYNAMIC_VPD.upper(), PCDS_DYNAMIC_HII.upper(), \
+                          PCDS_DYNAMICEX_DEFAULT.upper(), PCDS_DYNAMICEX_VPD.upper(), PCDS_DYNAMICEX_HII.upper()]
 # Section allowed to have items after arch
 SECTIONS_HAVE_ITEM_AFTER_ARCH = [TAB_LIBRARY_CLASSES.upper(), TAB_DEPEX.upper(), TAB_USER_EXTENSIONS.upper(),
                                  PCDS_DYNAMIC_DEFAULT.upper(),
diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/Source/Python/Common/DscClassObject.py
index f42d247cad33..e6abc1f036ac 100644
--- a/BaseTools/Source/Python/Common/DscClassObject.py
+++ b/BaseTools/Source/Python/Common/DscClassObject.py
@@ -1307,7 +1307,7 @@ class Dsc(DscObject):
                 # Parse '!else'
                 #
                 if LineValue.upper().find(TAB_ELSE.upper()) > -1:
-                    Key = IfDefList[-1][0].split(' ' , 1)[0].strip()
+                    Key = IfDefList[-1][0].split(' ', 1)[0].strip()
                     self.InsertConditionalStatement(Filename, FileID, Model, IfDefList, StartLine, Arch)
                     IfDefList.append((Key, StartLine, MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE))
                     continue
diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspace.py b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
index ed85e4ee0b06..52f63ae53df8 100644
--- a/BaseTools/Source/Python/Common/EdkIIWorkspace.py
+++ b/BaseTools/Source/Python/Common/EdkIIWorkspace.py
@@ -114,7 +114,7 @@ class EdkIIWorkspace:
     # @retval string  The full path filename
     #
     def WorkspaceFile(self, FileName):
-        return os.path.realpath(mws.join(self.WorkspaceDir,FileName))
+        return os.path.realpath(mws.join(self.WorkspaceDir, FileName))
 
     ## Convert to a real path filename
     #
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 4b66307b7eb3..7663df7160c1 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -164,7 +164,7 @@ class ValueExpression(object):
                 if Oprand1[0] in ['"', "'"] or Oprand1.startswith('L"') or Oprand1.startswith("L'")or Oprand1.startswith('UINT'):
                     Oprand1, Size = ParseFieldValue(Oprand1)
                 else:
-                    Oprand1,Size = ParseFieldValue('"' + Oprand1 + '"')
+                    Oprand1, Size = ParseFieldValue('"' + Oprand1 + '"')
             if type(Oprand2) == type(''):
                 if Oprand2[0] in ['"', "'"] or Oprand2.startswith('L"') or Oprand2.startswith("L'") or Oprand2.startswith('UINT'):
                     Oprand2, Size = ParseFieldValue(Oprand2)
@@ -487,7 +487,7 @@ class ValueExpression(object):
         IsArray = IsGuid = False
         if len(Token.split(',')) == 11 and len(Token.split(',{')) == 2 \
             and len(Token.split('},')) == 1:
-            HexLen = [11,6,6,5,4,4,4,4,4,4,6]
+            HexLen = [11, 6, 6, 5, 4, 4, 4, 4, 4, 4, 6]
             HexList= Token.split(',')
             if HexList[3].startswith('{') and \
                 not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
@@ -683,7 +683,7 @@ class ValueExpression(object):
     # Parse operator
     def _GetOperator(self):
         self.__SkipWS()
-        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?',':']
+        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?', ':']
 
         self._Token = ''
         Expr = self._Expr[self._Idx:]
diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/Source/Python/Common/FdfParserLite.py
index f2741616c46f..6b7612303730 100644
--- a/BaseTools/Source/Python/Common/FdfParserLite.py
+++ b/BaseTools/Source/Python/Common/FdfParserLite.py
@@ -2341,7 +2341,7 @@ class FdfParser(object):
         
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -2610,7 +2610,7 @@ class FdfParser(object):
         
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -2927,7 +2927,7 @@ class FdfParser(object):
             
         AlignValue = ""
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment At Line ", self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -2992,7 +2992,7 @@ class FdfParser(object):
                 CheckSum = True
     
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                         "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                     raise Warning("Incorrect alignment At Line ", self.FileName, self.CurrentLineNumber)
                 if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
@@ -3067,7 +3067,7 @@ class FdfParser(object):
                 FvImageSectionObj.FvFileType = self.__Token
                 
                 if self.__GetAlignment():
-                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                             "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                         raise Warning("Incorrect alignment At Line ", self.FileName, self.CurrentLineNumber)
                     FvImageSectionObj.Alignment = self.__Token
@@ -3135,7 +3135,7 @@ class FdfParser(object):
                 EfiSectionObj.BuildNum = self.__Token
                 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 49522474b36f..7a7f3f80c65a 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -125,7 +125,7 @@ def _parseForGCC(lines, efifilepath, varnames):
                     if Str:
                         m = re.match('^([\da-fA-Fx]+) +([\da-fA-Fx]+)', Str.strip())
                         if m != None:
-                            varoffset.append((varname, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
+                            varoffset.append((varname, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
 
     if not varoffset:
         return []
@@ -1475,15 +1475,15 @@ def AnalyzePcdExpression(Setting):
     return FieldList
 
 def ParseDevPathValue (Value):
-    DevPathList = [ "Path","HardwarePath","Pci","PcCard","MemoryMapped","VenHw","Ctrl","BMC","AcpiPath","Acpi","PciRoot",
-                    "PcieRoot","Floppy","Keyboard","Serial","ParallelPort","AcpiEx","AcpiExp","AcpiAdr","Msg","Ata","Scsi",
-                    "Fibre","FibreEx","I1394","USB","I2O","Infiniband","VenMsg","VenPcAnsi","VenVt100","VenVt100Plus",
-                    "VenUtf8","UartFlowCtrl","SAS","SasEx","NVMe","UFS","SD","eMMC","DebugPort","MAC","IPv4","IPv6","Uart",
-                    "UsbClass","UsbAudio","UsbCDCControl","UsbHID","UsbImage","UsbPrinter","UsbMassStorage","UsbHub",
-                    "UsbCDCData","UsbSmartCard","UsbVideo","UsbDiagnostic","UsbWireless","UsbDeviceFirmwareUpdate",
-                    "UsbIrdaBridge","UsbTestAndMeasurement","UsbWwid","Unit","iSCSI","Vlan","Uri","Bluetooth","Wi-Fi",
-                    "MediaPath","HD","CDROM","VenMedia","Media","Fv","FvFile","Offset","RamDisk","VirtualDisk","VirtualCD",
-                    "PersistentVirtualDisk","PersistentVirtualCD","BbsPath","BBS","Sata" ]
+    DevPathList = [ "Path", "HardwarePath", "Pci", "PcCard", "MemoryMapped", "VenHw", "Ctrl", "BMC", "AcpiPath", "Acpi", "PciRoot",
+                    "PcieRoot", "Floppy", "Keyboard", "Serial", "ParallelPort", "AcpiEx", "AcpiExp", "AcpiAdr", "Msg", "Ata", "Scsi",
+                    "Fibre", "FibreEx", "I1394", "USB", "I2O", "Infiniband", "VenMsg", "VenPcAnsi", "VenVt100", "VenVt100Plus",
+                    "VenUtf8", "UartFlowCtrl", "SAS", "SasEx", "NVMe", "UFS", "SD", "eMMC", "DebugPort", "MAC", "IPv4", "IPv6", "Uart",
+                    "UsbClass", "UsbAudio", "UsbCDCControl", "UsbHID", "UsbImage", "UsbPrinter", "UsbMassStorage", "UsbHub",
+                    "UsbCDCData", "UsbSmartCard", "UsbVideo", "UsbDiagnostic", "UsbWireless", "UsbDeviceFirmwareUpdate",
+                    "UsbIrdaBridge", "UsbTestAndMeasurement", "UsbWwid", "Unit", "iSCSI", "Vlan", "Uri", "Bluetooth", "Wi-Fi",
+                    "MediaPath", "HD", "CDROM", "VenMedia", "Media", "Fv", "FvFile", "Offset", "RamDisk", "VirtualDisk", "VirtualCD",
+                    "PersistentVirtualDisk", "PersistentVirtualCD", "BbsPath", "BBS", "Sata" ]
     if '\\' in Value:
         Value.replace('\\', '/').replace(' ', '')
     for Item in Value.split('/'):
@@ -1665,7 +1665,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
 #         Value, Size = ParseFieldValue(Value)
         if Size:
             try:
-                int(Size,16) if Size.upper().startswith("0X") else int(Size)
+                int(Size, 16) if Size.upper().startswith("0X") else int(Size)
             except:
                 IsValid = False
                 Size = -1
@@ -1694,7 +1694,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
 
         if Size:
             try:
-                int(Size,16) if Size.upper().startswith("0X") else int(Size)
+                int(Size, 16) if Size.upper().startswith("0X") else int(Size)
             except:
                 IsValid = False
                 Size = -1
@@ -1716,7 +1716,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
             IsValid = (len(FieldList) <= 3)
         if Size:
             try:
-                int(Size,16) if Size.upper().startswith("0X") else int(Size)
+                int(Size, 16) if Size.upper().startswith("0X") else int(Size)
             except:
                 IsValid = False
                 Size = -1
@@ -1920,7 +1920,7 @@ def ConvertStringToByteArray(Value):
 
     Value = eval(Value)         # translate escape character
     NewValue = '{'
-    for Index in range(0,len(Value)):
+    for Index in range(0, len(Value)):
         if Unicode:
             NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ','
         else:
@@ -2164,28 +2164,28 @@ class PeImageClass():
         return Value
 
 class DefaultStore():
-    def __init__(self,DefaultStores ):
+    def __init__(self, DefaultStores):
 
         self.DefaultStores = DefaultStores
-    def DefaultStoreID(self,DefaultStoreName):
-        for key,value in self.DefaultStores.items():
+    def DefaultStoreID(self, DefaultStoreName):
+        for key, value in self.DefaultStores.items():
             if value == DefaultStoreName:
                 return key
         return None
     def GetDefaultDefault(self):
         if not self.DefaultStores or "0" in self.DefaultStores:
-            return "0",TAB_DEFAULT_STORES_DEFAULT
+            return "0", TAB_DEFAULT_STORES_DEFAULT
         else:
             minvalue = min([int(value_str) for value_str in self.DefaultStores.keys()])
             return (str(minvalue), self.DefaultStores[str(minvalue)])
-    def GetMin(self,DefaultSIdList):
+    def GetMin(self, DefaultSIdList):
         if not DefaultSIdList:
             return "STANDARD"
         storeidset = {storeid for storeid, storename in self.DefaultStores.values() if storename in DefaultSIdList}
         if not storeidset:
             return ""
         minid = min(storeidset )
-        for sid,name in self.DefaultStores.values():
+        for sid, name in self.DefaultStores.values():
             if sid == minid:
                 return name
 class SkuClass():
@@ -2200,7 +2200,7 @@ class SkuClass():
 
         for SkuName in SkuIds:
             SkuId = SkuIds[SkuName][0]
-            skuid_num = int(SkuId,16) if SkuId.upper().startswith("0X") else int(SkuId)
+            skuid_num = int(SkuId, 16) if SkuId.upper().startswith("0X") else int(SkuId)
             if skuid_num > 0xFFFFFFFFFFFFFFFF:
                 EdkLogger.error("build", PARAMETER_INVALID,
                             ExtraData = "SKU-ID [%s] value %s exceeds the max value of UINT64"
@@ -2249,9 +2249,9 @@ class SkuClass():
             self.__SkuInherit = {}
             for item in self.SkuData.values():
                 self.__SkuInherit[item[1]]=item[2] if item[2] else "DEFAULT"
-        return self.__SkuInherit.get(skuname,"DEFAULT")
+        return self.__SkuInherit.get(skuname, "DEFAULT")
 
-    def GetSkuChain(self,sku):
+    def GetSkuChain(self, sku):
         if sku == "DEFAULT":
             return ["DEFAULT"]
         skulist = [sku]
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 4357f240f423..496961554e87 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -176,7 +176,7 @@ class EQOperatorObject(object):
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
         rangeContainer = RangeContainer()
-        rangeContainer.push(RangeObject(int(Operand) , int(Operand)))
+        rangeContainer.push(RangeObject(int(Operand), int(Operand)))
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1   
     
@@ -473,7 +473,7 @@ class RangeExpression(object):
 
     # [!]*A
     def _RelExpr(self):
-        if self._IsOperator(["NOT" , "LE", "GE", "LT", "GT", "EQ", "XOR"]):
+        if self._IsOperator(["NOT", "LE", "GE", "LT", "GT", "EQ", "XOR"]):
             Token = self._Token
             Val = self._NeExpr()
             try:
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index e6c7a3b74ee1..358e7b8d7c31 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -739,7 +739,7 @@ def SplitString(String):
 # @param StringList:  A list for strings to be converted
 #
 def ConvertToSqlString(StringList):
-    return map(lambda s: s.replace("'", "''") , StringList)
+    return map(lambda s: s.replace("'", "''"), StringList)
 
 ## Convert To Sql String
 #
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 84dd7ac563dd..d59697c64b68 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -89,7 +89,7 @@ class VpdInfoFile:
     #
     #  @param offset integer value for VPD's offset in specific SKU.
     #
-    def Add(self, Vpd, skuname,Offset):
+    def Add(self, Vpd, skuname, Offset):
         if (Vpd == None):
             EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
         
@@ -141,7 +141,7 @@ class VpdInfoFile:
                 if PcdValue == "" :
                     PcdValue  = Pcd.DefaultValue
 
-                Content += "%s.%s|%s|%s|%s|%s  \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname,str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(),PcdValue)
+                Content += "%s.%s|%s|%s|%s|%s  \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname, str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(), PcdValue)
                 i += 1
 
         return SaveFileOnChange(FilePath, Content, False)
@@ -170,8 +170,8 @@ class VpdInfoFile:
             # the line must follow output format defined in BPDG spec.
             #
             try:
-                PcdName, SkuId,Offset, Size, Value = Line.split("#")[0].split("|")
-                PcdName, SkuId,Offset, Size, Value = PcdName.strip(), SkuId.strip(),Offset.strip(), Size.strip(), Value.strip()
+                PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
+                PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
                 TokenSpaceName, PcdTokenName = PcdName.split(".")
             except:
                 EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Fail to parse VPD information file %s" % FilePath)
@@ -180,7 +180,7 @@ class VpdInfoFile:
             
             if (TokenSpaceName, PcdTokenName) not in self._VpdInfo:
                 self._VpdInfo[(TokenSpaceName, PcdTokenName)] = []
-            self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId,Offset, Value))
+            self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId, Offset, Value))
             for VpdObject in self._VpdArray.keys():
                 VpdObjectTokenCName = VpdObject.TokenCName
                 for PcdItem in GlobalData.MixedPcd:
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index d1b6aed71087..66e488d96c29 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -792,10 +792,10 @@ class CParser(Parser):
                 if self.backtracking == 0:
                           
                     if d != None:
-                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
+                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
                     else:
                       self.function_definition_stack[-1].ModifierText = ''
-                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start,declarator1.stop)
+                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
                     self.function_definition_stack[-1].DeclLine = declarator1.start.line
                     self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
                     if a != None:
@@ -929,9 +929,9 @@ class CParser(Parser):
                     if self.backtracking == 0:
                             
                         if b != None:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
                         else:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
                         	  
 
 
@@ -966,7 +966,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if t != None:
-                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
+                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
                         	
 
 
@@ -1410,7 +1410,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if s.stop != None:
-                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
+                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
                         	
 
 
@@ -1425,7 +1425,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if e.stop != None:
-                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
                         	
 
 
@@ -5408,7 +5408,7 @@ class CParser(Parser):
                 if self.failed:
                     return 
                 if self.backtracking == 0:
-                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
+                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
 
                 # C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
                 while True: #loop65
@@ -5508,7 +5508,7 @@ class CParser(Parser):
                         if self.failed:
                             return 
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
 
 
 
@@ -8284,7 +8284,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16391,7 +16391,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                     self.following.append(self.FOLLOW_statement_in_selection_statement2284)
                     self.statement()
@@ -16510,7 +16510,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16542,7 +16542,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16589,7 +16589,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index e04b67732141..145c7435cd12 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -562,7 +562,7 @@ class InfParser(MetaFileParser):
                     NmakeLine = ''
 
             # section content
-            self._ValueList = ['','','']
+            self._ValueList = ['', '', '']
             # parse current line, result will be put in self._ValueList
             self._SectionParser[self._SectionType](self)
             if self._ValueList == None or self._ItemType == MODEL_META_DATA_DEFINE:
@@ -921,7 +921,7 @@ class DscParser(MetaFileParser):
 
     ## Directive statement parser
     def _DirectiveParser(self):
-        self._ValueList = ['','','']
+        self._ValueList = ['', '', '']
         TokenList = GetSplitValueList(self._CurrentLine, ' ', 1)
         self._ValueList[0:len(TokenList)] = TokenList
 
@@ -1111,7 +1111,7 @@ class DscParser(MetaFileParser):
 
     ## Override parent's method since we'll do all macro replacements in parser
     def _GetMacros(self):
-        Macros = dict( [('ARCH','IA32'), ('FAMILY','MSFT'),('TOOL_CHAIN_TAG','VS2008x86'),('TARGET','DEBUG')])
+        Macros = dict( [('ARCH', 'IA32'), ('FAMILY', 'MSFT'), ('TOOL_CHAIN_TAG', 'VS2008x86'), ('TARGET', 'DEBUG')])
         Macros.update(self._FileLocalMacros)
         Macros.update(self._GetApplicableSectionMacro())
         Macros.update(GlobalData.gEdkGlobal)
@@ -1226,7 +1226,7 @@ class DscParser(MetaFileParser):
         self._RawTable.Drop()
         self._Table.Drop()
         for Record in RecordList:
-            EccGlobalData.gDb.TblDsc.Insert(Record[1],Record[2],Record[3],Record[4],Record[5],Record[6],Record[7],Record[8],Record[9],Record[10],Record[11],Record[12],Record[13],Record[14])
+            EccGlobalData.gDb.TblDsc.Insert(Record[1], Record[2], Record[3], Record[4], Record[5], Record[6], Record[7], Record[8], Record[9], Record[10], Record[11], Record[12], Record[13], Record[14])
         GlobalData.gPlatformDefines.update(self._FileLocalMacros)
         self._PostProcessed = True
         self._Content = None
@@ -1247,7 +1247,7 @@ class DscParser(MetaFileParser):
 
     def __RetrievePcdValue(self):
         Records = self._RawTable.Query(MODEL_PCD_FEATURE_FLAG, BelongsToItem=-1.0)
-        for TokenSpaceGuid,PcdName,Value,Dummy2,Dummy3,ID,Line in Records:
+        for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
             Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
             # Only use PCD whose value is straitforward (no macro and PCD)
             if self.SymbolPattern.findall(Value):
@@ -1572,7 +1572,7 @@ class DecParser(MetaFileParser):
                 continue
 
             # section content
-            self._ValueList = ['','','']
+            self._ValueList = ['', '', '']
             self._SectionParser[self._SectionType[0]](self)
             if self._ValueList == None or self._ItemType == MODEL_META_DATA_DEFINE:
                 self._ItemType = -1
@@ -1718,7 +1718,7 @@ class DecParser(MetaFileParser):
                         GuidValue = GuidValue.lstrip(' {')
                         HexList.append('0x' + str(GuidValue[2:]))
                         Index += 1
-            self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2],HexList[3],HexList[4],HexList[5],HexList[6],HexList[7],HexList[8],HexList[9],HexList[10])
+            self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2], HexList[3], HexList[4], HexList[5], HexList[6], HexList[7], HexList[8], HexList[9], HexList[10])
         else:
             EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
                             ExtraData=self._CurrentLine + \
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index d1b6aed71087..66e488d96c29 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -792,10 +792,10 @@ class CParser(Parser):
                 if self.backtracking == 0:
                           
                     if d != None:
-                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
+                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
                     else:
                       self.function_definition_stack[-1].ModifierText = ''
-                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start,declarator1.stop)
+                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
                     self.function_definition_stack[-1].DeclLine = declarator1.start.line
                     self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
                     if a != None:
@@ -929,9 +929,9 @@ class CParser(Parser):
                     if self.backtracking == 0:
                             
                         if b != None:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
                         else:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
                         	  
 
 
@@ -966,7 +966,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if t != None:
-                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
+                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
                         	
 
 
@@ -1410,7 +1410,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if s.stop != None:
-                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
+                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
                         	
 
 
@@ -1425,7 +1425,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if e.stop != None:
-                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
                         	
 
 
@@ -5408,7 +5408,7 @@ class CParser(Parser):
                 if self.failed:
                     return 
                 if self.backtracking == 0:
-                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
+                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
 
                 # C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
                 while True: #loop65
@@ -5508,7 +5508,7 @@ class CParser(Parser):
                         if self.failed:
                             return 
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
 
 
 
@@ -8284,7 +8284,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16391,7 +16391,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                     self.following.append(self.FOLLOW_statement_in_selection_statement2284)
                     self.statement()
@@ -16510,7 +16510,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16542,7 +16542,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16589,7 +16589,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index c70f62f393a9..ceefc952237f 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -128,11 +128,11 @@ def GetIdentifierList():
 
     for pp in FileProfile.PPDirectiveList:
         Type = GetIdType(pp.Content)
-        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0],pp.StartPos[1],pp.EndPos[0],pp.EndPos[1])
+        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
         IdList.append(IdPP)
 
     for ae in FileProfile.AssignmentExpressionList:
-        IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0],ae.StartPos[1],ae.EndPos[0],ae.EndPos[1])
+        IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0], ae.StartPos[1], ae.EndPos[0], ae.EndPos[1])
         IdList.append(IdAE)
 
     FuncDeclPattern = GetFuncDeclPattern()
@@ -154,7 +154,7 @@ def GetIdentifierList():
                     var.Modifier += ' ' + FuncNamePartList[Index]
                     var.Declarator = var.Declarator.lstrip().lstrip(FuncNamePartList[Index])
                     Index += 1
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
             IdList.append(IdVar)
             continue
 
@@ -167,7 +167,7 @@ def GetIdentifierList():
                     var.Modifier += ' ' + Name[LSBPos:]
                     Name = Name[0:LSBPos]
 
-                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
                 IdList.append(IdVar)
         else:
             DeclList = var.Declarator.split('=')
@@ -176,7 +176,7 @@ def GetIdentifierList():
                 LSBPos = var.Declarator.find('[')
                 var.Modifier += ' ' + Name[LSBPos:]
                 Name = Name[0:LSBPos]
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
             IdList.append(IdVar)
 
     for enum in FileProfile.EnumerationDefinitionList:
@@ -184,7 +184,7 @@ def GetIdentifierList():
         RBPos = enum.Content.find('}')
         Name = enum.Content[4:LBPos].strip()
         Value = enum.Content[LBPos+1:RBPos]
-        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0],enum.StartPos[1],enum.EndPos[0],enum.EndPos[1])
+        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
         IdList.append(IdEnum)
 
     for su in FileProfile.StructUnionDefinitionList:
@@ -201,7 +201,7 @@ def GetIdentifierList():
         else:
             Name = su.Content[SkipLen:LBPos].strip()
             Value = su.Content[LBPos+1:RBPos]
-        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0],su.StartPos[1],su.EndPos[0],su.EndPos[1])
+        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
         IdList.append(IdPE)
 
     TdFuncPointerPattern = GetTypedefFuncPointerPattern()
@@ -224,11 +224,11 @@ def GetIdentifierList():
             Name = TmpStr[0:RBPos]
             Value = 'FP' + TmpStr[RBPos + 1:]
 
-        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0],td.StartPos[1],td.EndPos[0],td.EndPos[1])
+        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
         IdList.append(IdTd)
 
     for funcCall in FileProfile.FunctionCallingList:
-        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0],funcCall.StartPos[1],funcCall.EndPos[0],funcCall.EndPos[1])
+        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
         IdList.append(IdFC)
     return IdList
 
@@ -330,7 +330,7 @@ def GetFunctionList():
                 FuncDef.Modifier += ' ' + FuncNamePartList[Index]
                 Index += 1
 
-        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0],FuncDef.StartPos[1],FuncDef.EndPos[0],FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
+        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
         FuncObjList.append(FuncObj)
 
     return FuncObjList
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 27fe2619a35f..b678079b3785 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -23,7 +23,7 @@ import FfsFileStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import AprioriSectionClassObject
 from Common.String import *
-from Common.Misc import SaveFileOnChange,PathClass
+from Common.Misc import SaveFileOnChange, PathClass
 from Common import EdkLogger
 from Common.BuildToolError import *
 
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index 5b806d9e4482..1fa202149b25 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -207,7 +207,7 @@ class CapsulePayload(CapsuleData):
         #
         Guid = self.ImageTypeId.split('-')
         Buffer = pack('=ILHHBBBBBBBBBBBBIIQ',
-                       int(self.Version,16),
+                       int(self.Version, 16),
                        int(Guid[0], 16), 
                        int(Guid[1], 16), 
                        int(Guid[2], 16), 
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index 5029ec7a1823..d24df30cb734 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -130,7 +130,7 @@ class EfiSection (EfiSectionClassObject):
             elif FileList != []:
                 for File in FileList:
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum , Index)
+                    Num = '%s.%d' %(SecNum, Index)
                     OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     VerString = f.read()
@@ -187,7 +187,7 @@ class EfiSection (EfiSectionClassObject):
             elif FileList != []:
                 for File in FileList:
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum , Index)
+                    Num = '%s.%d' %(SecNum, Index)
                     OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     UiString = f.read()
@@ -228,7 +228,7 @@ class EfiSection (EfiSectionClassObject):
                 for File in FileList:
                     """ Copy Map file to FFS output path """
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum , Index)
+                    Num = '%s.%d' %(SecNum, Index)
                     OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + Num + Ffs.SectionSuffix.get(SectionType))
                     File = GenFdsGlobalVariable.MacroExtend(File, Dict)
                     
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index f735d3b5b015..21060625217e 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -136,7 +136,7 @@ class FD(FDClassObject):
             # Call each region's AddToBuffer function
             #
             GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
-            RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict,Flag=Flag)
+            RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict, Flag=Flag)
         #
         # Write the buffer contents to Fd file
         #
@@ -162,7 +162,7 @@ class FD(FDClassObject):
                 if len(RegionObj.RegionDataList) == 1:
                     RegionData = RegionObj.RegionDataList[0]
                     FvList.append(RegionData.upper())
-                    FvAddDict[RegionData.upper()] = (int(self.BaseAddress,16) + \
+                    FvAddDict[RegionData.upper()] = (int(self.BaseAddress, 16) + \
                                                 RegionObj.Offset, RegionObj.Size)
                 else:
                     Offset = RegionObj.Offset
@@ -177,7 +177,7 @@ class FD(FDClassObject):
                             Size = 0
                             for blockStatement in FvObj.BlockSizeList:
                                 Size = Size + blockStatement[0] * blockStatement[1]
-                            FvAddDict[RegionData.upper()] = (int(self.BaseAddress,16) + \
+                            FvAddDict[RegionData.upper()] = (int(self.BaseAddress, 16) + \
                                                              Offset, Size)
                             Offset = Offset + Size
         #
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index d4ba485bcdff..43f849b07172 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -1855,7 +1855,7 @@ class FdfParser:
             return long(
                 ValueExpression(Expr,
                                 self.__CollectMacroPcd()
-                                )(True),0)
+                                )(True), 0)
         except Exception:
             self.SetFileBufferPos(StartPos)
             return None
@@ -2768,7 +2768,7 @@ class FdfParser:
         while True:
             AlignValue = None
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                         "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 #For FFS, Auto is default option same to ""
@@ -2828,7 +2828,7 @@ class FdfParser:
             FfsFileObj.CheckSum = True
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
@@ -2900,7 +2900,7 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -3190,7 +3190,7 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -3583,7 +3583,7 @@ class FdfParser:
         AfileName = self.__Token
         AfileBaseName = os.path.basename(AfileName)
         
-        if os.path.splitext(AfileBaseName)[1]  not in [".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"]:
+        if os.path.splitext(AfileBaseName)[1]  not in [".bin", ".BIN", ".Bin", ".dat", ".DAT", ".Dat", ".data", ".DATA", ".Data"]:
             raise Warning('invalid binary file type, should be one of "bin","BIN","Bin","dat","DAT","Dat","data","DATA","Data"', \
                           self.FileName, self.CurrentLineNumber)
         
@@ -3782,7 +3782,7 @@ class FdfParser:
 
         AlignValue = ""
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
@@ -3832,7 +3832,7 @@ class FdfParser:
 
             SectAlignment = ""
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                         "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
@@ -3912,7 +3912,7 @@ class FdfParser:
                 FvImageSectionObj.FvFileType = self.__Token
 
                 if self.__GetAlignment():
-                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                             "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                         raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                     FvImageSectionObj.Alignment = self.__Token
@@ -3980,7 +3980,7 @@ class FdfParser:
                 EfiSectionObj.BuildNum = self.__Token
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if self.__Token == 'Auto' and (not SectionName == 'PE32') and (not SectionName == 'TE'):
@@ -4720,7 +4720,7 @@ class FdfParser:
                     FvInFdList = self.__GetFvInFd(RefFdName)
                     if FvInFdList != []:
                         for FvNameInFd in FvInFdList:
-                            LogStr += "FD %s contains FV %s\n" % (RefFdName,FvNameInFd)
+                            LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
                             if FvNameInFd not in RefFvStack:
                                 RefFvStack.append(FvNameInFd)
 
@@ -4776,7 +4776,7 @@ class FdfParser:
                         CapInFdList = self.__GetCapInFd(RefFdName)
                         if CapInFdList != []:
                             for CapNameInFd in CapInFdList:
-                                LogStr += "FD %s contains Capsule %s\n" % (RefFdName,CapNameInFd)
+                                LogStr += "FD %s contains Capsule %s\n" % (RefFdName, CapNameInFd)
                                 if CapNameInFd not in RefCapStack:
                                     RefCapStack.append(CapNameInFd)
 
@@ -4787,7 +4787,7 @@ class FdfParser:
                         FvInFdList = self.__GetFvInFd(RefFdName)
                         if FvInFdList != []:
                             for FvNameInFd in FvInFdList:
-                                LogStr += "FD %s contains FV %s\n" % (RefFdName,FvNameInFd)
+                                LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
                                 if FvNameInFd not in RefFvList:
                                     RefFvList.append(FvNameInFd)
 
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index baee54369852..d4354171ab5e 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -429,7 +429,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
         self.__InfParse__(Dict)
         Arch = self.GetCurrentArch()
-        SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir , self.InfFileName);
+        SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName);
         DestFile = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
         
         SrcFileDir = "."
@@ -675,13 +675,13 @@ class FfsInfStatement(FfsInfStatementClassObject):
             Arch = self.CurrentArch
 
         OutputPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
-                                  Arch ,
+                                  Arch,
                                   ModulePath,
                                   FileName,
                                   'OUTPUT'
                                   )
         DebugPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
-                                  Arch ,
+                                  Arch,
                                   ModulePath,
                                   FileName,
                                   'DEBUG'
@@ -943,9 +943,9 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 Sect.FvParentAddr = FvParentAddr
             
             if Rule.KeyStringList != []:
-                SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
+                SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
             else :
-                SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
+                SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
             
             if not HasGeneratedFlag:
                 UniVfrOffsetFileSection = ""    
@@ -1123,7 +1123,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
         try :
             SaveFileOnChange(UniVfrOffsetFileName, fStringIO.getvalue())
         except:
-            EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName,None)
+            EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName, None)
         
         fStringIO.close ()
 
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 615d9e39faf1..c64c0c80e299 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -386,8 +386,8 @@ class FV (FvClassObject):
                     # check if the file path exists or not
                     if not os.path.isfile(FileFullPath):
                         GenFdsGlobalVariable.ErrorLogger("Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
-                    FvExtFile = open (FileFullPath,'rb')
-                    FvExtFile.seek(0,2)
+                    FvExtFile = open (FileFullPath, 'rb')
+                    FvExtFile.seek(0, 2)
                     Size = FvExtFile.tell()
                     if Size >= 0x10000:
                         GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 916ff919176c..ac5d5891df70 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -64,7 +64,7 @@ class FvImageSection(FvImageSectionClassObject):
             for FvFileName in FileList:
                 FvAlignmentValue = 0
                 if os.path.isfile(FvFileName):
-                    FvFileObj = open (FvFileName,'rb')
+                    FvFileObj = open (FvFileName, 'rb')
                     FvFileObj.seek(0)
                     # PI FvHeader is 0x48 byte
                     FvHeaderBuffer = FvFileObj.read(0x48)
@@ -112,7 +112,7 @@ class FvImageSection(FvImageSectionClassObject):
                 if self.FvFileName != None:
                     FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
                     if os.path.isfile(FvFileName):
-                        FvFileObj = open (FvFileName,'rb')
+                        FvFileObj = open (FvFileName, 'rb')
                         FvFileObj.seek(0)
                         # PI FvHeader is 0x48 byte
                         FvHeaderBuffer = FvFileObj.read(0x48)
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 6807ffdd6c3a..97a824e1c64b 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -342,7 +342,7 @@ class GenFdsGlobalVariable:
         for Arch in ArchList:
             GenFdsGlobalVariable.OutputDirDict[Arch] = os.path.normpath(
                 os.path.join(GlobalData.gWorkspace,
-                             WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,GlobalData.gGlobalDefines['TARGET'],
+                             WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'],
                              GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory,
                              GlobalData.gGlobalDefines['TARGET'] +'_' + GlobalData.gGlobalDefines['TOOLCHAIN']))
             GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = os.path.normpath(
@@ -549,7 +549,7 @@ class GenFdsGlobalVariable:
 
         GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
         if MakefilePath:
-            if (tuple(Cmd),tuple(GenFdsGlobalVariable.SecCmdList),tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict.keys():
+            if (tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict.keys():
                 GenFdsGlobalVariable.FfsCmdDict[tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)] = MakefilePath
             GenFdsGlobalVariable.SecCmdList = []
             GenFdsGlobalVariable.CopyList = []
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index 127385228fcf..dbbb4312f47e 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -109,7 +109,7 @@ def _parseForGCC(lines, efifilepath):
                     PcdName = m.groups(0)[0]
                     m = re.match('^([\da-fA-Fx]+) +([\da-fA-Fx]+)', lines[index + 1].strip())
                     if m != None:
-                        bpcds.append((PcdName, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
+                        bpcds.append((PcdName, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
                 
     # get section information from efi file
     efisecs = PeImageClass(efifilepath).SectionHeaderList
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index becf3e8eb9e8..1e07e23baeee 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -89,7 +89,7 @@ if __name__ == '__main__':
   parser.add_argument("--signature-size", dest='SignatureSizeStr', type=str, help="specify the signature size for decode process.")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0, 10)), default=0, help="set debug level")
   parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
 
   #
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 1641968ace0e..7d11758a795f 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -52,7 +52,7 @@ if __name__ == '__main__':
   parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0, 10)), default=0, help="set debug level")
 
   #
   # Parse command line arguments
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 2a19ad973b91..e5f5a38bbc49 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -51,7 +51,7 @@ EFI_HASH_ALGORITHM_SHA256_GUID = uuid.UUID('{51aa59de-fdf2-4ea3-bc63-875fb7842ee
 #     UINT8 Signature[256];
 #   } EFI_CERT_BLOCK_RSA_2048_SHA256;
 #
-EFI_CERT_BLOCK_RSA_2048_SHA256        = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType','PublicKey','Signature'])
+EFI_CERT_BLOCK_RSA_2048_SHA256        = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType', 'PublicKey', 'Signature'])
 EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT = struct.Struct('16s256s256s')
 
 #
@@ -72,7 +72,7 @@ if __name__ == '__main__':
   parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'), help="specify the private key filename.  If not specified, a test signing key is used.")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0,10)), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=list(range(0, 10)), default=0, help="set debug level")
   parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
 
   #
@@ -156,7 +156,7 @@ if __name__ == '__main__':
   PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
   PublicKey = ''
   while len(PublicKeyHexString) > 0:
-    PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2],16))
+    PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2], 16))
     PublicKeyHexString=PublicKeyHexString[2:]
   if Process.returncode != 0:
     sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ebed7a0ea7b8..fe74abb28901 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -59,11 +59,11 @@ class TargetTool():
     def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
         """Convert a text file to a dictionary of (name:value) pairs."""
         try:
-            f = open(FileName,'r')
+            f = open(FileName, 'r')
             for Line in f:
                 if Line.startswith(CommentCharacter) or Line.strip() == '':
                     continue
-                LineList = Line.split(KeySplitCharacter,1)
+                LineList = Line.split(KeySplitCharacter, 1)
                 if len(LineList) >= 2:
                     Key = LineList[0].strip()
                     if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary.keys():
@@ -104,7 +104,7 @@ class TargetTool():
                 if Line.startswith(CommentCharacter) or Line.strip() == '':
                     fw.write(Line)
                 else:
-                    LineList = Line.split(KeySplitCharacter,1)
+                    LineList = Line.split(KeySplitCharacter, 1)
                     if len(LineList) >= 2:
                         Key = LineList[0].strip()
                         if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary.keys():
@@ -203,14 +203,14 @@ def RangeCheckCallback(option, opt_str, value, parser):
         parser.error("Option %s only allows one instance in command line!" % option)
         
 def MyOptionParser():
-    parser = OptionParser(version=__version__,prog="TargetTool.exe",usage=__usage__,description=__copyright__)
-    parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32','X64','IPF','EBC', 'ARM', 'AARCH64','0'], dest="TARGET_ARCH",
+    parser = OptionParser(version=__version__, prog="TargetTool.exe", usage=__usage__, description=__copyright__)
+    parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32', 'X64', 'IPF', 'EBC', 'ARM', 'AARCH64', '0'], dest="TARGET_ARCH",
         help="ARCHS is one of list: IA32, X64, IPF, ARM, AARCH64 or EBC, which replaces target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-p", "--platform", action="callback", type="string", dest="DSCFILE", callback=SingleCheckCallback,
         help="Specify a DSC file, which replace target.txt's ACTIVE_PLATFORM definition. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-c", "--tooldef", action="callback", type="string", dest="TOOL_DEFINITION_FILE", callback=SingleCheckCallback,
         help="Specify the WORKSPACE relative path of tool_def.txt file, which replace target.txt's TOOL_CHAIN_CONF definition. 0 will clear this setting in target.txt and can't combine with other value.")
-    parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG','RELEASE','0'], dest="TARGET",
+    parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG', 'RELEASE', '0'], dest="TARGET",
         help="TARGET is one of list: DEBUG, RELEASE, which replaces target.txt's TARGET definition. To specify more TARGET, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-n", "--tagname", action="callback", type="string", dest="TOOL_CHAIN_TAG", callback=SingleCheckCallback,
         help="Specify the Tool Chain Tagname, which replaces target.txt's TOOL_CHAIN_TAG definition. 0 will clear this setting in target.txt and can't combine with other value.")
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 94f6b1bc707a..af1bf9de3e00 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -261,7 +261,7 @@ def TrimPreprocessedVfr(Source, Target):
     CreateDirectory(os.path.dirname(Target))
     
     try:
-        f = open (Source,'r')
+        f = open (Source, 'r')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
     # read whole file
@@ -310,7 +310,7 @@ def TrimPreprocessedVfr(Source, Target):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        f = open (Target, 'w')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
     f.writelines(Lines)
@@ -407,7 +407,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
     if IncludePathFile:
         try:
             LineNum = 0
-            for Line in open(IncludePathFile,'r'):
+            for Line in open(IncludePathFile, 'r'):
                 LineNum += 1
                 if Line.startswith("/I") or Line.startswith ("-I"):
                     IncludePathList.append(Line[2:].strip())
@@ -425,7 +425,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        f = open (Target, 'w')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
@@ -560,7 +560,7 @@ def TrimEdkSourceCode(Source, Target):
     CreateDirectory(os.path.dirname(Target))
 
     try:
-        f = open (Source,'rb')
+        f = open (Source, 'rb')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
     # read whole file
@@ -568,7 +568,7 @@ def TrimEdkSourceCode(Source, Target):
     f.close()
 
     NewLines = None
-    for Re,Repl in gImportCodePatterns:
+    for Re, Repl in gImportCodePatterns:
         if NewLines == None:
             NewLines = Re.sub(Repl, Lines)
         else:
@@ -579,7 +579,7 @@ def TrimEdkSourceCode(Source, Target):
         return
 
     try:
-        f = open (Target,'wb')
+        f = open (Target, 'wb')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
     f.write(NewLines)
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 3a7c9809e31a..203f973669f3 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -285,8 +285,8 @@ class DependencyRules(object):
                 pass
             DecPath = dirname(DecFile)
             if DecPath.find(WorkSP) > -1:
-                InstallPath = GetRelativePath(DecPath,WorkSP)
-                DecFileRelaPath = GetRelativePath(DecFile,WorkSP)
+                InstallPath = GetRelativePath(DecPath, WorkSP)
+                DecFileRelaPath = GetRelativePath(DecFile, WorkSP)
             else:
                 InstallPath = DecPath
                 DecFileRelaPath = DecFile
@@ -348,8 +348,8 @@ class DependencyRules(object):
                 pass
             DecPath = dirname(DecFile)
             if DecPath.find(WorkSP) > -1:
-                InstallPath = GetRelativePath(DecPath,WorkSP)
-                DecFileRelaPath = GetRelativePath(DecFile,WorkSP)
+                InstallPath = GetRelativePath(DecPath, WorkSP)
+                DecFileRelaPath = GetRelativePath(DecFile, WorkSP)
             else:
                 InstallPath = DecPath
                 DecFileRelaPath = DecFile
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index baf687ef99ba..44187a1ee40f 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -459,7 +459,7 @@ class IpiDatabase(object):
             (select InstallPath from ModInPkgInfo where 
             ModInPkgInfo.PackageGuid ='%s' 
             and ModInPkgInfo.PackageVersion = '%s')""" \
-                            % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1],Pkg[0], Pkg[1])
+                            % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1])
             
             self.Cur.execute(SqlCommand)
         #
@@ -921,7 +921,7 @@ class IpiDatabase(object):
     def __ConvertToSqlString(self, StringList):
         if self.DpTable:
             pass
-        return map(lambda s: s.replace("'", "''") , StringList)
+        return map(lambda s: s.replace("'", "''"), StringList)
 
 
 
diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Source/Python/UPT/Library/String.py
index 2f916324bd13..de3035279f01 100644
--- a/BaseTools/Source/Python/UPT/Library/String.py
+++ b/BaseTools/Source/Python/UPT/Library/String.py
@@ -633,7 +633,7 @@ def SplitString(String):
 # @param StringList:  A list for strings to be converted
 #
 def ConvertToSqlString(StringList):
-    return map(lambda s: s.replace("'", "''") , StringList)
+    return map(lambda s: s.replace("'", "''"), StringList)
 
 ## Convert To Sql String
 #
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 4c28b7f5d22a..1e0c79d6677d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -649,7 +649,7 @@ class DecPomAlignment(PackageObject):
                         ContainerFile,
                         (Item.TokenSpaceGuidCName, Item.TokenCName,
                         Item.DefaultValue, Item.DatumType, Item.TokenValue,
-                        Type, Item.GetHeadComment(), Item.GetTailComment(),''),
+                        Type, Item.GetHeadComment(), Item.GetTailComment(), ''),
                         Language,
                         self.DecParser.GetDefineSectionMacro()
                         )
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 84b3c353201a..12f091dd421b 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -315,7 +315,7 @@ def Main():
         GlobalData.gDB.CloseDb()
 
         if pf.system() == 'Windows':
-            os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\',''))
+            os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\', ''))
 
     return ReturnCode
 
diff --git a/BaseTools/Source/Python/UPT/Xml/CommonXml.py b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
index e28aec5b9b05..498fe938aeab 100644
--- a/BaseTools/Source/Python/UPT/Xml/CommonXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
@@ -355,7 +355,7 @@ class PackageHeaderXml(object):
     def FromXml(self, Item, Key, PackageObject2):
         if not Item:
             XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea']
-            CheckDict = {'PackageHeader':None, }
+            CheckDict = {'PackageHeader': None, }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         self.PackagePath = XmlElement(Item, '%s/PackagePath' % Key)
         self.Header.FromXml(Item, Key)
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
index b4d52f7bdc1f..bd7be102057a 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
@@ -104,7 +104,7 @@ class DistributionPackageXml(object):
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
             else:
                 XmlTreeLevel = ['DistributionPackage', 'DistributionHeader']
-                CheckDict = CheckDict = {'DistributionHeader':'', }
+                CheckDict = CheckDict = {'DistributionHeader': '', }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
@@ -124,16 +124,16 @@ class DistributionPackageXml(object):
             #
             if self.DistP.Tools:
                 XmlTreeLevel = ['DistributionPackage', 'Tools', 'Header']
-                CheckDict = {'Name':self.DistP.Tools.GetName(), }
+                CheckDict = {'Name': self.DistP.Tools.GetName(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
                 if not self.DistP.Tools.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'Tools']
-                    CheckDict = {'FileName':None, }
+                    CheckDict = {'FileName': None, }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 for Item in self.DistP.Tools.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'Tools']
-                    CheckDict = {'FileName':Item.GetURI(), }
+                    CheckDict = {'FileName': Item.GetURI(), }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
@@ -141,16 +141,16 @@ class DistributionPackageXml(object):
             #
             if self.DistP.MiscellaneousFiles:
                 XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles', 'Header']
-                CheckDict = {'Name':self.DistP.MiscellaneousFiles.GetName(), }
+                CheckDict = {'Name': self.DistP.MiscellaneousFiles.GetName(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
                 if not self.DistP.MiscellaneousFiles.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
-                    CheckDict = {'FileName':None, }
+                    CheckDict = {'FileName': None, }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 for Item in self.DistP.MiscellaneousFiles.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
-                    CheckDict = {'FileName':Item.GetURI(), }
+                    CheckDict = {'FileName': Item.GetURI(), }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
@@ -158,7 +158,7 @@ class DistributionPackageXml(object):
             #
             for Item in self.DistP.UserExtensions:
                 XmlTreeLevel = ['DistributionPackage', 'UserExtensions']
-                CheckDict = {'UserId':Item.GetUserID(), }
+                CheckDict = {'UserId': Item.GetUserID(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
 
@@ -450,10 +450,10 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['MiscellaneousFiles']
     for Item in Module.GetMiscFileList():
         if not Item.GetFileList():
-            CheckDict = {'Filename':'', }
+            CheckDict = {'Filename': '', }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         for File in Item.GetFileList():
-            CheckDict = {'Filename':File.GetURI(), }
+            CheckDict = {'Filename': File.GetURI(), }
 
 ## ValidateMS2
 #
@@ -916,10 +916,10 @@ def ValidatePS2(Package):
     XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'MiscellaneousFiles']
     for Item in Package.GetMiscFileList():
         if not Item.GetFileList():
-            CheckDict = {'Filename':'', }
+            CheckDict = {'Filename': '', }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         for File in Item.GetFileList():
-            CheckDict = {'Filename':File.GetURI(), }
+            CheckDict = {'Filename': File.GetURI(), }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
 ## ValidatePackageSurfaceArea
diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/Source/Python/Workspace/DecBuildData.py
index 01f716bfab70..fcaeb329216b 100644
--- a/BaseTools/Source/Python/Workspace/DecBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DecBuildData.py
@@ -365,16 +365,16 @@ class DecBuildData(PackageBuildClassObject):
 
     def ProcessStructurePcd(self, StructurePcdRawDataSet):
         s_pcd_set = dict()
-        for s_pcd,LineNo in StructurePcdRawDataSet:
+        for s_pcd, LineNo in StructurePcdRawDataSet:
             if s_pcd.TokenSpaceGuidCName not in s_pcd_set:
                 s_pcd_set[s_pcd.TokenSpaceGuidCName] = []
-            s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd,LineNo))
+            s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd, LineNo))
 
         str_pcd_set = []
         for pcdname in s_pcd_set:
             dep_pkgs = []
             struct_pcd = StructurePcd()
-            for item,LineNo in s_pcd_set[pcdname]:
+            for item, LineNo in s_pcd_set[pcdname]:
                 if "<HeaderFiles>" in item.TokenCName:
                     struct_pcd.StructuredPcdIncludeFile = item.DefaultValue
                 elif "<Packages>" in item.TokenCName:
@@ -384,7 +384,7 @@ class DecBuildData(PackageBuildClassObject):
                     struct_pcd.TokenValue = struct_pcd.TokenValue.strip("{").strip()
                     struct_pcd.TokenSpaceGuidCName, struct_pcd.TokenCName = pcdname.split(".")
                 else:
-                    struct_pcd.AddDefaultValue(item.TokenCName, item.DefaultValue,self.MetaFile.File,LineNo)
+                    struct_pcd.AddDefaultValue(item.TokenCName, item.DefaultValue, self.MetaFile.File, LineNo)
 
             struct_pcd.PackageDecs = dep_pkgs
 
@@ -407,7 +407,7 @@ class DecBuildData(PackageBuildClassObject):
         StrPcdSet = []
         RecordList = self._RawData[Type, self._Arch]
         for TokenSpaceGuid, PcdCName, Setting, Arch, PrivateFlag, Dummy1, Dummy2 in RecordList:
-            PcdDict[Arch, PcdCName, TokenSpaceGuid] = (Setting,Dummy2)
+            PcdDict[Arch, PcdCName, TokenSpaceGuid] = (Setting, Dummy2)
             if not (PcdCName, TokenSpaceGuid) in PcdSet:
                 PcdSet.append((PcdCName, TokenSpaceGuid))
 
@@ -416,7 +416,7 @@ class DecBuildData(PackageBuildClassObject):
             # limit the ARCH to self._Arch, if no self._Arch found, tdict
             # will automatically turn to 'common' ARCH and try again
             #
-            Setting,LineNo = PcdDict[self._Arch, PcdCName, TokenSpaceGuid]
+            Setting, LineNo = PcdDict[self._Arch, PcdCName, TokenSpaceGuid]
             if Setting == None:
                 continue
 
@@ -438,7 +438,7 @@ class DecBuildData(PackageBuildClassObject):
                                         list(expressions)
                                         )
             if "." in TokenSpaceGuid:
-                StrPcdSet.append((PcdObj,LineNo))
+                StrPcdSet.append((PcdObj, LineNo))
             else:
                 Pcds[PcdCName, TokenSpaceGuid, self._PCD_TYPE_STRING_[Type]] = PcdObj
 
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index cabad879b8d2..5e61110df330 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -592,12 +592,12 @@ class DscBuildData(PlatformBuildClassObject):
                                     File=self.MetaFile, Line=Record[-1])
                 self._SkuIds[Record[1].upper()] = (Record[0], Record[1].upper(), Record[2].upper())
             if 'DEFAULT' not in self._SkuIds:
-                self._SkuIds['DEFAULT'] = ("0","DEFAULT","DEFAULT")
+                self._SkuIds['DEFAULT'] = ("0", "DEFAULT", "DEFAULT")
             if 'COMMON' not in self._SkuIds:
-                self._SkuIds['COMMON'] = ("0","DEFAULT","DEFAULT")
+                self._SkuIds['COMMON'] = ("0", "DEFAULT", "DEFAULT")
         return self._SkuIds
-    def ToInt(self,intstr):
-        return int(intstr,16) if intstr.upper().startswith("0X") else int(intstr)
+    def ToInt(self, intstr):
+        return int(intstr, 16) if intstr.upper().startswith("0X") else int(intstr)
     def _GetDefaultStores(self):
         if self.DefaultStores == None:
             self.DefaultStores = sdict()
@@ -609,9 +609,9 @@ class DscBuildData(PlatformBuildClassObject):
                 if Record[1] in [None, '']:
                     EdkLogger.error('build', FORMAT_INVALID, 'No DefaultStores ID name',
                                     File=self.MetaFile, Line=Record[-1])
-                self.DefaultStores[Record[1].upper()] = (self.ToInt(Record[0]),Record[1].upper())
+                self.DefaultStores[Record[1].upper()] = (self.ToInt(Record[0]), Record[1].upper())
             if TAB_DEFAULT_STORES_DEFAULT not in self.DefaultStores:
-                self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0,TAB_DEFAULT_STORES_DEFAULT)
+                self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0, TAB_DEFAULT_STORES_DEFAULT)
             GlobalData.gDefaultStores = self.DefaultStores.keys()
             if GlobalData.gDefaultStores:
                 GlobalData.gDefaultStores.sort()
@@ -671,7 +671,7 @@ class DscBuildData(PlatformBuildClassObject):
             for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, \
                          MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX]:
                 RecordList = self._RawData[Type, self._Arch, None, ModuleId]
-                for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+                for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                     TokenList = GetSplitValueList(Setting)
                     DefaultValue = TokenList[0]
                     if len(TokenList) > 1:
@@ -695,7 +695,7 @@ class DscBuildData(PlatformBuildClassObject):
 
             # get module private build options
             RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, None, ModuleId]
-            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                 if (ToolChainFamily, ToolChain) not in Module.BuildOptions:
                     Module.BuildOptions[ToolChainFamily, ToolChain] = Option
                 else:
@@ -735,7 +735,7 @@ class DscBuildData(PlatformBuildClassObject):
             RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch, None, -1]
             Macros = self._Macros
             for Record in RecordList:
-                LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy,Dummy, LineNo = Record
+                LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy, Dummy, LineNo = Record
                 if LibraryClass == '' or LibraryClass == 'NULL':
                     self._NullLibraryNumber += 1
                     LibraryClass = 'NULL%d' % self._NullLibraryNumber
@@ -802,7 +802,7 @@ class DscBuildData(PlatformBuildClassObject):
                 ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
                 PkgSet.update(ModuleData.Packages)
 
-            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain,PkgSet)
+            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
 
 
         if (PcdCName, TokenSpaceGuid) not in self._DecPcds:
@@ -851,11 +851,11 @@ class DscBuildData(PlatformBuildClassObject):
                                 ExtraData="%s.%s" % (TokenSpaceGuid, PcdCName))
             if PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
                 if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip() != ValueList[1].strip():
-                    EdkLogger.error('build', FORMAT_INVALID, ErrStr , File=self.MetaFile, Line=LineNo,
+                    EdkLogger.error('build', FORMAT_INVALID, ErrStr, File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
         return ValueList
 
-    def _FilterPcdBySkuUsage(self,Pcds):
+    def _FilterPcdBySkuUsage(self, Pcds):
         available_sku = self.SkuIdMgr.AvailableSkuIdSet
         sku_usage = self.SkuIdMgr.SkuUsageType
         if sku_usage == SkuClass.SINGLE:
@@ -871,7 +871,7 @@ class DscBuildData(PlatformBuildClassObject):
                 if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
                     Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
-    def CompleteHiiPcdsDefaultStores(self,Pcds):
+    def CompleteHiiPcdsDefaultStores(self, Pcds):
         HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
         DefaultStoreMgr = DefaultStore(self.DefaultStores)
         for pcd in HiiPcd:
@@ -903,17 +903,17 @@ class DscBuildData(PlatformBuildClassObject):
             self._Pcds = self._FilterPcdBySkuUsage(self._Pcds)
         return self._Pcds
 
-    def _dumpPcdInfo(self,Pcds):
+    def _dumpPcdInfo(self, Pcds):
         for pcd in Pcds:
             pcdobj = Pcds[pcd]
             if not pcdobj.TokenCName.startswith("Test"):
                 continue
             for skuid in pcdobj.SkuInfoList:
-                if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]):
+                if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]):
                     for storename in pcdobj.SkuInfoList[skuid].DefaultStoreDict:
-                        print("PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename])))
+                        print("PcdCName: %s, SkuName: %s, StoreName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid, storename, str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename])))
                 else:
-                    print("PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuInfoList[skuid].DefaultValue)))
+                    print("PcdCName: %s, SkuName: %s, Value: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid, str(pcdobj.SkuInfoList[skuid].DefaultValue)))
     ## Retrieve [BuildOptions]
     def _GetBuildOptions(self):
         if self._BuildOptions == None:
@@ -923,7 +923,7 @@ class DscBuildData(PlatformBuildClassObject):
             #
             for CodeBase in (EDKII_NAME, EDK_NAME):
                 RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, CodeBase]
-                for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+                for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                     if Dummy3.upper() != 'COMMON':
                         continue
                     CurKey = (ToolChainFamily, ToolChain, CodeBase)
@@ -946,7 +946,7 @@ class DscBuildData(PlatformBuildClassObject):
             DriverType = '%s.%s' % (Edk, ModuleType)
             CommonDriverType = '%s.%s' % ('COMMON', ModuleType)
             RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch]
-            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                 Type = Dummy2 + '.' + Dummy3
                 if Type.upper() == DriverType.upper() or Type.upper() == CommonDriverType.upper():
                     Key = (ToolChainFamily, ToolChain, Edk)
@@ -960,9 +960,9 @@ class DscBuildData(PlatformBuildClassObject):
     def GetStructurePcdInfo(self, PcdSet):
         structure_pcd_data = {}
         for item in PcdSet:
-            if (item[0],item[1]) not in structure_pcd_data:
-                structure_pcd_data[(item[0],item[1])] = []
-            structure_pcd_data[(item[0],item[1])].append(item)
+            if (item[0], item[1]) not in structure_pcd_data:
+                structure_pcd_data[(item[0], item[1])] = []
+            structure_pcd_data[(item[0], item[1])].append(item)
 
         return structure_pcd_data
 
@@ -988,7 +988,7 @@ class DscBuildData(PlatformBuildClassObject):
         for Type in TypeList:
             RecordList.extend(self._RawData[Type, self._Arch])
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_store, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_store, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             default_store = default_store.upper()
             SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
@@ -996,7 +996,7 @@ class DscBuildData(PlatformBuildClassObject):
                 continue
 
             if SkuName in SkuIds and "." in TokenSpaceGuid:
-                S_PcdSet.append(( TokenSpaceGuid.split(".")[0],TokenSpaceGuid.split(".")[1], PcdCName,SkuName, default_store,Dummy5, AnalyzePcdExpression(Setting)[0]))
+                S_PcdSet.append(( TokenSpaceGuid.split(".")[0], TokenSpaceGuid.split(".")[1], PcdCName, SkuName, default_store, Dummy5, AnalyzePcdExpression(Setting)[0]))
 
         # handle pcd value override
         StrPcdSet = self.GetStructurePcdInfo(S_PcdSet)
@@ -1013,12 +1013,12 @@ class DscBuildData(PlatformBuildClassObject):
                         str_pcd_obj_str.DefaultFromDSC = str_pcd_obj.DefaultValue
                 for str_pcd_data in StrPcdSet[str_pcd]:
                     if str_pcd_data[3] in SkuIds:
-                        str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], str(str_pcd_data[6]), 'DEFAULT' if str_pcd_data[3] == 'COMMON' else str_pcd_data[3],'STANDARD' if str_pcd_data[4] == 'COMMON' else str_pcd_data[4], self.MetaFile.File,LineNo=str_pcd_data[5])
+                        str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], str(str_pcd_data[6]), 'DEFAULT' if str_pcd_data[3] == 'COMMON' else str_pcd_data[3], 'STANDARD' if str_pcd_data[4] == 'COMMON' else str_pcd_data[4], self.MetaFile.File, LineNo=str_pcd_data[5])
                 S_pcd_set[str_pcd[1], str_pcd[0]] = str_pcd_obj_str
             else:
                 EdkLogger.error('build', PARSER_ERROR,
                             "Pcd (%s.%s) defined in DSC is not declared in DEC files. Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch),
-                            File=self.MetaFile,Line = StrPcdSet[str_pcd][0][5])
+                            File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
         # Add the Structure PCD that only defined in DEC, don't have override in DSC file
         for Pcd in self.DecPcds:
             if type (self._DecPcds[Pcd]) is StructurePcd:
@@ -1066,7 +1066,7 @@ class DscBuildData(PlatformBuildClassObject):
 
         Str_Pcd_Values = self.GenerateByteArrayValue(S_pcd_set)
         if Str_Pcd_Values:
-            for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Values:
+            for (skuname, StoreName, PcdGuid, PcdName, PcdValue) in Str_Pcd_Values:
                 str_pcd_obj = S_pcd_set.get((PcdName, PcdGuid))
                 if str_pcd_obj is None:
                     print(PcdName, PcdGuid)
@@ -1118,7 +1118,7 @@ class DscBuildData(PlatformBuildClassObject):
                 elif 'DEFAULT' in pcd.SkuInfoList.keys() and 'COMMON' in pcd.SkuInfoList.keys():
                     del(pcd.SkuInfoList['COMMON'])
 
-        map(self.FilterSkuSettings,[Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType])
+        map(self.FilterSkuSettings, [Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType])
         return Pcds
 
     ## Retrieve non-dynamic PCD settings
@@ -1140,7 +1140,7 @@ class DscBuildData(PlatformBuildClassObject):
         # Find out all possible PCD candidates for self._Arch
         RecordList = self._RawData[Type, self._Arch]
         PcdValueDict = sdict()
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
             if SkuName not in AvailableSkuIdSet:
@@ -1191,7 +1191,7 @@ class DscBuildData(PlatformBuildClassObject):
 
         return Pcds
 
-    def __UNICODE2OCTList(self,Value):
+    def __UNICODE2OCTList(self, Value):
         Value = Value.strip()
         Value = Value[2:-1]
         List = []
@@ -1202,7 +1202,7 @@ class DscBuildData(PlatformBuildClassObject):
         List.append('0x00')
         List.append('0x00')
         return List
-    def __STRING2OCTList(self,Value):
+    def __STRING2OCTList(self, Value):
         OCTList = []
         Value = Value.strip('"')
         for char in Value:
@@ -1395,7 +1395,7 @@ class DscBuildData(PlatformBuildClassObject):
                             CApp = CApp + '  Pcd->%s = %d; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
             for skuname in self.SkuIdMgr.GetSkuChain(SkuName):
                 inherit_OverrideValues = Pcd.SkuOverrideValues[skuname]
-                for FieldList in [Pcd.DefaultFromDSC,inherit_OverrideValues.get(DefaultStoreName)]:
+                for FieldList in [Pcd.DefaultFromDSC, inherit_OverrideValues.get(DefaultStoreName)]:
                     if not FieldList:
                         continue
                     if Pcd.DefaultFromDSC and FieldList == Pcd.DefaultFromDSC:
@@ -1614,7 +1614,7 @@ class DscBuildData(PlatformBuildClassObject):
                 File.close()
                 error_line = FileData[int (FileLine) - 1]
                 if r"//" in error_line:
-                    c_line,dsc_line = error_line.split(r"//")
+                    c_line, dsc_line = error_line.split(r"//")
                 else:
                     dsc_line = error_line
 
@@ -1641,7 +1641,7 @@ class DscBuildData(PlatformBuildClassObject):
         for Pcd in FileBuffer:
             PcdValue = Pcd.split ('|')
             PcdInfo = PcdValue[0].split ('.')
-            StructurePcdSet.append((PcdInfo[0],PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
+            StructurePcdSet.append((PcdInfo[0], PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
         return StructurePcdSet
 
     ## Retrieve dynamic PCD settings
@@ -1665,7 +1665,7 @@ class DscBuildData(PlatformBuildClassObject):
         AvailableSkuIdSet = copy.copy(self.SkuIds)
 
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
             if SkuName not in AvailableSkuIdSet:
@@ -1727,7 +1727,7 @@ class DscBuildData(PlatformBuildClassObject):
             elif 'DEFAULT' in pcd.SkuInfoList.keys() and 'COMMON' in pcd.SkuInfoList.keys():
                 del(pcd.SkuInfoList['COMMON'])
 
-        map(self.FilterSkuSettings,Pcds.values())
+        map(self.FilterSkuSettings, Pcds.values())
 
         return Pcds
 
@@ -1757,10 +1757,10 @@ class DscBuildData(PlatformBuildClassObject):
             return True
         else:
             return False
-    def CompletePcdValues(self,PcdSet):
+    def CompletePcdValues(self, PcdSet):
         Pcds = {}
         DefaultStoreObj = DefaultStore(self._GetDefaultStores())
-        SkuIds = {skuname:skuid for skuname,skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname !='COMMON'}
+        SkuIds = {skuname:skuid for skuname, skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname !='COMMON'}
         DefaultStores = set([storename for pcdobj in PcdSet.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict.keys()])
         for PcdCName, TokenSpaceGuid in PcdSet:
             PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
@@ -1781,7 +1781,7 @@ class DscBuildData(PlatformBuildClassObject):
                         if defaultstorename not in skuobj.DefaultStoreDict:
                             skuobj.DefaultStoreDict[defaultstorename] = copy.deepcopy(skuobj.DefaultStoreDict[mindefaultstorename])
                     skuobj.HiiDefaultValue = skuobj.DefaultStoreDict[mindefaultstorename]
-            for skuname,skuid in SkuIds.items():
+            for skuname, skuid in SkuIds.items():
                 if skuname not in PcdObj.SkuInfoList:
                     nextskuid = self.SkuIdMgr.GetNextSkuId(skuname)
                     while nextskuid not in PcdObj.SkuInfoList:
@@ -1815,7 +1815,7 @@ class DscBuildData(PlatformBuildClassObject):
         AvailableSkuIdSet = copy.copy(self.SkuIds)
         DefaultStoresDefine = self._GetDefaultStores()
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
             DefaultStore = DefaultStore.upper()
@@ -1828,14 +1828,14 @@ class DscBuildData(PlatformBuildClassObject):
                 EdkLogger.error('build', PARAMETER_INVALID, 'DefaultStores %s is not defined in [DefaultStores] section' % DefaultStore,
                                             File=self.MetaFile, Line=Dummy5)
             if "." not in TokenSpaceGuid:
-                PcdSet.add((PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy4))
-            PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore] = Setting
+                PcdSet.add((PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4))
+            PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore] = Setting
 
 
         # Remove redundant PCD candidates, per the ARCH and SKU
-        for PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy4 in PcdSet:
+        for PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4 in PcdSet:
 
-            Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore]
+            Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore]
             if Setting == None:
                 continue
             VariableName, VariableGuid, VariableOffset, DefaultValue, VarAttribute = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
@@ -1879,10 +1879,10 @@ class DscBuildData(PlatformBuildClassObject):
                     Skuitem = pcdObject.SkuInfoList[SkuName]
                     Skuitem.DefaultStoreDict.update({DefaultStore:DefaultValue})
                 else:
-                    SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute,DefaultStore={DefaultStore:DefaultValue})
+                    SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
                     pcdObject.SkuInfoList[SkuName] = SkuInfo
             else:
-                SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute,DefaultStore={DefaultStore:DefaultValue})
+                SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
                 Pcds[PcdCName, TokenSpaceGuid] = PcdClassObject(
                                                 PcdCName,
                                                 TokenSpaceGuid,
@@ -1909,7 +1909,7 @@ class DscBuildData(PlatformBuildClassObject):
                     sku.HiiDefaultValue = pcdDecObject.DefaultValue
             if 'DEFAULT' not in pcd.SkuInfoList.keys() and 'COMMON' not in pcd.SkuInfoList.keys():
                 valuefromDec = pcdDecObject.DefaultValue
-                SkuInfo = SkuInfoClass('DEFAULT', '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec,VariableAttribute=SkuInfoObj.VariableAttribute,DefaultStore={DefaultStore:valuefromDec})
+                SkuInfo = SkuInfoClass('DEFAULT', '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec, VariableAttribute=SkuInfoObj.VariableAttribute, DefaultStore={DefaultStore:valuefromDec})
                 pcd.SkuInfoList['DEFAULT'] = SkuInfo
             elif 'DEFAULT' not in pcd.SkuInfoList.keys() and 'COMMON' in pcd.SkuInfoList.keys():
                 pcd.SkuInfoList['DEFAULT'] = pcd.SkuInfoList['COMMON']
@@ -1937,19 +1937,19 @@ class DscBuildData(PlatformBuildClassObject):
             invalidpcd = ",".join(invalidhii)
             EdkLogger.error('build', PCD_VARIABLE_INFO_ERROR, Message='The same HII PCD must map to the same EFI variable for all SKUs', File=self.MetaFile, ExtraData=invalidpcd)
 
-        map(self.FilterSkuSettings,Pcds.values())
+        map(self.FilterSkuSettings, Pcds.values())
 
         return Pcds
 
-    def CheckVariableNameAssignment(self,Pcds):
+    def CheckVariableNameAssignment(self, Pcds):
         invalidhii = []
         for pcdname in Pcds:
             pcd = Pcds[pcdname]
-            varnameset = set([sku.VariableName for (skuid,sku) in pcd.SkuInfoList.items()])
+            varnameset = set([sku.VariableName for (skuid, sku) in pcd.SkuInfoList.items()])
             if len(varnameset) > 1:
-                invalidhii.append(".".join((pcdname[1],pcdname[0])))
+                invalidhii.append(".".join((pcdname[1], pcdname[0])))
         if len(invalidhii):
-            return False,invalidhii
+            return False, invalidhii
         else:
             return True, []
     ## Retrieve dynamic VPD PCD settings
@@ -1973,7 +1973,7 @@ class DscBuildData(PlatformBuildClassObject):
         RecordList = self._RawData[Type, self._Arch]
         AvailableSkuIdSet = copy.copy(self.SkuIds)
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = 'DEFAULT' if SkuName == 'COMMON' else SkuName
             if SkuName not in AvailableSkuIdSet:
@@ -2040,7 +2040,7 @@ class DscBuildData(PlatformBuildClassObject):
                 del(pcd.SkuInfoList['COMMON'])
 
 
-        map(self.FilterSkuSettings,Pcds.values())
+        map(self.FilterSkuSettings, Pcds.values())
         return Pcds
 
     ## Add external modules
@@ -2105,7 +2105,7 @@ class DscBuildData(PlatformBuildClassObject):
                     continue
                 ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
                 PkgSet.update(ModuleData.Packages)
-            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain,PkgSet)
+            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
         return self._DecPcds
     _Macros             = property(_GetMacros)
     Arch                = property(_GetArch, _SetArch)
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 5128dc2a6d2f..c85e3fe08649 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -299,7 +299,7 @@ class MetaFileParser(object):
         for Item in GetSplitValueList(self._CurrentLine[1:-1], TAB_COMMA_SPLIT):
             if Item == '':
                 continue
-            ItemList = GetSplitValueList(Item, TAB_SPLIT,3)
+            ItemList = GetSplitValueList(Item, TAB_SPLIT, 3)
             # different section should not mix in one section
             if self._SectionName != '' and self._SectionName != ItemList[0].upper():
                 EdkLogger.error('Parser', FORMAT_INVALID, "Different section names in the same section",
@@ -417,7 +417,7 @@ class MetaFileParser(object):
 
     ## Construct section Macro dict 
     def _ConstructSectionMacroDict(self, Name, Value):
-        ScopeKey = [(Scope[0], Scope[1],Scope[2]) for Scope in self._Scope]
+        ScopeKey = [(Scope[0], Scope[1], Scope[2]) for Scope in self._Scope]
         ScopeKey = tuple(ScopeKey)
         SectionDictKey = self._SectionType, ScopeKey
         #
@@ -449,20 +449,20 @@ class MetaFileParser(object):
                 continue
 
             for ActiveScope in self._Scope:
-                Scope0, Scope1 ,Scope2= ActiveScope[0], ActiveScope[1],ActiveScope[2]
-                if(Scope0, Scope1,Scope2) not in Scope:
+                Scope0, Scope1, Scope2= ActiveScope[0], ActiveScope[1], ActiveScope[2]
+                if(Scope0, Scope1, Scope2) not in Scope:
                     break
             else:
                 SpeSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
 
             for ActiveScope in self._Scope:
-                Scope0, Scope1,Scope2 = ActiveScope[0], ActiveScope[1],ActiveScope[2]
-                if(Scope0, Scope1,Scope2) not in Scope and (Scope0, "COMMON","COMMON") not in Scope and ("COMMON", Scope1,"COMMON") not in Scope:
+                Scope0, Scope1, Scope2 = ActiveScope[0], ActiveScope[1], ActiveScope[2]
+                if(Scope0, Scope1, Scope2) not in Scope and (Scope0, "COMMON", "COMMON") not in Scope and ("COMMON", Scope1, "COMMON") not in Scope:
                     break
             else:
                 ComSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
 
-            if ("COMMON", "COMMON","COMMON") in Scope:
+            if ("COMMON", "COMMON", "COMMON") in Scope:
                 ComComMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
 
         Macros.update(ComComMacroDict)
@@ -634,7 +634,7 @@ class InfParser(MetaFileParser):
             # Model, Value1, Value2, Value3, Arch, Platform, BelongsToItem=-1,
             # LineBegin=-1, ColumnBegin=-1, LineEnd=-1, ColumnEnd=-1, Enabled=-1
             #
-            for Arch, Platform,_ in self._Scope:
+            for Arch, Platform, _ in self._Scope:
                 LastItem = self._Store(self._SectionType,
                             self._ValueList[0],
                             self._ValueList[1],
@@ -944,7 +944,7 @@ class DscParser(MetaFileParser):
                 self._DirectiveParser()
                 continue
             if Line[0] == TAB_OPTION_START and not self._InSubsection:
-                EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1),ExtraData=self.MetaFile)
+                EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1), ExtraData=self.MetaFile)
 
             if self._InSubsection:
                 SectionType = self._SubsectionType
@@ -1024,7 +1024,7 @@ class DscParser(MetaFileParser):
                             ExtraData=self._CurrentLine)
 
         ItemType = self.DataType[DirectiveName]
-        Scope = [['COMMON', 'COMMON','COMMON']]
+        Scope = [['COMMON', 'COMMON', 'COMMON']]
         if ItemType == MODEL_META_DATA_INCLUDE:
             Scope = self._Scope
         if ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF:
@@ -1099,7 +1099,7 @@ class DscParser(MetaFileParser):
     @ParseMacro
     def _SkuIdParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
-        if len(TokenList) not in (2,3):
+        if len(TokenList) not in (2, 3):
             EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Integer>|<UiName>[|<UiName>]'",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
@@ -1159,7 +1159,7 @@ class DscParser(MetaFileParser):
 
         # Validate the datum type of Dynamic Defaul PCD and DynamicEx Default PCD
         ValueList = GetSplitValueList(self._ValueList[2])
-        if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64] \
+        if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64] \
                               and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
             EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s' of PCD is wrong" % ValueList[1],
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
@@ -1167,7 +1167,7 @@ class DscParser(MetaFileParser):
         # Validate the VariableName of DynamicHii and DynamicExHii for PCD Entry must not be an empty string
         if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII]:
             DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-            if len(DscPcdValueList[0].replace('L','').replace('"','').strip()) == 0:
+            if len(DscPcdValueList[0].replace('L', '').replace('"', '').strip()) == 0:
                 EdkLogger.error('Parser', FORMAT_INVALID, "The VariableName field in the HII format PCD entry must not be an empty string",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
@@ -1296,7 +1296,7 @@ class DscParser(MetaFileParser):
         self._ContentIndex = 0
         self._InSubsection = False
         while self._ContentIndex < len(self._Content) :
-            Id, self._ItemType, V1, V2, V3, S1, S2, S3,Owner, self._From, \
+            Id, self._ItemType, V1, V2, V3, S1, S2, S3, Owner, self._From, \
                 LineStart, ColStart, LineEnd, ColEnd, Enabled = self._Content[self._ContentIndex]
 
             if self._From < 0:
@@ -1314,8 +1314,8 @@ class DscParser(MetaFileParser):
                     break
                 Record = self._Content[self._ContentIndex]
                 if LineStart == Record[10] and LineEnd == Record[12]:
-                    if [Record[5], Record[6],Record[7]] not in self._Scope:
-                        self._Scope.append([Record[5], Record[6],Record[7]])
+                    if [Record[5], Record[6], Record[7]] not in self._Scope:
+                        self._Scope.append([Record[5], Record[6], Record[7]])
                     self._ContentIndex += 1
                 else:
                     break
@@ -1404,7 +1404,7 @@ class DscParser(MetaFileParser):
                         MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_EX_HII,
                         MODEL_PCD_DYNAMIC_EX_VPD):
             Records = self._RawTable.Query(PcdType, BelongsToItem= -1.0)
-            for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4,ID, Line in Records:
+            for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4, ID, Line in Records:
                 Name = TokenSpaceGuid + '.' + PcdName
                 if Name not in GlobalData.gPlatformOtherPcds:
                     PcdLine = Line
@@ -1776,7 +1776,7 @@ class DecParser(MetaFileParser):
         if self._DefinesCount > 1:
             EdkLogger.error('Parser', FORMAT_INVALID, 'Multiple [Defines] section is exist.', self.MetaFile )
         if self._DefinesCount == 0:
-            EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.',self.MetaFile)
+            EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.', self.MetaFile)
         self._Done()
 
 
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index 92fcf6dd2b22..9416065b284f 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -258,8 +258,8 @@ class PackageTable(MetaFileTable):
                 ValidType = "@ValidList"
             if oricomment.startswith("@Expression"):
                 ValidType = "@Expression"
-            EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s of PCD %s.%s is incorrect" % (ValidType,TokenSpaceGuid, PcdCName),
-                            ExtraData=oricomment,File=self.MetaFile, Line=LineNum)
+            EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s of PCD %s.%s is incorrect" % (ValidType, TokenSpaceGuid, PcdCName),
+                            ExtraData=oricomment, File=self.MetaFile, Line=LineNum)
             return set(), set(), set()
         return set(validateranges), set(validlists), set(expressions)
 ## Python class representation of table storing platform data
@@ -308,7 +308,7 @@ class PlatformTable(MetaFileTable):
     #
     def Insert(self, Model, Value1, Value2, Value3, Scope1='COMMON', Scope2='COMMON', Scope3=TAB_DEFAULT_STORES_DEFAULT,BelongsToItem=-1,
                FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
-        (Value1, Value2, Value3, Scope1, Scope2,Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2,Scope3))
+        (Value1, Value2, Value3, Scope1, Scope2, Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2, Scope3))
         return Table.Insert(
                         self, 
                         Model, 
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index c760e57b8f64..6b5e0edb0a4d 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -45,7 +45,7 @@ def GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain):
 #  @retval: A dictionary contains instances of PcdClassObject with key (PcdCName, TokenSpaceGuid)
 #  @retval: A dictionary contains real GUIDs of TokenSpaceGuid
 #
-def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain,additionalPkgs):
+def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalPkgs):
     PkgList = GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain)
     PkgList = set(PkgList)
     PkgList |= additionalPkgs
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index c3bfecf8cc66..3db1719c7769 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1208,16 +1208,16 @@ class PcdReport(object):
                     else:
                         if IsByteArray:
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', "{"))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
                             for Array in ArrayList:
                                 FileWrite(File, '%s' % (Array))
                         else:
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', Value))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
                     if TypeName in ('DYNVPD', 'DEXVPD'):
                         FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.VpdOffset))
                     if IsStructure:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 216a25446f23..c64e8f265a97 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -832,7 +832,7 @@ class Build():
         self.HashSkipModules = []
         self.Db_Flag = False
         self.LaunchPrebuildFlag = False
-        self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory,'.cache', '.PlatformBuild')
+        self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory, '.cache', '.PlatformBuild')
         if BuildOptions.CommandLength:
             GlobalData.gCommandMaxLength = BuildOptions.CommandLength
 
@@ -1125,7 +1125,7 @@ class Build():
             # and preserve them for the rest of the main build step, because the child process environment will
             # evaporate as soon as it exits, we cannot get it in build step.
             #
-            PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory,'.cache','.PrebuildEnv')
+            PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory, '.cache', '.PrebuildEnv')
             if os.path.isfile(PrebuildEnvFile):
                 os.remove(PrebuildEnvFile)
             if os.path.isfile(self.PlatformBuildPath):
@@ -1165,7 +1165,7 @@ class Build():
                 f = open(PrebuildEnvFile)
                 envs = f.readlines()
                 f.close()
-                envs = itertools.imap(lambda l: l.split('=',1), envs)
+                envs = itertools.imap(lambda l: l.split('=', 1), envs)
                 envs = itertools.ifilter(lambda l: len(l) == 2, envs)
                 envs = itertools.imap(lambda l: [i.strip() for i in l], envs)
                 os.environ.update(dict(envs))
@@ -2346,7 +2346,7 @@ def MyOptionParser():
     Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
 
     Parser.add_option("-y", "--report-file", action="store", dest="ReportFile", help="Create/overwrite the report to the specified filename.")
-    Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD','LIBRARY','FLASH','DEPEX','BUILD_FLAGS','FIXED_ADDRESS','HASH','EXECUTION_ORDER'], dest="ReportType", default=[],
+    Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED_ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest="ReportType", default=[],
         help="Flags that control the type of build report to generate.  Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER].  "\
              "To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
     Parser.add_option("-F", "--flag", action="store", type="string", dest="Flag",
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 1cf2ce13be2b..1eafecefbacd 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -161,7 +161,7 @@ class BaseToolsTest(unittest.TestCase):
         if minlen is None: minlen = 1024
         if maxlen is None: maxlen = minlen
         return ''.join(
-            [chr(random.randint(0,255))
+            [chr(random.randint(0, 255))
              for x in range(random.randint(minlen, maxlen))
             ])
 
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 49ff656c066f..3bf524123d0f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -187,7 +187,7 @@ class Config:
         return path
 
     def MakeDirs(self):
-        for path in (self.src_dir, self.build_dir,self.prefix, self.symlinks):
+        for path in (self.src_dir, self.build_dir, self.prefix, self.symlinks):
             if not os.path.exists(path):
                 os.makedirs(path)
 
-- 
2.15.1




* [PATCH 12/15] BaseTools: Migrate to the new octal literal
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (10 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 11/15] BaseTools: Adjust the spaces around commas and colons Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 13/15] BaseTools: Unify long int and int in python scripts Gary Lin
                   ` (3 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Change the octal literals to the new 0o syntax according to PEP 3127:
https://www.python.org/dev/peps/pep-3127/
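
For reference, a minimal sketch (illustrative only, not code from the
tree) of the old and new spellings; the 0o form is accepted by both
python 2.6+ and python3:

    # mode = 0777      <- python2-only spelling, a SyntaxError on python3
    mode = 0o777       # PEP 3127 spelling, works on python 2.6+ and python3
    assert mode == 511 # same value, only the literal syntax changes
    print(oct(mode))   # '0777' on python2, '0o777' on python3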

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/Common/LongFilePathOs.py | 2 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py     | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/BaseTools/Source/Python/Common/LongFilePathOs.py b/BaseTools/Source/Python/Common/LongFilePathOs.py
index 2e530f9dd774..47d63faeb995 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOs.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOs.py
@@ -33,7 +33,7 @@ def rmdir(path):
 def mkdir(path):
     return os.mkdir(LongFilePath(path))
 
-def makedirs(name, mode=0777):
+def makedirs(name, mode=0o777):
     return os.makedirs(LongFilePath(name), mode)
 
 def rename(old, new):
diff --git a/BaseTools/Source/Python/UPT/Core/FileHook.py b/BaseTools/Source/Python/UPT/Core/FileHook.py
index d8736a872366..67e86f4f7454 100644
--- a/BaseTools/Source/Python/UPT/Core/FileHook.py
+++ b/BaseTools/Source/Python/UPT/Core/FileHook.py
@@ -166,7 +166,7 @@ def _hookrm(path):
     else:
         __built_in_remove__(path)
 
-def _hookmkdir(path, mode=0777):
+def _hookmkdir(path, mode=0o777):
     if GlobalData.gRECOVERMGR:
         GlobalData.gRECOVERMGR.bkmkdir(path, mode)
     else:
-- 
2.15.1




* [PATCH 13/15] BaseTools: Unify long int and int in python scripts
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (11 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 12/15] BaseTools: Migrate to the new octal literal Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 14/15] BaseTools: Adjust old python2 idioms Gary Lin
                   ` (2 subsequent siblings)
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

According to PEP 237, long int and int are unified, so the separate
checks for the long type can be dropped.
https://www.python.org/dev/peps/pep-0237/
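
A minimal python3 sketch (illustrative only, not code from the tree) of
why the long-specific branch becomes redundant; python3 has a single
arbitrary-precision int type, so checking against type(0) alone covers
every integer value:

    Big = 2 ** 70     # would have been a long on python2
    Small = 42
    assert type(Big) == type(0) == type(Small)
    # The removed python2-only check would have read: type(Big) == type(0L)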

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/Common/Expression.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 7663df7160c1..d7714e54d47e 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -179,7 +179,6 @@ class ValueExpression(object):
                 Oprand2 = IntToStr(Oprand2)
         TypeDict = {
             type(0)  : 0,
-            type(0L) : 0,
             type('') : 1,
             type(True) : 2
         }
@@ -819,7 +818,7 @@ class ValueExpressionEx(ValueExpression):
                     else:
                         ListItem = PcdValue.split(',')
 
-                    if type(ListItem) == type(0) or type(ListItem) == type(0L):
+                    if type(ListItem) == type(0):
                         for Index in range(0, Size):
                             ValueStr += '0x%02X' % (int(ListItem) & 255)
                             ListItem >>= 8
-- 
2.15.1




* [PATCH 14/15] BaseTools: Adjust old python2 idioms
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (12 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 13/15] BaseTools: Unify long int and int in python scripts Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-19  4:43 ` [PATCH 15/15] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
  2018-01-25 13:37 ` [PATCH 00/15] BaseTools: One step toward python3 Zhu, Yonghong
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Based on "futurize -f lib2to3.fixes.fix_idioms"; a combined before/after
sketch follows the list below.

* Change some type comparisons to isinstance() calls:
    type(x) == T -> isinstance(x, T)
    type(x) is T -> isinstance(x, T)
    type(x) != T -> not isinstance(x, T)
    type(x) is not T -> not isinstance(x, T)

* Change "while 1:" into "while True:".

* Change both

    v = list(EXPR)
    v.sort()
    foo(v)

and the more general

    v = EXPR
    v.sort()
    foo(v)

into

    v = sorted(EXPR)
    foo(v)
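
Putting the three fixes together, a combined before/after sketch
(illustrative only; the function and variable names are made up, not
taken from the tree):

    # Before (python2 idioms):
    def DumpSorted(Entries):
        if type(Entries) != type([]):
            Entries = [Entries]
        Values = list(Entries)
        Values.sort()
        while 1:
            if not Values:
                break
            print(Values.pop(0))

    # After "futurize -f lib2to3.fixes.fix_idioms":
    def DumpSorted(Entries):
        if not isinstance(Entries, type([])):
            Entries = [Entries]
        Values = sorted(Entries)
        while True:
            if not Values:
                break
            print(Values.pop(0))

    DumpSorted(['IA32', 'X64', 'AARCH64'])   # prints AARCH64, IA32, X64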

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/MemoryProfileSymbolGen.py                     |  2 +-
 BaseTools/Scripts/UpdateBuildVersions.py                        |  6 +--
 BaseTools/Source/Python/AutoGen/AutoGen.py                      |  5 +--
 BaseTools/Source/Python/AutoGen/BuildEngine.py                  |  2 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                     |  2 +-
 BaseTools/Source/Python/Common/Dictionary.py                    |  2 +-
 BaseTools/Source/Python/Common/Expression.py                    | 46 ++++++++++----------
 BaseTools/Source/Python/Common/Misc.py                          | 13 +++---
 BaseTools/Source/Python/Common/RangeExpression.py               | 16 +++----
 BaseTools/Source/Python/Common/String.py                        |  4 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py          |  2 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py            |  2 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                   |  3 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 12 ++---
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                  |  2 +-
 BaseTools/Source/Python/Eot/Parser.py                           |  2 +-
 BaseTools/Source/Python/GenFds/GenFds.py                        |  4 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                |  2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py           | 15 +++----
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py           | 21 +++------
 BaseTools/Source/Python/UPT/Library/Misc.py                     |  6 +--
 BaseTools/Source/Python/UPT/Library/ParserValidate.py           |  2 +-
 BaseTools/Source/Python/UPT/Library/String.py                   |  2 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py          |  2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py       |  3 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py       |  3 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py   |  3 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py           |  2 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               |  6 +--
 BaseTools/Source/Python/Workspace/MetaFileParser.py             | 18 ++++----
 BaseTools/Source/Python/build/BuildReport.py                    |  3 +-
 BaseTools/Source/Python/build/build.py                          |  4 +-
 BaseTools/gcc/mingw-gcc-build.py                                |  4 +-
 33 files changed, 101 insertions(+), 120 deletions(-)

diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index c9158800668d..b98f6dccea08 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -263,7 +263,7 @@ def main():
         return 1
 
     try:
-        while 1:
+        while True:
             line = file.readline()
             if not line:
                 break
diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index cff2e2263a8a..5725be57562f 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -253,7 +253,7 @@ def GetSvnRevision(opts):
     StatusCmd = "svn st -v --depth infinity --non-interactive"
     contents = ShellCommandResults(StatusCmd, opts)
     os.chdir(Cwd)
-    if type(contents) is ListType:
+    if isinstance(contents, ListType):
         for line in contents:
             if line.startswith("M "):
                 Modified = True
@@ -263,7 +263,7 @@ def GetSvnRevision(opts):
     InfoCmd = "svn info %s" % SrcPath.replace("\\", "/").strip()
     Revision = 0
     contents = ShellCommandResults(InfoCmd, opts)
-    if type(contents) is IntType:
+    if isinstance(contents, IntType):
         return 0, Modified
     for line in contents:
         line = line.strip()
@@ -284,7 +284,7 @@ def CheckSvn(opts):
     VerCmd = "svn --version"
     contents = ShellCommandResults(VerCmd, opts)
     opts.silent = OriginalSilent
-    if type(contents) is IntType:
+    if isinstance(contents, IntType):
         if opts.verbose:
             sys.stdout.write("SVN does not appear to be available.\n")
             sys.stdout.flush()
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index e8914df7310c..8b91904a289a 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -405,7 +405,7 @@ class WorkspaceAutoGen(AutoGen):
             PGen = PlatformAutoGen(self, self.MetaFile, Target, Toolchain, Arch)
             if GlobalData.BuildOptionPcd:
                 for i, pcd in enumerate(GlobalData.BuildOptionPcd):
-                    if type(pcd) is tuple:
+                    if isinstance(pcd, tuple):
                         continue
                     (pcdname, pcdvalue) = pcd.split('=')
                     if not pcdvalue:
@@ -1673,8 +1673,7 @@ class PlatformAutoGen(AutoGen):
                         PcdNvStoreDfBuffer.SkuInfoList[skuname].DefaultValue = vardump
                         PcdNvStoreDfBuffer.MaxDatumSize = str(len(vardump.split(",")))
 
-            PlatformPcds = self._PlatformPcds.keys()
-            PlatformPcds.sort()
+            PlatformPcds = sorted(self._PlatformPcds.keys())
             #
             # Add VPD type PCD into VpdFile and determine whether the VPD PCD need to be fixed up.
             #
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index e8f6788cdc40..6daff7210a37 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -80,7 +80,7 @@ class TargetDescBlock(object):
         return hash(self.Target.Path)
 
     def __eq__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             return Other.Target.Path == self.Target.Path
         else:
             return str(Other) == self.Target.Path
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 98a43db7a4e5..0f6a1700f541 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -143,7 +143,7 @@ class DependencyExpression:
     def __init__(self, Expression, ModuleType, Optimize=False):
         self.ModuleType = ModuleType
         self.Phase = gType2Phase[ModuleType]
-        if type(Expression) == type([]):
+        if isinstance(Expression, type([])):
             self.ExpressionString = " ".join(Expression)
             self.TokenList = Expression
         else:
diff --git a/BaseTools/Source/Python/Common/Dictionary.py b/BaseTools/Source/Python/Common/Dictionary.py
index 5f2cc8f31ffa..c381995f97ff 100644
--- a/BaseTools/Source/Python/Common/Dictionary.py
+++ b/BaseTools/Source/Python/Common/Dictionary.py
@@ -69,7 +69,7 @@ def printDict(Dict):
 # @param key:   The key of the item to be printed
 #
 def printList(Key, List):
-    if type(List) == type([]):
+    if isinstance(List, type([])):
         if len(List) > 0:
             if Key.find(TAB_SPLIT) != -1:
                 print("\n" + Key)
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index d7714e54d47e..a1b4e9ce4d73 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -159,23 +159,23 @@ class ValueExpression(object):
     def Eval(Operator, Oprand1, Oprand2 = None):
         WrnExp = None
 
-        if Operator not in ["in", "not in"] and (type(Oprand1) == type('') or type(Oprand2) == type('')):
-            if type(Oprand1) == type(''):
+        if Operator not in ["in", "not in"] and (isinstance(Oprand1, type('')) or isinstance(Oprand2, type(''))):
+            if isinstance(Oprand1, type('')):
                 if Oprand1[0] in ['"', "'"] or Oprand1.startswith('L"') or Oprand1.startswith("L'")or Oprand1.startswith('UINT'):
                     Oprand1, Size = ParseFieldValue(Oprand1)
                 else:
                     Oprand1, Size = ParseFieldValue('"' + Oprand1 + '"')
-            if type(Oprand2) == type(''):
+            if isinstance(Oprand2, type('')):
                 if Oprand2[0] in ['"', "'"] or Oprand2.startswith('L"') or Oprand2.startswith("L'") or Oprand2.startswith('UINT'):
                     Oprand2, Size = ParseFieldValue(Oprand2)
                 else:
                     Oprand2, Size = ParseFieldValue('"' + Oprand2 + '"')
-            if type(Oprand1) == type('') or type(Oprand2) == type(''):
+            if isinstance(Oprand1, type('')) or isinstance(Oprand2, type('')):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
         if Operator in ['in', 'not in']:
-            if type(Oprand1) != type(''):
+            if not isinstance(Oprand1, type('')):
                 Oprand1 = IntToStr(Oprand1)
-            if type(Oprand2) != type(''):
+            if not isinstance(Oprand2, type('')):
                 Oprand2 = IntToStr(Oprand2)
         TypeDict = {
             type(0)  : 0,
@@ -185,18 +185,18 @@ class ValueExpression(object):
 
         EvalStr = ''
         if Operator in ["!", "NOT", "not"]:
-            if type(Oprand1) == type(''):
+            if isinstance(Oprand1, type('')):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
             EvalStr = 'not Oprand1'
         elif Operator in ["~"]:
-            if type(Oprand1) == type(''):
+            if isinstance(Oprand1, type('')):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
             EvalStr = '~ Oprand1'
         else:
             if Operator in ["+", "-"] and (type(True) in [type(Oprand1), type(Oprand2)]):
                 # Boolean in '+'/'-' will be evaluated but raise warning
                 WrnExp = WrnExpression(WRN_BOOL_EXPR)
-            elif type('') in [type(Oprand1), type(Oprand2)] and type(Oprand1)!= type(Oprand2):
+            elif type('') in [type(Oprand1), type(Oprand2)] and not isinstance(Oprand1, type(Oprand2)):
                 # == between string and number/boolean will always return False, != return True
                 if Operator == "==":
                     WrnExp = WrnExpression(WRN_EQCMP_STR_OTHERS)
@@ -217,11 +217,11 @@ class ValueExpression(object):
                     pass
                 else:
                     raise BadExpression(ERR_EXPR_TYPE)
-            if type(Oprand1) == type('') and type(Oprand2) == type(''):
+            if isinstance(Oprand1, type('')) and isinstance(Oprand2, type('')):
                 if (Oprand1.startswith('L"') and not Oprand2.startswith('L"')) or \
                     (not Oprand1.startswith('L"') and Oprand2.startswith('L"')):
                     raise BadExpression(ERR_STRING_CMP % (Oprand1, Operator, Oprand2))
-            if 'in' in Operator and type(Oprand2) == type(''):
+            if 'in' in Operator and isinstance(Oprand2, type('')):
                 Oprand2 = Oprand2.split()
             EvalStr = 'Oprand1 ' + Operator + ' Oprand2'
 
@@ -248,7 +248,7 @@ class ValueExpression(object):
 
     def __init__(self, Expression, SymbolTable={}):
         self._NoProcess = False
-        if type(Expression) != type(''):
+        if not isinstance(Expression, type('')):
             self._Expr = Expression
             self._NoProcess = True
             return
@@ -297,7 +297,7 @@ class ValueExpression(object):
 
             try:
                 Token = self._GetToken()
-                if type(Token) == type('') and Token.startswith('{') and Token.endswith('}') and self._Idx >= self._Len:
+                if isinstance(Token, type('')) and Token.startswith('{') and Token.endswith('}') and self._Idx >= self._Len:
                     return self._Expr
             except BadExpression:
                 pass
@@ -307,7 +307,7 @@ class ValueExpression(object):
 
         Val = self._ConExpr()
         RealVal = Val
-        if type(Val) == type(''):
+        if isinstance(Val, type('')):
             if Val == 'L""':
                 Val = False
             elif not Val:
@@ -548,7 +548,7 @@ class ValueExpression(object):
                 Ex.Pcd = self._Token
                 raise Ex
             self._Token = ValueExpression(self._Symb[self._Token], self._Symb)(True, self._Depth+1)
-            if type(self._Token) != type(''):
+            if not isinstance(self._Token, type('')):
                 self._LiteralToken = hex(self._Token)
                 return
 
@@ -652,7 +652,7 @@ class ValueExpression(object):
                 if Ch == ')':
                     TmpValue = self._Expr[Idx :self._Idx - 1]
                     TmpValue = ValueExpression(TmpValue)(True)
-                    TmpValue = '0x%x' % int(TmpValue) if type(TmpValue) != type('') else TmpValue
+                    TmpValue = '0x%x' % int(TmpValue) if not isinstance(TmpValue, type('')) else TmpValue
                     break
             self._Token, Size = ParseFieldValue(Prefix + '(' + TmpValue + ')')
             return  self._Token
@@ -744,9 +744,9 @@ class ValueExpressionEx(ValueExpression):
             PcdValue = '0'
         if self.PcdType in ['UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN']:
             PcdValue = PcdValue.strip()
-            if type(PcdValue) == type('') and PcdValue.startswith('{') and PcdValue.endswith('}'):
+            if isinstance(PcdValue, type('')) and PcdValue.startswith('{') and PcdValue.endswith('}'):
                 PcdValue = PcdValue[1:-1].split(',')
-            if type(PcdValue) == type([]):
+            if isinstance(PcdValue, type([])):
                 TmpValue = 0
                 Size = 0
                 for Item in PcdValue:
@@ -765,14 +765,14 @@ class ValueExpressionEx(ValueExpression):
                     else:
                         ItemValue = ParseFieldValue(Item)[0]
 
-                    if type(ItemValue) == type(''):
+                    if isinstance(ItemValue, type('')):
                         ItemValue = int(ItemValue, 16) if ItemValue.startswith('0x') else int(ItemValue)
 
                     TmpValue = (ItemValue << (Size * 8)) | TmpValue
                     Size = Size + ItemSize
             else:
                 TmpValue, Size = ParseFieldValue(PcdValue)
-            if type(TmpValue) == type(''):
+            if isinstance(TmpValue, type('')):
                 TmpValue = int(TmpValue)
             else:
                 PcdValue = '0x%0{}X'.format(Size) % (TmpValue)
@@ -818,13 +818,13 @@ class ValueExpressionEx(ValueExpression):
                     else:
                         ListItem = PcdValue.split(',')
 
-                    if type(ListItem) == type(0):
+                    if isinstance(ListItem, type(0)):
                         for Index in range(0, Size):
                             ValueStr += '0x%02X' % (int(ListItem) & 255)
                             ListItem >>= 8
                             ValueStr += ', '
                             PcdValue = '{' + ValueStr[:-2] + '}'
-                    elif type(ListItem) == type(''):
+                    elif isinstance(ListItem, type('')):
                         if ListItem.startswith('{') and ListItem.endswith('}'):
                             PcdValue = ListItem
                     else:
@@ -861,7 +861,7 @@ class ValueExpressionEx(ValueExpression):
                             else:
                                 ItemSize = 0
                             TmpValue = ValueExpressionEx(Item, self.PcdType, self._Symb)(True)
-                            Item = '0x%x' % TmpValue if type(TmpValue) != type('') else TmpValue
+                            Item = '0x%x' % TmpValue if not isinstance(TmpValue, type('')) else TmpValue
                             if ItemSize == 0:
                                 ItemValue, ItemSize = ParseFieldValue(Item)
                             else:
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 7a7f3f80c65a..b3d31b07256c 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1508,9 +1508,9 @@ def ParseDevPathValue (Value):
     return '{' + out + '}', Size
 
 def ParseFieldValue (Value):
-    if type(Value) == type(0):
+    if isinstance(Value, type(0)):
         return Value, (Value.bit_length() + 7) / 8
-    if type(Value) != type(''):
+    if not isinstance(Value, type('')):
         raise BadExpression('Type %s is %s' %(Value, type(Value)))
     Value = Value.strip()
     if Value.startswith('UINT8') and Value.endswith(')'):
@@ -1834,8 +1834,7 @@ def CheckPcdDatum(Type, Value):
             Printset.add(TAB_PRINTCHAR_BS)
             Printset.add(TAB_PRINTCHAR_NUL)
             if not set(Value).issubset(Printset):
-                PrintList = list(Printset)
-                PrintList.sort()
+                PrintList = sorted(Printset)
                 return False, "Invalid PCD string value of type [%s]; must be printable chars %s." % (Type, PrintList)
     elif Type == 'BOOLEAN':
         if Value not in ['TRUE', 'True', 'true', '0x1', '0x01', '1', 'FALSE', 'False', 'false', '0x0', '0x00', '0']:
@@ -1997,7 +1996,7 @@ class PathClass(object):
     # @retval True  The two PathClass are the same
     #
     def __eq__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             return self.Path == Other.Path
         else:
             return self.Path == str(Other)
@@ -2010,7 +2009,7 @@ class PathClass(object):
     # @retval -1    The first PathClass is less than the second PathClass
     # @retval 1     The first PathClass is Bigger than the second PathClass
     def __cmp__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             OtherKey = Other.Path
         else:
             OtherKey = str(Other)
@@ -2256,7 +2255,7 @@ class SkuClass():
             return ["DEFAULT"]
         skulist = [sku]
         nextsku = sku
-        while 1:
+        while True:
             nextsku = self.GetNextSkuId(nextsku)
             skulist.append(nextsku)
             if nextsku == "DEFAULT":
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 496961554e87..1bf3adab1e1d 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -106,7 +106,7 @@ class XOROperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "XOR ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId = str(uuid.uuid1())
@@ -120,7 +120,7 @@ class LEOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "LE ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -132,7 +132,7 @@ class LTOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable):
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "LT ..." 
             raise BadExpression(ERR_SNYTAX % Expr) 
         rangeId1 = str(uuid.uuid1())
@@ -145,7 +145,7 @@ class GEOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "GE ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -158,7 +158,7 @@ class GTOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "GT ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -171,7 +171,7 @@ class EQOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "EQ ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -370,7 +370,7 @@ class RangeExpression(object):
 
     def __init__(self, Expression, PcdDataType, SymbolTable = {}):
         self._NoProcess = False
-        if type(Expression) != type(''):
+        if not isinstance(Expression, type('')):
             self._Expr = Expression
             self._NoProcess = True
             return
@@ -591,7 +591,7 @@ class RangeExpression(object):
                 Ex.Pcd = self._Token
                 raise Ex
             self._Token = RangeExpression(self._Symb[self._Token], self._Symb)(True, self._Depth + 1)
-            if type(self._Token) != type(''):
+            if not isinstance(self._Token, type('')):
                 self._LiteralToken = hex(self._Token)
                 return
 
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index 358e7b8d7c31..d2ec46d84eb8 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -246,7 +246,7 @@ def SplitModuleType(Key):
 def ReplaceMacros(StringList, MacroDefinitions={}, SelfReplacement=False):
     NewList = []
     for String in StringList:
-        if type(String) == type(''):
+        if isinstance(String, type('')):
             NewList.append(ReplaceMacro(String, MacroDefinitions, SelfReplacement))
         else:
             NewList.append(String)
@@ -782,7 +782,7 @@ def RemoveBlockComment(Lines):
 # Get String of a List
 #
 def GetStringOfList(List, Split=' '):
-    if type(List) != type([]):
+    if not isinstance(List, type([])):
         return List
     Str = ''
     for Item in List:
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 3408cff8d75e..9c1e6b407356 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -159,7 +159,7 @@ class TargetTxtClassObject(object):
     # @param key:   The key of the item to be printed
     #
     def printList(Key, List):
-        if type(List) == type([]):
+        if isinstance(List, type([])):
             if len(List) > 0:
                 if Key.find(TAB_SPLIT) != -1:
                     print("\n" + Key)
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 6dab179efc01..d3587b171192 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -155,7 +155,7 @@ class ToolDefClassObject(object):
                             if ErrorCode != 0:
                                 EdkLogger.error("tools_def.txt parser", FILE_NOT_FOUND, ExtraData=IncFile)
 
-                    if type(IncFileTmp) is PathClass:
+                    if isinstance(IncFileTmp, PathClass):
                         IncFile = IncFileTmp.Path
                     else:
                         IncFile = IncFileTmp
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index d59697c64b68..96d906ae2b3a 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -128,8 +128,7 @@ class VpdInfoFile:
                             "Invalid parameter FilePath: %s." % FilePath)        
 
         Content = FILE_COMMENT_TEMPLATE
-        Pcds = self._VpdArray.keys()
-        Pcds.sort()
+        Pcds = sorted(self._VpdArray.keys())
         for Pcd in Pcds:
             i = 0
             PcdTokenCName = Pcd.TokenCName
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 145c7435cd12..605a1d847c61 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -69,7 +69,7 @@ def ParseMacro(Parser):
         self._ItemType = MODEL_META_DATA_DEFINE
         # DEFINE defined macros
         if Type == TAB_DSC_DEFINES_DEFINE:
-            if type(self) == DecParser:
+            if isinstance(self, DecParser):
                 if MODEL_META_DATA_HEADER in self._SectionType:
                     self._FileLocalMacros[Name] = Value
                 else:
@@ -84,7 +84,7 @@ def ParseMacro(Parser):
                 SectionLocalMacros = self._SectionsMacroDict[SectionDictKey]
                 SectionLocalMacros[Name] = Value
         # EDK_GLOBAL defined macros
-        elif type(self) != DscParser:
+        elif not isinstance(self, DscParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL can only be used in .dsc file",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
         elif self._SectionType != MODEL_META_DATA_HEADER:
@@ -216,7 +216,7 @@ class MetaFileParser(object):
     #   DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
     #
     def __getitem__(self, DataInfo):
-        if type(DataInfo) != type(()):
+        if not isinstance(DataInfo, type(())):
             DataInfo = (DataInfo,)
 
         # Parse the file first, if necessary
@@ -258,7 +258,7 @@ class MetaFileParser(object):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         self._ValueList[0:len(TokenList)] = TokenList
         # Don't do macro replacement for dsc file at this point
-        if type(self) != DscParser:
+        if not isinstance(self, DscParser):
             Macros = self._Macros
             self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
 
@@ -356,7 +356,7 @@ class MetaFileParser(object):
             if os.path.exists(UniFile):
                 self._UniObj = UniParser(UniFile, IsExtraUni=False, IsModuleUni=False)
         
-        if type(self) == InfParser and self._Version < 0x00010005:
+        if isinstance(self, InfParser) and self._Version < 0x00010005:
             # EDK module allows using defines as macros
             self._FileLocalMacros[Name] = Value
         self._Defines[Name] = Value
@@ -371,7 +371,7 @@ class MetaFileParser(object):
             self._ValueList[1] = TokenList2[1]              # keys
         else:
             self._ValueList[1] = TokenList[0]
-        if len(TokenList) == 2 and type(self) != DscParser: # value
+        if len(TokenList) == 2 and not isinstance(self, DscParser): # value
             self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
 
         if self._ValueList[1].count('_') != 4:
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index eb76f4e6d54a..313fad602841 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -35,7 +35,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
         Element.appendChild(Doc.createTextNode(String))
     
     for Item in NodeList:
-        if type(Item) == type([]):
+        if isinstance(Item, type([])):
             Key = Item[0]
             Value = Item[1]
             if Key != '' and Key != None and Value != '' and Value != None:
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index ab19e30b69aa..951fe7e3be2e 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -731,7 +731,7 @@ def GetParameter(Parameter, Index = 1):
 #  @return: The name of parameter
 #
 def GetParameterName(Parameter):
-    if type(Parameter) == type('') and Parameter.startswith('&'):
+    if isinstance(Parameter, type('')) and Parameter.startswith('&'):
         return Parameter[1:].replace('{', '').replace('}', '').replace('\r', '').replace('\n', '').strip()
     else:
         return Parameter.strip()
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index bc7ef6408509..161955bc70ae 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -374,7 +374,7 @@ def CheckBuildOptionPcd():
     for Arch in GenFdsGlobalVariable.ArchList:
         PkgList  = GenFdsGlobalVariable.WorkSpace.GetPackageList(GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag)
         for i, pcd in enumerate(GlobalData.BuildOptionPcd):
-            if type(pcd) is tuple:
+            if isinstance(pcd, tuple):
                 continue
             (pcdname, pcdvalue) = pcd.split('=')
             if not pcdvalue:
@@ -842,7 +842,7 @@ class GenFds :
                         if not Name:
                             continue
 
-                        Name = ' '.join(Name) if type(Name) == type([]) else Name
+                        Name = ' '.join(Name) if isinstance(Name, type([])) else Name
                         GuidXRefFile.write("%s %s\n" %(FileStatementGuid, Name))
 
        # Append GUIDs, Protocols, and PPIs to the Xref file
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index fe74abb28901..2b6124dd4579 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -84,7 +84,7 @@ class TargetTool():
         KeyList = self.TargetTxtDictionary.keys()
         errMsg  = ''
         for Key in KeyList:
-            if type(self.TargetTxtDictionary[Key]) == type([]):
+            if isinstance(self.TargetTxtDictionary[Key], type([])):
                 print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
             elif self.TargetTxtDictionary[Key] == None:
                 errMsg += "  Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep 
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
index d39c1827ba26..53d7b2b19b52 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
@@ -123,8 +123,7 @@ def GenPcd(Package, Content):
         if Pcd.GetSupModuleList():
             Statement += GenDecTailComment(Pcd.GetSupModuleList())
 
-        ArchList = Pcd.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Pcd.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -205,8 +204,7 @@ def GenGuidProtocolPpi(Package, Content):
         #
         if Guid.GetSupModuleList():
             Statement += GenDecTailComment(Guid.GetSupModuleList())     
-        ArchList = Guid.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Guid.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -246,8 +244,7 @@ def GenGuidProtocolPpi(Package, Content):
         #
         if Protocol.GetSupModuleList():
             Statement += GenDecTailComment(Protocol.GetSupModuleList())
-        ArchList = Protocol.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Protocol.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -287,8 +284,7 @@ def GenGuidProtocolPpi(Package, Content):
         #
         if Ppi.GetSupModuleList():
             Statement += GenDecTailComment(Ppi.GetSupModuleList())
-        ArchList = Ppi.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Ppi.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -463,8 +459,7 @@ def PackageToDec(Package, DistHeader = None):
         if LibraryClass.GetSupModuleList():
             Statement += \
             GenDecTailComment(LibraryClass.GetSupModuleList())
-        ArchList = LibraryClass.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(LibraryClass.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index 4a9528b500f2..4dcdcff4f13a 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -494,8 +494,7 @@ def GenPackages(ModuleObject):
         Statement += RelaPath.replace('\\', '/')
         if FFE:
             Statement += '|' + FFE
-        ArchList = PackageDependency.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(PackageDependency.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [Statement]
@@ -514,8 +513,7 @@ def GenSources(ModuleObject):
         SourceFile = Source.GetSourceFile()
         Family = Source.GetFamily()
         FeatureFlag = Source.GetFeatureFlag()
-        SupArchList = Source.GetSupArchList()
-        SupArchList.sort()
+        SupArchList = sorted(Source.GetSupArchList())
         SortedArch = ' '.join(SupArchList)
         Statement = GenSourceStatement(ConvertPath(SourceFile), Family, FeatureFlag)
         if SortedArch in NewSectionDict:
@@ -723,8 +721,7 @@ def GenGuidSections(GuidObjList):
         #
         # merge duplicate items
         #
-        ArchList = Guid.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Guid.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if (Statement, SortedArch) in GuidDict:
             PreviousComment = GuidDict[Statement, SortedArch]
@@ -783,8 +780,7 @@ def GenProtocolPPiSections(ObjList, IsProtocol):
         #
         # merge duplicate items
         #
-        ArchList = Object.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Object.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if (Statement, SortedArch) in Dict:
             PreviousComment = Dict[Statement, SortedArch]
@@ -858,8 +854,7 @@ def GenPcdSections(ModuleObject):
             #
             # Merge duplicate entries
             #
-            ArchList = Pcd.GetSupArchList()
-            ArchList.sort()
+            ArchList = sorted(Pcd.GetSupArchList())
             SortedArch = ' '.join(ArchList)
             if (Statement, SortedArch) in Dict:
                 PreviousComment = Dict[Statement, SortedArch]
@@ -1026,8 +1021,7 @@ def GenSpecialSections(ObjectList, SectionName, UserExtensionsContent=''):
         if CommentStr and not CommentStr.endswith('\n#\n'):
             CommentStr = CommentStr + '#\n'
         NewStateMent = CommentStr + Statement
-        SupArch = Obj.GetSupArchList()
-        SupArch.sort()
+        SupArch = sorted(Obj.GetSupArchList())
         SortedArch = ' '.join(SupArch)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [NewStateMent]
@@ -1105,8 +1099,7 @@ def GenBinaries(ModuleObject):
             FileName = ConvertPath(FileNameObj.GetFilename())
             FileType = FileNameObj.GetFileType()
             FFE = FileNameObj.GetFeatureFlag()
-            ArchList = FileNameObj.GetSupArchList()
-            ArchList.sort()
+            ArchList = sorted(FileNameObj.GetSupArchList())
             SortedArch = ' '.join(ArchList)
             Key = (FileName, FileType, FFE, SortedArch)
             if Key in BinariesDict:
diff --git a/BaseTools/Source/Python/UPT/Library/Misc.py b/BaseTools/Source/Python/UPT/Library/Misc.py
index 24e0a20daf87..936db991cdf5 100644
--- a/BaseTools/Source/Python/UPT/Library/Misc.py
+++ b/BaseTools/Source/Python/UPT/Library/Misc.py
@@ -515,7 +515,7 @@ class PathClass(object):
     # Check whether PathClass are the same
     #
     def __eq__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             return self.Path == Other.Path
         else:
             return self.Path == str(Other)
@@ -820,11 +820,11 @@ def ConvertArchList(ArchList):
     if not ArchList:
         return NewArchList
 
-    if type(ArchList) == list:
+    if isinstance(ArchList, list):
         for Arch in ArchList:
             Arch = Arch.upper()
             NewArchList.append(Arch)
-    elif type(ArchList) == str:
+    elif isinstance(ArchList, str):
         ArchList = ArchList.upper()
         NewArchList.append(ArchList)
 
diff --git a/BaseTools/Source/Python/UPT/Library/ParserValidate.py b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
index 028cf9a54f84..5348073b56ba 100644
--- a/BaseTools/Source/Python/UPT/Library/ParserValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
@@ -341,7 +341,7 @@ def IsValidCFormatGuid(Guid):
                 #
                 # Index may out of bound
                 #
-                if type(List[Index]) != type(1) or \
+                if not isinstance(List[Index], type(1)) or \
                    len(Value) > List[Index] or len(Value) < 3:
                     return False
                 
diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Source/Python/UPT/Library/String.py
index de3035279f01..e6cab4650373 100644
--- a/BaseTools/Source/Python/UPT/Library/String.py
+++ b/BaseTools/Source/Python/UPT/Library/String.py
@@ -652,7 +652,7 @@ def ConvertToSqlString2(String):
 # @param Split: split character
 #
 def GetStringOfList(List, Split=' '):
-    if type(List) != type([]):
+    if not isinstance(List, type([])):
         return List
     Str = ''
     for Item in List:
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index fd02efb6bf04..05fe3b547326 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -40,7 +40,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
         Element.appendChild(Doc.createTextNode(String))
 
     for Item in NodeList:
-        if type(Item) == type([]):
+        if isinstance(Item, type([])):
             Key = Item[0]
             Value = Item[1]
             if Key != '' and Key != None and Value != '' and Value != None:
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 1e0c79d6677d..bcc5d96f9153 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -410,8 +410,7 @@ class DecPomAlignment(PackageObject):
         # 
         PackagePath = os.path.split(self.GetFullPath())[0]
         IncludePathList = \
-            [os.path.normpath(Path) + sep for Path in IncludesDict.keys()]
-        IncludePathList.sort()
+            sorted([os.path.normpath(Path) + sep for Path in IncludesDict.keys()])
         
         #
         # get a non-overlap set of include path, IncludePathList should be 
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
index a15173285345..c0e4805a3f15 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
@@ -614,8 +614,7 @@ class InfPomAlignment(ModuleObject):
                 SourceFile = Item.GetSourceFileName()
                 Family = Item.GetFamily()
                 FeatureFlag = Item.GetFeatureFlagExp()
-                SupArchList = ConvertArchList(Item.GetSupArchList())
-                SupArchList.sort()
+                SupArchList = sorted(ConvertArchList(Item.GetSupArchList()))
                 Source = SourceFileObject()
                 Source.SetSourceFile(SourceFile)
                 Source.SetFamily(Family)
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
index 042d4784c84c..9685799a0f0d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
@@ -194,8 +194,7 @@ def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObj
         # can be used for the attribute.
         # If both not have VALID_ARCHITECTURE comment and no architecturie specified, then keep it empty.
         #        
-        SupArchList = ConvertArchList(ItemObj.GetSupArchList())
-        SupArchList.sort()
+        SupArchList = sorted(ConvertArchList(ItemObj.GetSupArchList()))
         if len(SupArchList) == 1 and SupArchList[0] == 'COMMON':
             if not (len(OriSupArchList) == 1 or OriSupArchList[0] == 'COMMON'):
                 SupArchList = OriSupArchList
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index e5f1f01556e5..ee70597bc9a3 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -162,7 +162,7 @@ class StructurePcd(PcdClassObject):
         self.validateranges = PcdObject.validateranges if PcdObject.validateranges else self.validateranges
         self.validlists = PcdObject.validlists if PcdObject.validlists else self.validlists
         self.expressions = PcdObject.expressions if PcdObject.expressions else self.expressions
-        if type(PcdObject) is StructurePcd:
+        if isinstance(PcdObject, StructurePcd):
             self.StructuredPcdIncludeFile = PcdObject.StructuredPcdIncludeFile if PcdObject.StructuredPcdIncludeFile else self.StructuredPcdIncludeFile
             self.PackageDecs = PcdObject.PackageDecs if PcdObject.PackageDecs else self.PackageDecs
             self.DefaultValues = PcdObject.DefaultValues if PcdObject.DefaultValues else self.DefaultValues
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 5e61110df330..15af0c54bbe0 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -862,13 +862,13 @@ class DscBuildData(PlatformBuildClassObject):
             for pcdname in Pcds:
                 pcd = Pcds[pcdname]
                 Pcds[pcdname].SkuInfoList = {"DEFAULT":pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
-                if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
+                if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
                     Pcds[pcdname].SkuOverrideValues = {"DEFAULT":pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         else:
             for pcdname in Pcds:
                 pcd = Pcds[pcdname]
                 Pcds[pcdname].SkuInfoList = {skuid:pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
-                if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
+                if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
                     Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
     def CompleteHiiPcdsDefaultStores(self, Pcds):
@@ -1021,7 +1021,7 @@ class DscBuildData(PlatformBuildClassObject):
                             File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
         # Add the Structure PCD that only defined in DEC, don't have override in DSC file
         for Pcd in self.DecPcds:
-            if type (self._DecPcds[Pcd]) is StructurePcd:
+            if isinstance(self._DecPcds[Pcd], StructurePcd):
                 if Pcd not in S_pcd_set:
                     str_pcd_obj_str = StructurePcd()
                     str_pcd_obj_str.copy(self._DecPcds[Pcd])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index c85e3fe08649..902ed1fe338a 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -75,10 +75,10 @@ def ParseMacro(Parser):
             #
             # First judge whether this DEFINE is in conditional directive statements or not.
             #
-            if type(self) == DscParser and self._InDirective > -1:
+            if isinstance(self, DscParser) and self._InDirective > -1:
                 pass
             else:
-                if type(self) == DecParser:
+                if isinstance(self, DecParser):
                     if MODEL_META_DATA_HEADER in self._SectionType:
                         self._FileLocalMacros[Name] = Value
                     else:
@@ -89,7 +89,7 @@ def ParseMacro(Parser):
                     self._ConstructSectionMacroDict(Name, Value)
 
         # EDK_GLOBAL defined macros
-        elif type(self) != DscParser:
+        elif not isinstance(self, DscParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL can only be used in .dsc file",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         elif self._SectionType != MODEL_META_DATA_HEADER:
@@ -230,7 +230,7 @@ class MetaFileParser(object):
     #   DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
     #
     def __getitem__(self, DataInfo):
-        if type(DataInfo) != type(()):
+        if not isinstance(DataInfo, type(())):
             DataInfo = (DataInfo,)
 
         # Parse the file first, if necessary
@@ -272,7 +272,7 @@ class MetaFileParser(object):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         self._ValueList[0:len(TokenList)] = TokenList
         # Don't do macro replacement for dsc file at this point
-        if type(self) != DscParser:
+        if not isinstance(self, DscParser):
             Macros = self._Macros
             self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
 
@@ -379,7 +379,7 @@ class MetaFileParser(object):
                 EdkLogger.error('Parser', FORMAT_INVALID, "Invalid version number",
                                 ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
-        if type(self) == InfParser and self._Version < 0x00010005:
+        if isinstance(self, InfParser) and self._Version < 0x00010005:
             # EDK module allows using defines as macros
             self._FileLocalMacros[Name] = Value
         self._Defines[Name] = Value
@@ -395,7 +395,7 @@ class MetaFileParser(object):
             self._ValueList[1] = TokenList2[1]              # keys
         else:
             self._ValueList[1] = TokenList[0]
-        if len(TokenList) == 2 and type(self) != DscParser: # value
+        if len(TokenList) == 2 and not isinstance(self, DscParser): # value
             self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
 
         if self._ValueList[1].count('_') != 4:
@@ -424,7 +424,7 @@ class MetaFileParser(object):
         # DecParser SectionType is a list, will contain more than one item only in Pcd Section
         # As Pcd section macro usage is not alllowed, so here it is safe
         #
-        if type(self) == DecParser:
+        if isinstance(self, DecParser):
             SectionDictKey = self._SectionType[0], ScopeKey
         if SectionDictKey not in self._SectionsMacroDict:
             self._SectionsMacroDict[SectionDictKey] = {}
@@ -441,7 +441,7 @@ class MetaFileParser(object):
         SpeSpeMacroDict = {}
 
         ActiveSectionType = self._SectionType
-        if type(self) == DecParser:
+        if isinstance(self, DecParser):
             ActiveSectionType = self._SectionType[0]
 
         for (SectionType, Scope) in self._SectionsMacroDict:
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 3db1719c7769..3352504d502e 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1753,8 +1753,7 @@ class FdRegionReport(object):
                 for Match in gOffsetGuidPattern.finditer(FvReport):
                     Guid = Match.group(2).upper()
                     OffsetInfo[Match.group(1)] = self._GuidsDb.get(Guid, Guid)
-                OffsetList = OffsetInfo.keys()
-                OffsetList.sort()
+                OffsetList = sorted(OffsetInfo.keys())
                 for Offset in OffsetList:
                     FileWrite (File, "%s %s" % (Offset, OffsetInfo[Offset]))
             except IOError:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index c64e8f265a97..f77924137665 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -303,7 +303,7 @@ def LaunchCommand(Command, WorkingDir):
         if EndOfProcedure != None:
             EndOfProcedure.set()
         if Proc == None:
-            if type(Command) != type(""):
+            if not isinstance(Command, type("")):
                 Command = " ".join(Command)
             EdkLogger.error("build", COMMAND_FAILURE, "Failed to start command", ExtraData="%s [%s]" % (Command, WorkingDir))
 
@@ -314,7 +314,7 @@ def LaunchCommand(Command, WorkingDir):
 
     # check the return code of the program
     if Proc.returncode != 0:
-        if type(Command) != type(""):
+        if not isinstance(Command, type("")):
             Command = " ".join(Command)
         # print out the Response file and its content when make failure
         RespFile = os.path.join(WorkingDir, 'OUTPUT', 'respfilelist.txt')
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 3bf524123d0f..6a805ce51885 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -258,9 +258,9 @@ class SourceFiles:
             replaceables = ('extract-dir', 'filename', 'url')
             for replaceItem in fdata:
                 if replaceItem in replaceables: continue
-                if type(fdata[replaceItem]) != str: continue
+                if not isinstance(fdata[replaceItem], str): continue
                 for replaceable in replaceables:
-                    if type(fdata[replaceable]) != str: continue
+                    if not isinstance(fdata[replaceable], str): continue
                     if replaceable in fdata:
                         fdata[replaceable] = \
                             fdata[replaceable].replace(
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* [PATCH 15/15] BaseTools: Replace StringIO.StringIO with io.BytesIO
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (13 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 14/15] BaseTools: Adjust old python2 idioms Gary Lin
@ 2018-01-19  4:43 ` Gary Lin
  2018-01-25 13:37 ` [PATCH 00/15] BaseTools: One step toward python3 Zhu, Yonghong
  15 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-19  4:43 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace StringIO.StringIO with io.BytesIO to be compatible with python3.
This commit also removes "import StringIO" from those python scripts
that don't really use it.
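
For reference, io.BytesIO exists on both python2 and python3 and carries
bytes, which is what these buffers actually hold (struct.pack output and
raw file contents), while StringIO.StringIO is python2-only. A minimal
illustrative sketch, not taken from the tree:

    import io
    import struct

    buf = io.BytesIO()
    buf.write(struct.pack('<I', 0x11))  # struct.pack returns bytes on py2 and py3
    buf.write(b'\x22\x33')              # bytes literals are accepted by both
    assert buf.getvalue() == b'\x11\x00\x00\x00\x22\x33'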

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/ConvertUni.py                            |  5 -----
 BaseTools/Source/Python/AutoGen/AutoGen.py                 | 10 +++++-----
 BaseTools/Source/Python/AutoGen/GenDepex.py                |  4 ++--
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                |  4 ++--
 BaseTools/Source/Python/AutoGen/IdfClassObject.py          |  1 -
 BaseTools/Source/Python/AutoGen/StrGather.py               |  4 ++--
 BaseTools/Source/Python/AutoGen/UniClassObject.py          |  6 +++---
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py |  4 ++--
 BaseTools/Source/Python/BPDG/GenVpd.py                     |  6 +++---
 BaseTools/Source/Python/Eot/FvImage.py                     |  1 -
 BaseTools/Source/Python/GenFds/AprioriSection.py           |  4 ++--
 BaseTools/Source/Python/GenFds/Capsule.py                  | 10 +++++-----
 BaseTools/Source/Python/GenFds/CapsuleData.py              |  4 ++--
 BaseTools/Source/Python/GenFds/Fd.py                       |  6 +++---
 BaseTools/Source/Python/GenFds/FfsFileStatement.py         |  4 ++--
 BaseTools/Source/Python/GenFds/FfsInfStatement.py          |  4 ++--
 BaseTools/Source/Python/GenFds/Fv.py                       |  6 +++---
 BaseTools/Source/Python/GenFds/FvImageSection.py           |  4 ++--
 BaseTools/Source/Python/GenFds/GenFds.py                   |  8 ++++----
 BaseTools/Source/Python/GenFds/OptionRom.py                |  3 ---
 BaseTools/Source/Python/GenFds/Region.py                   | 11 ++++++-----
 BaseTools/Source/Python/Trim/Trim.py                       |  6 +++---
 BaseTools/Source/Python/build/BuildReport.py               |  4 ++--
 BaseTools/Source/Python/build/build.py                     |  8 ++++----
 24 files changed, 59 insertions(+), 68 deletions(-)

diff --git a/BaseTools/Scripts/ConvertUni.py b/BaseTools/Scripts/ConvertUni.py
index 2af55dfc6702..67bbe41b1f18 100755
--- a/BaseTools/Scripts/ConvertUni.py
+++ b/BaseTools/Scripts/ConvertUni.py
@@ -23,11 +23,6 @@ import codecs
 import os
 import sys
 
-try:
-    from io import StringIO
-except ImportError:
-    from StringIO import StringIO
-
 class ConvertOneArg:
     """Converts utf-16 to utf-8 for one command line argument.
 
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 8b91904a289a..1dfd5e0fb76b 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -24,7 +24,7 @@ import uuid
 import GenC
 import GenMake
 import GenDepex
-from StringIO import StringIO
+from io import BytesIO
 
 from StrGather import *
 from BuildEngine import BuildRule
@@ -3683,8 +3683,8 @@ class ModuleAutoGen(AutoGen):
     def _GetAutoGenFileList(self):
         UniStringAutoGenC = True
         IdfStringAutoGenC = True
-        UniStringBinBuffer = StringIO()
-        IdfGenBinBuffer = StringIO()
+        UniStringBinBuffer = BytesIO()
+        IdfGenBinBuffer = BytesIO()
         if self.BuildType == 'UEFI_HII':
             UniStringAutoGenC = False
             IdfStringAutoGenC = False
@@ -3968,8 +3968,8 @@ class ModuleAutoGen(AutoGen):
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
 
-        # Use a instance of StringIO to cache data
-        fStringIO = StringIO('')  
+        # Use a instance of BytesIO to cache data
+        fStringIO = BytesIO('')
 
         for Item in VfrUniOffsetList:
             if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 0f6a1700f541..bb516b651266 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -17,7 +17,7 @@ import Common.LongFilePathOs as os
 import re
 import traceback
 from Common.LongFilePathSupport import OpenLongFilePath as open
-from StringIO import StringIO
+from io import BytesIO
 from struct import pack
 from Common.BuildToolError import *
 from Common.Misc import SaveFileOnChange
@@ -344,7 +344,7 @@ class DependencyExpression:
     #   @retval False   If file exists and is not changed.
     #
     def Generate(self, File=None):
-        Buffer = StringIO()
+        Buffer = BytesIO()
         if len(self.PostfixNotation) == 0:
             return False
 
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index f158b999d89b..32b0150bc78b 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 from builtins import range
-from StringIO import StringIO
+from io import BytesIO
 from Common.Misc import *
 from Common.String import StringToArray
 from struct import pack
@@ -976,7 +976,7 @@ def CreatePcdDatabaseCode (Info, AutoGenC, AutoGenH):
         DbFileName = os.path.join(Info.PlatformInfo.BuildDir, "FV", Phase + "PcdDataBase.raw")
     else:
         DbFileName = os.path.join(Info.OutputDir, Phase + "PcdDataBase.raw")
-    DbFile = StringIO()
+    DbFile = BytesIO()
     DbFile.write(PcdDbBuffer)
     Changed = SaveFileOnChange(DbFileName, DbFile.getvalue(), True)
 def CreatePcdDataBase(PcdDBData):
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index d6d4703370aa..db1e5ee6a32d 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -14,7 +14,6 @@
 # Import Modules
 #
 import Common.EdkLogger as EdkLogger
-import StringIO
 from Common.BuildToolError import *
 from Common.String import GetLineNo
 from Common.Misc import PathClass
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index 718cd60514b4..b61450c02831 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -19,7 +19,7 @@ import re
 import Common.EdkLogger as EdkLogger
 from Common.BuildToolError import *
 from UniClassObject import *
-from StringIO import StringIO
+from io import BytesIO
 from struct import pack, unpack
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
@@ -382,7 +382,7 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
         if Language not in UniLanguageListFiltered:
             continue
         
-        StringBuffer = StringIO()
+        StringBuffer = BytesIO()
         StrStringValue = ''
         ArrayLength = 0
         NumberOfUseOtherLangDef = 0
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index cab7623bc4e6..5c4ccd7a8b77 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -21,7 +21,7 @@ from builtins import range
 import Common.LongFilePathOs as os, codecs, re
 import distutils.util
 import Common.EdkLogger as EdkLogger
-import StringIO
+from io import BytesIO
 from Common.BuildToolError import *
 from Common.String import GetLineNo
 from Common.Misc import PathClass
@@ -308,7 +308,7 @@ class UniFileClassObject(object):
 
         self.VerifyUcs2Data(FileIn, FileName, Encoding)
 
-        UniFile = StringIO.StringIO(FileIn)
+        UniFile = BytesIO(FileIn)
         Info = codecs.lookup(Encoding)
         (Reader, Writer) = (Info.streamreader, Info.streamwriter)
         return codecs.StreamReaderWriter(UniFile, Reader, Writer)
@@ -322,7 +322,7 @@ class UniFileClassObject(object):
             FileDecoded = codecs.decode(FileIn, Encoding)
             Ucs2Info.encode(FileDecoded)
         except:
-            UniFile = StringIO.StringIO(FileIn)
+            UniFile = BytesIO(FileIn)
             Info = codecs.lookup(Encoding)
             (Reader, Writer) = (Info.streamreader, Info.streamwriter)
             File = codecs.StreamReaderWriter(UniFile, Reader, Writer)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index ff355d05d79f..60027390e820 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -19,7 +19,7 @@ from builtins import range
 import os
 from Common.RangeExpression import RangeExpression
 from Common.Misc import *
-from StringIO import StringIO
+from io import BytesIO
 from struct import pack
 
 class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
@@ -181,7 +181,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                             Buffer += b
                             realLength += 1
         
-        DbFile = StringIO()
+        DbFile = BytesIO()
         if Phase == 'DXE' and os.path.exists(BinFilePath):
             BinFile = open(BinFilePath, "rb")
             BinBuffer = BinFile.read()
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 17ca9e411061..0cc6f320aa16 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -15,7 +15,7 @@
 
 from builtins import range
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import StringTable as st
 import array
 import re
@@ -660,8 +660,8 @@ class GenVPD :
             # Open failed
             EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.MapFileName, None)
 
-        # Use a instance of StringIO to cache data
-        fStringIO = StringIO.StringIO('')
+        # Use a instance of BytesIO to cache data
+        fStringIO = BytesIO('')
 
         # Write the header of map file.
         try :
diff --git a/BaseTools/Source/Python/Eot/FvImage.py b/BaseTools/Source/Python/Eot/FvImage.py
index 64a27217e4a8..0a1eca1ed86f 100644
--- a/BaseTools/Source/Python/Eot/FvImage.py
+++ b/BaseTools/Source/Python/Eot/FvImage.py
@@ -24,7 +24,6 @@ import codecs
 import copy
 
 from UserDict import IterableUserDict
-from cStringIO import StringIO
 from array import array
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from CommonDataClass import *
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index b678079b3785..65919270af15 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -18,7 +18,7 @@
 from builtins import range
 from struct import *
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import FfsFileStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import AprioriSectionClassObject
@@ -51,7 +51,7 @@ class AprioriSection (AprioriSectionClassObject):
     def GenFfs (self, FvName, Dict = {}, IsMakefile = False):
         DXE_GUID = "FC510EE7-FFDC-11D4-BD41-0080C73C8881"
         PEI_GUID = "1B45CC0A-156A-428A-AF62-49864DA0E6E6"
-        Buffer = StringIO.StringIO('')
+        Buffer = BytesIO('')
         AprioriFileGuid = DXE_GUID
         if self.AprioriType == "PEI":
             AprioriFileGuid = PEI_GUID
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index e03d78995737..60019195df27 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -19,7 +19,7 @@ from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import CapsuleClassObject
 import Common.LongFilePathOs as os
 import subprocess
-import StringIO
+from io import BytesIO
 from Common.Misc import SaveFileOnChange
 from GenFds import GenFds
 from Common.Misc import PackRegistryFormatGuid
@@ -66,7 +66,7 @@ class Capsule (CapsuleClassObject) :
         #     UINT32            CapsuleImageSize;
         # } EFI_CAPSULE_HEADER;
         #
-        Header = StringIO.StringIO()
+        Header = BytesIO()
         #
         # Use FMP capsule GUID: 6DCBD5ED-E82D-4C44-BDA1-7194199AD92A
         #
@@ -97,7 +97,7 @@ class Capsule (CapsuleClassObject) :
         #     // UINT64 ItemOffsetList[];
         # } EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER;
         #
-        FwMgrHdr = StringIO.StringIO()
+        FwMgrHdr = BytesIO()
         if 'CAPSULE_HEADER_INIT_VERSION' in self.TokensDict:
             FwMgrHdr.write(pack('=I', int(self.TokensDict['CAPSULE_HEADER_INIT_VERSION'], 16)))
         else:
@@ -132,7 +132,7 @@ class Capsule (CapsuleClassObject) :
         #
 
         PreSize = FwMgrHdrSize
-        Content = StringIO.StringIO()
+        Content = BytesIO()
         for driver in self.CapsuleDataList:
             FileName = driver.GenCapsuleSubItem()
             FwMgrHdr.write(pack('=Q', PreSize))
@@ -247,7 +247,7 @@ class Capsule (CapsuleClassObject) :
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiCapsuleName +  "_Cap" + '.inf')
-        CapInfFile = StringIO.StringIO() #open (self.CapInfFileName , 'w+')
+        CapInfFile = BytesIO() #open (self.CapInfFileName , 'w+')
 
         CapInfFile.writelines("[options]" + T_CHAR_LF)
 
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index 1fa202149b25..f0a55d81120b 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -17,7 +17,7 @@
 #
 import Ffs
 from GenFdsGlobalVariable import GenFdsGlobalVariable
-import StringIO
+from io import BytesIO
 from struct import pack
 import os
 from Common.Misc import SaveFileOnChange
@@ -82,7 +82,7 @@ class CapsuleFv (CapsuleData):
         if self.FvName.find('.fv') == -1:
             if self.FvName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
                 FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
-                FdBuffer = StringIO.StringIO('')
+                FdBuffer = BytesIO('')
                 FvObj.CapsuleName = self.CapsuleName
                 FvFile = FvObj.AddToBuffer(FdBuffer)
                 FvObj.CapsuleName = None
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index 21060625217e..acd73f6449f6 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -18,7 +18,7 @@
 import Region
 import Fv
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import sys
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -74,7 +74,7 @@ class FD(FDClassObject):
                 HasCapsuleRegion = True
                 break
         if HasCapsuleRegion:
-            TempFdBuffer = StringIO.StringIO('')
+            TempFdBuffer = BytesIO('')
             PreviousRegionStart = -1
             PreviousRegionSize = 1
 
@@ -103,7 +103,7 @@ class FD(FDClassObject):
                 GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
                 RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
         
-        FdBuffer = StringIO.StringIO('')
+        FdBuffer = BytesIO('')
         PreviousRegionStart = -1
         PreviousRegionSize = 1
         for RegionObj in self.RegionList :
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index cbfea730ef18..1293c8a107f0 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -19,7 +19,7 @@ from builtins import range
 import Ffs
 import Rule
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import subprocess
 
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -83,7 +83,7 @@ class FileStatement (FileStatementClassObject) :
         Dict.update(self.DefineVarDict)
         SectionAlignments = None
         if self.FvName != None :
-            Buffer = StringIO.StringIO('')
+            Buffer = BytesIO('')
             if self.FvName.upper() not in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
                 EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (self.FvName))
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index d4354171ab5e..29e1837160bc 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -18,7 +18,7 @@
 #
 import Rule
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import Ffs
@@ -1090,7 +1090,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     def __GenUniVfrOffsetFile(self, VfrUniOffsetList, UniVfrOffsetFileName):
 
         # Use a instance of StringIO to cache data
-        fStringIO = StringIO.StringIO('')  
+        fStringIO = BytesIO('')
         
         for Item in VfrUniOffsetList:
             if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index c64c0c80e299..88a520998eae 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -18,7 +18,7 @@
 from builtins import range
 import Common.LongFilePathOs as os
 import subprocess
-import StringIO
+from io import BytesIO
 from struct import *
 
 import Ffs
@@ -268,7 +268,7 @@ class FV (FvClassObject):
         #
         self.InfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiFvName + '.inf')
-        self.FvInfFile = StringIO.StringIO()
+        self.FvInfFile = BytesIO()
 
         #
         # Add [Options]
@@ -427,7 +427,7 @@ class FV (FvClassObject):
             #
             if TotalSize > 0:
                 FvExtHeaderFileName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiFvName + '.ext')
-                FvExtHeaderFile = StringIO.StringIO()
+                FvExtHeaderFile = BytesIO()
                 FvExtHeaderFile.write(Buffer)
                 Changed = SaveFileOnChange(FvExtHeaderFileName, FvExtHeaderFile.getvalue(), True)
                 FvExtHeaderFile.close()
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index ac5d5891df70..7416ce1b7d8a 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -16,7 +16,7 @@
 # Import Modules
 #
 import Section
-import StringIO
+from io import BytesIO
 from Ffs import Ffs
 import subprocess
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -97,7 +97,7 @@ class FvImageSection(FvImageSectionClassObject):
         # Generate Fv
         #
         if self.FvName != None:
-            Buffer = StringIO.StringIO('')
+            Buffer = BytesIO('')
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName)
             if Fv != None:
                 self.Fv = Fv
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 161955bc70ae..72b47abc352e 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -29,7 +29,7 @@ from Workspace.BuildClassObject import PcdClassObject
 from Workspace.BuildClassObject import ModuleBuildClassObject
 import RuleComplexFile
 from EfiSection import EfiSection
-import StringIO
+from io import BytesIO
 import Common.TargetTxtClassObject as TargetTxtClassObject
 import Common.ToolDefClassObject as ToolDefClassObject
 from Common.DataType import *
@@ -593,13 +593,13 @@ class GenFds :
         if GenFds.OnlyGenerateThisFv != None and GenFds.OnlyGenerateThisFv.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
             FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(GenFds.OnlyGenerateThisFv.upper())
             if FvObj != None:
-                Buffer = StringIO.StringIO()
+                Buffer = BytesIO()
                 FvObj.AddToBuffer(Buffer)
                 Buffer.close()
                 return
         elif GenFds.OnlyGenerateThisFv == None:
             for FvName in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
-                Buffer = StringIO.StringIO('')
+                Buffer = BytesIO('')
                 FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[FvName]
                 FvObj.AddToBuffer(Buffer)
                 Buffer.close()
@@ -751,7 +751,7 @@ class GenFds :
 
     def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
         GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
-        GuidXRefFile = StringIO.StringIO('')
+        GuidXRefFile = BytesIO('')
         GuidDict = {}
         ModuleList = []
         FileGuidList = []
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index 2e61a38c1d33..946cdf812a24 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -17,7 +17,6 @@
 #
 import Common.LongFilePathOs as os
 import subprocess
-import StringIO
 
 import OptRomInfStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -138,5 +137,3 @@ class OverrideAttribs:
         self.PciDeviceId = None
         self.PciRevision = None
         self.NeedCompress = None
-        
-        
\ No newline at end of file
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 5b9b203cf475..6ace73abe904 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -18,7 +18,7 @@
 from builtins import range
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
-import StringIO
+from io import BytesIO
 import string
 from CommonDataClass.FdfClass import RegionClassObject
 import Common.LongFilePathOs as os
@@ -127,7 +127,7 @@ class Region(RegionClassObject):
                         if self.FvAddress % FvAlignValue != 0:
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "FV (%s) is NOT %s Aligned!" % (FvObj.UiFvName, FvObj.FvAlignment))
-                        FvBuffer = StringIO.StringIO('')
+                        FvBuffer = BytesIO('')
                         FvBaseAddress = '0x%X' % self.FvAddress
                         BlockSize = None
                         BlockNum = None
@@ -135,7 +135,8 @@ class Region(RegionClassObject):
                         if Flag:
                             continue
 
-                        if FvBuffer.len > Size:
+                        FvBufferLen = len(FvBuffer.getvalue())
+                        if FvBufferLen > Size:
                             FvBuffer.close()
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "Size of FV (%s) is larger than Region Size 0x%X specified." % (RegionData, Size))
@@ -144,8 +145,8 @@ class Region(RegionClassObject):
                         #
                         Buffer.write(FvBuffer.getvalue())
                         FvBuffer.close()
-                        FvOffset = FvOffset + FvBuffer.len
-                        Size = Size - FvBuffer.len
+                        FvOffset = FvOffset + FvBufferLen
+                        Size = Size - FvBufferLen
                         continue
                     else:
                         EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (RegionData))
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index af1bf9de3e00..87edfbe31fbf 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -18,7 +18,7 @@ from builtins import range
 import Common.LongFilePathOs as os
 import sys
 import re
-import StringIO
+from io import BytesIO
 
 from optparse import OptionParser
 from optparse import make_option
@@ -455,8 +455,8 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
 
-    # Use a instance of StringIO to cache data
-    fStringIO = StringIO.StringIO('')
+    # Use a instance of BytesIO to cache data
+    fStringIO = BytesIO('')
 
     for Item in VfrUniOffsetList:
         if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 3352504d502e..096d3ba8da14 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -28,7 +28,7 @@ import hashlib
 import subprocess
 import threading
 from datetime import datetime
-from StringIO import StringIO
+from io import BytesIO
 from Common import EdkLogger
 from Common.Misc import SaveFileOnChange
 from Common.Misc import GuidStructureByteArrayToGuidString
@@ -2057,7 +2057,7 @@ class BuildReport(object):
     def GenerateReport(self, BuildDuration, AutoGenTime, MakeTime, GenFdsTime):
         if self.ReportFile:
             try:
-                File = StringIO('')
+                File = BytesIO('')
                 for (Wa, MaList) in self.ReportList:
                     PlatformReport(Wa, MaList, self.ReportType).GenerateReport(File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, self.ReportType)
                 Content = FileLinesSplit(File.getvalue(), gLineMaxLength)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index f77924137665..4c40597117a9 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -19,7 +19,7 @@
 from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
-import StringIO
+from io import BytesIO
 import sys
 import glob
 import time
@@ -1778,7 +1778,7 @@ class Build():
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
 
-                    MapBuffer = StringIO('')
+                    MapBuffer = BytesIO('')
                     if self.LoadFixAddress != 0:
                         #
                         # Rebase module to the preferred memory address before GenFds
@@ -1928,7 +1928,7 @@ class Build():
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
 
-                    MapBuffer = StringIO('')
+                    MapBuffer = BytesIO('')
                     if self.LoadFixAddress != 0:
                         #
                         # Rebase module to the preferred memory address before GenFds
@@ -2116,7 +2116,7 @@ class Build():
                     #
                     # Rebase module to the preferred memory address before GenFds
                     #
-                    MapBuffer = StringIO('')
+                    MapBuffer = BytesIO('')
                     if self.LoadFixAddress != 0:
                         self._CollectModuleMapBuffer(MapBuffer, ModuleList)
 
-- 
2.15.1



^ permalink raw reply related	[flat|nested] 18+ messages in thread

* Re: [PATCH 00/15] BaseTools: One step toward python3
  2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
                   ` (14 preceding siblings ...)
  2018-01-19  4:43 ` [PATCH 15/15] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
@ 2018-01-25 13:37 ` Zhu, Yonghong
  2018-01-26  2:01   ` Gary Lin
  15 siblings, 1 reply; 18+ messages in thread
From: Zhu, Yonghong @ 2018-01-25 13:37 UTC (permalink / raw)
  To: Gary Lin, edk2-devel@lists.01.org; +Cc: Gao, Liming, Zhu, Yonghong

Hi Gary,

Thanks for your patches. I am still evaluating these changes and doing some verification on them.
I still need a few more days to give you comments. Thanks.

Best Regards,
Zhu Yonghong


-----Original Message-----
From: Gary Lin [mailto:glin@suse.com] 
Sent: Friday, January 19, 2018 12:43 PM
To: edk2-devel@lists.01.org
Cc: Zhu, Yonghong <yonghong.zhu@intel.com>; Gao, Liming <liming.gao@intel.com>
Subject: [PATCH 00/15] BaseTools: One step toward python3

Since python2 will be EOF in 2020, we start to evaluate the impact of the python2 removal. As expected, OMVF building failed the test. It's actually a task noted in the wiki page:

https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support

Maybe it's time to convert the python scripts gradully.

This patchset doesn't make the python scripts in BaseTools compatible with python3 immediately. It aims to do the trivial and safe conversion and replacement to make some statements compatible with both python2 and python3, so we can deal with the difficult cases later.

With the help of "futurize" from python-future, it's easier to refactor the statements. This patchset is basically equivalent to "futurize -1"
plus "StringIO.StringIO => io.BytesIO" and minus "fix_absolute_import".
The reason to skip "fix_absolute_import" is that python2 failed to find some modules after converting to absolute import, and it might take time to figure out a proper fix.

For the "io.BytesIO" change, it MIGHT introduce slow down to the build time since io.BytesIO is slower than StringIO.StringIO in python2(*).
For a quick test, I built OVMF with the following command based on
8ab0bd2397c9d3922e0c7dbb1aa6f7e08799079f:

$ rm -rf Build && make -C BaseTools/ clean $ time ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
                          -D NETWORK_IP6_ENABLE \
                          -D HTTP_BOOT_ENABLE \
                          -D TLS_ENABLE

Before io.BytesIO:

  Build total time: 00:03:56
  real    4m22.991s
  user    3m55.874s
  sys     0m27.250s

After io.BytesIO:

  Build total time: 00:03:57
  real    4m23.953s
  user    3m57.526s
  sys     0m27.192s

The difference is only 1 second, and I would say the impact is subtle. 

The next step will be fixing relative import and maybe applying more futurize fixes. We won't get there soon but at least we are moving... 

(*) https://stackoverflow.com/questions/37462075/confusing-about-stringio-cstringio-and-byteio

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>

Gary Lin (15):
  BaseTools: Refactor python except statements
  BaseTools: Refactor python print statements
  BaseTools: Remove the old python "not-equal"
  BaseTools: Use the python3-range functions
  BaseTools: Remove tuple parameter in python scripts
  BaseTools: Remove the deprecated hash_key()
  BaseTools: Import reduce() from functools
  BaseTools: Replace StandardError with Expression
  BaseTools: Remove types.TypeType
  BaseTools: Refactor python raise statement
  BaseTools: Adjust the spaces around commas and colons
  BaseTools: Migrate to the new octal literal
  BaseTools: Unify long int and int in python scripts
  BaseTools: Adjust old python2 idioms
  BaseTools: Replace StringIO.StringIO with io.BytesIO

 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                      |   5 +-
 BaseTools/Scripts/BinToPcd.py                                          |  46 +++---
 BaseTools/Scripts/ConvertMasmToNasm.py                                 |   1 +
 BaseTools/Scripts/ConvertUni.py                                        |   5 -
 BaseTools/Scripts/MemoryProfileSymbolGen.py                            |  22 +--
 BaseTools/Scripts/PatchCheck.py                                        |   7 +-
 BaseTools/Scripts/RunMakefile.py                                       |   2 +-
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                        |  20 +--
 BaseTools/Scripts/UpdateBuildVersions.py                               |  18 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |  91 +++++-----
 BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  38 +++--
 BaseTools/Source/Python/AutoGen/GenC.py                                |   5 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                            |   8 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                             |   8 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                            | 142 ++++++++--------
 BaseTools/Source/Python/AutoGen/GenVar.py                              | 165 +++++++++----------
 BaseTools/Source/Python/AutoGen/IdfClassObject.py                      |   1 -
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                    |   1 +
 BaseTools/Source/Python/AutoGen/StrGather.py                           |   5 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                      |  18 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py             |  10 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                   |   3 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                 |  25 +--
 BaseTools/Source/Python/Common/DataType.py                             |   4 +-
 BaseTools/Source/Python/Common/DecClassObject.py                       |  39 ++---
 BaseTools/Source/Python/Common/Dictionary.py                           |   9 +-
 BaseTools/Source/Python/Common/DscClassObject.py                       |  70 ++++----
 BaseTools/Source/Python/Common/EdkIIWorkspace.py                       |  25 +--
 BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py                  | 133 +++++++--------
 BaseTools/Source/Python/Common/Expression.py                           |  81 ++++-----
 BaseTools/Source/Python/Common/FdfClassObject.py                       |   1 +
 BaseTools/Source/Python/Common/FdfParserLite.py                        |  47 +++---
 BaseTools/Source/Python/Common/InfClassObject.py                       | 113 ++++++-------
 BaseTools/Source/Python/Common/LongFilePathOs.py                       |   2 +-
 BaseTools/Source/Python/Common/MigrationUtilities.py                   |   1 +
 BaseTools/Source/Python/Common/Misc.py                                 |  70 ++++----
 BaseTools/Source/Python/Common/Parsing.py                              |   1 +
 BaseTools/Source/Python/Common/RangeExpression.py                      |  32 ++--
 BaseTools/Source/Python/Common/String.py                               |   7 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                 |  15 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py                   |   3 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                          |  23 +--
 BaseTools/Source/Python/Ecc/CParser.py                                 | 173 ++++++++++----------
 BaseTools/Source/Python/Ecc/Check.py                                   |   1 +
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                   |  69 ++++----
 BaseTools/Source/Python/Ecc/Configuration.py                           |   5 +-
 BaseTools/Source/Python/Ecc/Exception.py                               |   3 +-
 BaseTools/Source/Python/Ecc/MetaDataParser.py                          |   3 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py         |   5 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py        |  41 ++---
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                         |   9 +-
 BaseTools/Source/Python/Ecc/c.py                                       |  15 +-
 BaseTools/Source/Python/Eot/CParser.py                                 | 173 ++++++++++----------
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                   |  61 +++----
 BaseTools/Source/Python/Eot/FvImage.py                                 |  17 +-
 BaseTools/Source/Python/Eot/InfParserLite.py                           |   8 +-
 BaseTools/Source/Python/Eot/Parser.py                                  |   2 +-
 BaseTools/Source/Python/Eot/c.py                                       |  23 +--
 BaseTools/Source/Python/GenFds/AprioriSection.py                       |   7 +-
 BaseTools/Source/Python/GenFds/Capsule.py                              |  10 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                          |   6 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                           |   6 +-
 BaseTools/Source/Python/GenFds/Fd.py                                   |  12 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                            |  43 ++---
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                     |   5 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                      |  16 +-
 BaseTools/Source/Python/GenFds/Fv.py                                   |  13 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                       |   8 +-
 BaseTools/Source/Python/GenFds/GenFds.py                               |  20 ++-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |  10 +-
 BaseTools/Source/Python/GenFds/OptionRom.py                            |   3 -
 BaseTools/Source/Python/GenFds/Region.py                               |  14 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py           |   9 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                 |   1 +
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  32 ++--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |  30 ++--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |  36 ++--
 BaseTools/Source/Python/TargetTool/TargetTool.py                       |  39 ++---
 BaseTools/Source/Python/Trim/Trim.py                                   |  25 +--
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                    |  12 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py           |   4 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py                           |   2 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                              |   6 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                        |  12 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py                  |  15 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                  |  42 ++---
 BaseTools/Source/Python/UPT/InstallPkg.py                              |   2 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                             |   2 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                  |   5 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py              |  11 +-
 BaseTools/Source/Python/UPT/Library/Misc.py                            |  11 +-
 BaseTools/Source/Python/UPT/Library/ParserValidate.py                  |   2 +-
 BaseTools/Source/Python/UPT/Library/Parsing.py                         |   3 +-
 BaseTools/Source/Python/UPT/Library/String.py                          |   5 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                  |  20 ++-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                 |   4 +-
 BaseTools/Source/Python/UPT/MkPkg.py                                   |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py           |   6 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py           |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py             |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py   |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                   |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py         |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py              |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py              |   4 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py         |   2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py           |   3 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py    |   4 +-
 BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                    |   1 +
 BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                 |   3 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py              |  57 +++----
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py              |   3 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py          |   3 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                              |   2 +-
 BaseTools/Source/Python/UPT/RmPkg.py                                   |   2 +-
 BaseTools/Source/Python/UPT/TestInstall.py                             |   4 +-
 BaseTools/Source/Python/UPT/UPT.py                                     |   9 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                  |   5 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py           |  10 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                           |   2 +-
 BaseTools/Source/Python/UPT/Xml/IniToXml.py                            |   1 +
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                           |  25 +--
 BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                       |   3 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py                  |   2 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py                      |  14 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                      | 142 ++++++++--------
 BaseTools/Source/Python/Workspace/InfBuildData.py                      |   3 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                    |  74 +++++----
 BaseTools/Source/Python/Workspace/MetaFileTable.py                     |  10 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                   |   2 +-
 BaseTools/Source/Python/build/BuildReport.py                           |  17 +-
 BaseTools/Source/Python/build/build.py                                 |  35 ++--
 BaseTools/Tests/CheckPythonSyntax.py                                   |   2 +-
 BaseTools/Tests/TestTools.py                                           |  13 +-
 BaseTools/Tests/TianoCompress.py                                       |   6 +-
 BaseTools/gcc/mingw-gcc-build.py                                       | 112 ++++++-------
 136 files changed, 1559 insertions(+), 1477 deletions(-)

--
2.15.1



^ permalink raw reply	[flat|nested] 18+ messages in thread

* Re: [PATCH 00/15] BaseTools: One step toward python3
  2018-01-25 13:37 ` [PATCH 00/15] BaseTools: One step toward python3 Zhu, Yonghong
@ 2018-01-26  2:01   ` Gary Lin
  0 siblings, 0 replies; 18+ messages in thread
From: Gary Lin @ 2018-01-26  2:01 UTC (permalink / raw)
  To: Zhu, Yonghong; +Cc: edk2-devel@lists.01.org, Gao, Liming

On Thu, Jan 25, 2018 at 01:37:39PM +0000, Zhu, Yonghong wrote:
> Hi Gary,
> 
> Thanks for your patches. I am still evaluating these changes and doing some verification on them.
> I still need a few more days to give you comments. Thanks.
I forgot to mention my branch in the cover letter.

https://github.com/lcp/edk2/tree/python3-futurize

It would be easier to review/apply patches from a git branch.

Thanks,

Gary Lin

> 
> Best Regards,
> Zhu Yonghong
> 
> 
> -----Original Message-----
> From: Gary Lin [mailto:glin@suse.com] 
> Sent: Friday, January 19, 2018 12:43 PM
> To: edk2-devel@lists.01.org
> Cc: Zhu, Yonghong <yonghong.zhu@intel.com>; Gao, Liming <liming.gao@intel.com>
> Subject: [PATCH 00/15] BaseTools: One step toward python3
> 
> Since python2 will be EOF in 2020, we start to evaluate the impact of the python2 removal. As expected, OMVF building failed the test. It's actually a task noted in the wiki page:
> 
> https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support
> 
> Maybe it's time to convert the python scripts gradully.
> 
> This patchset doesn't make the python scripts in BaseTools compatible with python3 immediately. It aims to do the trivial and safe conversion and replacement to make some statements compatible with both python2 and python3, so we can deal with the difficult cases later.
> 
> With the help of "futurize" from python-future, it's easier to refactor the statements. This patchset is basically equivalent to "futurize -1"
> plus "StringIO.StringIO => io.BytesIO" and minus "fix_absolute_import".
> The reason to skip "fix_absolute_import" is that python2 failed to find some modules after converting to absolute import, and it might take time to figure out a proper fix.
> 
> For the "io.BytesIO" change, it MIGHT introduce slow down to the build time since io.BytesIO is slower than StringIO.StringIO in python2(*).
> For a quick test, I built OVMF with the following command based on
> 8ab0bd2397c9d3922e0c7dbb1aa6f7e08799079f:
> 
> $ rm -rf Build && make -C BaseTools/ clean $ time ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
>                           -D NETWORK_IP6_ENABLE \
>                           -D HTTP_BOOT_ENABLE \
>                           -D TLS_ENABLE
> 
> Before io.BytesIO:
> 
>   Build total time: 00:03:56
>   real    4m22.991s
>   user    3m55.874s
>   sys     0m27.250s
> 
> After io.BytesIO:
> 
>   Build total time: 00:03:57
>   real    4m23.953s
>   user    3m57.526s
>   sys     0m27.192s
> 
> The difference is only 1 second, and I would say the impact is subtle. 
> 
> The next step will be fixing relative import and maybe applying more futurize fixes. We won't get there soon but at least we are moving... 
> 
> (*) https://stackoverflow.com/questions/37462075/confusing-about-stringio-cstringio-and-byteio
> 
> Contributed-under: TianoCore Contribution Agreement 1.1
> Cc: Yonghong Zhu <yonghong.zhu@intel.com>
> Cc: Liming Gao <liming.gao@intel.com>
> Signed-off-by: Gary Lin <glin@suse.com>
> 
> Gary Lin (15):
>   BaseTools: Refactor python except statements
>   BaseTools: Refactor python print statements
>   BaseTools: Remove the old python "not-equal"
>   BaseTools: Use the python3-range functions
>   BaseTools: Remove tuple parameter in python scripts
>   BaseTools: Remove the deprecated hash_key()
>   BaseTools: Import reduce() from functools
>   BaseTools: Replace StandardError with Expression
>   BaseTools: Remove types.TypeType
>   BaseTools: Refactor python raise statement
>   BaseTools: Adjust the spaces around commas and colons
>   BaseTools: Migrate to the new octal literal
>   BaseTools: Unify long int and int in python scripts
>   BaseTools: Adjust old python2 idioms
>   BaseTools: Replace StringIO.StringIO with io.BytesIO
> 
>  BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                      |   5 +-
>  BaseTools/Scripts/BinToPcd.py                                          |  46 +++---
>  BaseTools/Scripts/ConvertMasmToNasm.py                                 |   1 +
>  BaseTools/Scripts/ConvertUni.py                                        |   5 -
>  BaseTools/Scripts/MemoryProfileSymbolGen.py                            |  22 +--
>  BaseTools/Scripts/PatchCheck.py                                        |   7 +-
>  BaseTools/Scripts/RunMakefile.py                                       |   2 +-
>  BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                        |  20 +--
>  BaseTools/Scripts/UpdateBuildVersions.py                               |  18 +-
>  BaseTools/Source/Python/AutoGen/AutoGen.py                             |  91 +++++-----
>  BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  38 +++--
>  BaseTools/Source/Python/AutoGen/GenC.py                                |   5 +-
>  BaseTools/Source/Python/AutoGen/GenDepex.py                            |   8 +-
>  BaseTools/Source/Python/AutoGen/GenMake.py                             |   8 +-
>  BaseTools/Source/Python/AutoGen/GenPcdDb.py                            | 142 ++++++++--------
>  BaseTools/Source/Python/AutoGen/GenVar.py                              | 165 +++++++++----------
>  BaseTools/Source/Python/AutoGen/IdfClassObject.py                      |   1 -
>  BaseTools/Source/Python/AutoGen/InfSectionParser.py                    |   1 +
>  BaseTools/Source/Python/AutoGen/StrGather.py                           |   5 +-
>  BaseTools/Source/Python/AutoGen/UniClassObject.py                      |  18 +-
>  BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py             |  10 +-
>  BaseTools/Source/Python/BPDG/BPDG.py                                   |   3 +-
>  BaseTools/Source/Python/BPDG/GenVpd.py                                 |  25 +--
>  BaseTools/Source/Python/Common/DataType.py                             |   4 +-
>  BaseTools/Source/Python/Common/DecClassObject.py                       |  39 ++---
>  BaseTools/Source/Python/Common/Dictionary.py                           |   9 +-
>  BaseTools/Source/Python/Common/DscClassObject.py                       |  70 ++++----
>  BaseTools/Source/Python/Common/EdkIIWorkspace.py                       |  25 +--
>  BaseTools/Source/Python/Common/EdkIIWorkspaceBuild.py                  | 133 +++++++--------
>  BaseTools/Source/Python/Common/Expression.py                           |  81 ++++-----
>  BaseTools/Source/Python/Common/FdfClassObject.py                       |   1 +
>  BaseTools/Source/Python/Common/FdfParserLite.py                        |  47 +++---
>  BaseTools/Source/Python/Common/InfClassObject.py                       | 113 ++++++-------
>  BaseTools/Source/Python/Common/LongFilePathOs.py                       |   2 +-
>  BaseTools/Source/Python/Common/MigrationUtilities.py                   |   1 +
>  BaseTools/Source/Python/Common/Misc.py                                 |  70 ++++----
>  BaseTools/Source/Python/Common/Parsing.py                              |   1 +
>  BaseTools/Source/Python/Common/RangeExpression.py                      |  32 ++--
>  BaseTools/Source/Python/Common/String.py                               |   7 +-
>  BaseTools/Source/Python/Common/TargetTxtClassObject.py                 |  15 +-
>  BaseTools/Source/Python/Common/ToolDefClassObject.py                   |   3 +-
>  BaseTools/Source/Python/Common/VpdInfoFile.py                          |  23 +--
>  BaseTools/Source/Python/Ecc/CParser.py                                 | 173 ++++++++++----------
>  BaseTools/Source/Python/Ecc/Check.py                                   |   1 +
>  BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                   |  69 ++++----
>  BaseTools/Source/Python/Ecc/Configuration.py                           |   5 +-
>  BaseTools/Source/Python/Ecc/Exception.py                               |   3 +-
>  BaseTools/Source/Python/Ecc/MetaDataParser.py                          |   3 +-
>  BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py         |   5 +-
>  BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py        |  41 ++---
>  BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                         |   9 +-
>  BaseTools/Source/Python/Ecc/c.py                                       |  15 +-
>  BaseTools/Source/Python/Eot/CParser.py                                 | 173 ++++++++++----------
>  BaseTools/Source/Python/Eot/CodeFragmentCollector.py                   |  61 +++----
>  BaseTools/Source/Python/Eot/FvImage.py                                 |  17 +-
>  BaseTools/Source/Python/Eot/InfParserLite.py                           |   8 +-
>  BaseTools/Source/Python/Eot/Parser.py                                  |   2 +-
>  BaseTools/Source/Python/Eot/c.py                                       |  23 +--
>  BaseTools/Source/Python/GenFds/AprioriSection.py                       |   7 +-
>  BaseTools/Source/Python/GenFds/Capsule.py                              |  10 +-
>  BaseTools/Source/Python/GenFds/CapsuleData.py                          |   6 +-
>  BaseTools/Source/Python/GenFds/EfiSection.py                           |   6 +-
>  BaseTools/Source/Python/GenFds/Fd.py                                   |  12 +-
>  BaseTools/Source/Python/GenFds/FdfParser.py                            |  43 ++---
>  BaseTools/Source/Python/GenFds/FfsFileStatement.py                     |   5 +-
>  BaseTools/Source/Python/GenFds/FfsInfStatement.py                      |  16 +-
>  BaseTools/Source/Python/GenFds/Fv.py                                   |  13 +-
>  BaseTools/Source/Python/GenFds/FvImageSection.py                       |   8 +-
>  BaseTools/Source/Python/GenFds/GenFds.py                               |  20 ++-
>  BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |  10 +-
>  BaseTools/Source/Python/GenFds/OptionRom.py                            |   3 -
>  BaseTools/Source/Python/GenFds/Region.py                               |  14 +-
>  BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py           |   9 +-
>  BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                 |   1 +
>  BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  32 ++--
>  BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |  30 ++--
>  BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |  36 ++--
>  BaseTools/Source/Python/TargetTool/TargetTool.py                       |  39 ++---
>  BaseTools/Source/Python/Trim/Trim.py                                   |  25 +--
>  BaseTools/Source/Python/UPT/Core/DependencyRules.py                    |  12 +-
>  BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py           |   4 +-
>  BaseTools/Source/Python/UPT/Core/FileHook.py                           |   2 +-
>  BaseTools/Source/Python/UPT/Core/IpiDb.py                              |   6 +-
>  BaseTools/Source/Python/UPT/Core/PackageFile.py                        |  12 +-
>  BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py                  |  15 +-
>  BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                  |  42 ++---
>  BaseTools/Source/Python/UPT/InstallPkg.py                              |   2 +-
>  BaseTools/Source/Python/UPT/InventoryWs.py                             |   2 +-
>  BaseTools/Source/Python/UPT/Library/CommentParsing.py                  |   5 +-
>  BaseTools/Source/Python/UPT/Library/ExpressionValidate.py              |  11 +-
>  BaseTools/Source/Python/UPT/Library/Misc.py                            |  11 +-
>  BaseTools/Source/Python/UPT/Library/ParserValidate.py                  |   2 +-
>  BaseTools/Source/Python/UPT/Library/Parsing.py                         |   3 +-
>  BaseTools/Source/Python/UPT/Library/String.py                          |   5 +-
>  BaseTools/Source/Python/UPT/Library/UniClassObject.py                  |  20 ++-
>  BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                 |   4 +-
>  BaseTools/Source/Python/UPT/MkPkg.py                                   |   2 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py           |   6 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py           |   2 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py             |   4 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py   |   2 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                   |   4 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py         |   4 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py              |   4 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py              |   4 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py         |   2 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py           |   3 +-
>  BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py    |   4 +-
>  BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                    |   1 +
>  BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                 |   3 +-
>  BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py              |  57 +++----
>  BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py              |   3 +-
>  BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py          |   3 +-
>  BaseTools/Source/Python/UPT/ReplacePkg.py                              |   2 +-
>  BaseTools/Source/Python/UPT/RmPkg.py                                   |   2 +-
>  BaseTools/Source/Python/UPT/TestInstall.py                             |   4 +-
>  BaseTools/Source/Python/UPT/UPT.py                                     |   9 +-
>  BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                  |   5 +-
>  BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py           |  10 +-
>  BaseTools/Source/Python/UPT/Xml/CommonXml.py                           |   2 +-
>  BaseTools/Source/Python/UPT/Xml/IniToXml.py                            |   1 +
>  BaseTools/Source/Python/UPT/Xml/XmlParser.py                           |  25 +--
>  BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                       |   3 +-
>  BaseTools/Source/Python/Workspace/BuildClassObject.py                  |   2 +-
>  BaseTools/Source/Python/Workspace/DecBuildData.py                      |  14 +-
>  BaseTools/Source/Python/Workspace/DscBuildData.py                      | 142 ++++++++--------
>  BaseTools/Source/Python/Workspace/InfBuildData.py                      |   3 +-
>  BaseTools/Source/Python/Workspace/MetaFileParser.py                    |  74 +++++----
>  BaseTools/Source/Python/Workspace/MetaFileTable.py                     |  10 +-
>  BaseTools/Source/Python/Workspace/WorkspaceCommon.py                   |   2 +-
>  BaseTools/Source/Python/build/BuildReport.py                           |  17 +-
>  BaseTools/Source/Python/build/build.py                                 |  35 ++--
>  BaseTools/Tests/CheckPythonSyntax.py                                   |   2 +-
>  BaseTools/Tests/TestTools.py                                           |  13 +-
>  BaseTools/Tests/TianoCompress.py                                       |   6 +-
>  BaseTools/gcc/mingw-gcc-build.py                                       | 112 ++++++-------
>  136 files changed, 1559 insertions(+), 1477 deletions(-)
> 
> --
> 2.15.1
> 


Thread overview: 18+ messages
2018-01-19  4:43 [PATCH 00/15] BaseTools: One step toward python3 Gary Lin
2018-01-19  4:43 ` [PATCH 01/15] BaseTools: Refactor python except statements Gary Lin
2018-01-19  4:43 ` [PATCH 02/15] BaseTools: Refactor python print statements Gary Lin
2018-01-19  4:43 ` [PATCH 03/15] BaseTools: Remove the old python "not-equal" Gary Lin
2018-01-19  4:43 ` [PATCH 04/15] BaseTools: Use the python3-range functions Gary Lin
2018-01-19  4:43 ` [PATCH 05/15] BaseTools: Remove tuple parameter in python scripts Gary Lin
2018-01-19  4:43 ` [PATCH 06/15] BaseTools: Remove the deprecated hash_key() Gary Lin
2018-01-19  4:43 ` [PATCH 07/15] BaseTools: Import reduce() from functools Gary Lin
2018-01-19  4:43 ` [PATCH 08/15] BaseTools: Replace StandardError with Expression Gary Lin
2018-01-19  4:43 ` [PATCH 09/15] BaseTools: Remove types.TypeType Gary Lin
2018-01-19  4:43 ` [PATCH 10/15] BaseTools: Refactor python raise statement Gary Lin
2018-01-19  4:43 ` [PATCH 11/15] BaseTools: Adjust the spaces around commas and colons Gary Lin
2018-01-19  4:43 ` [PATCH 12/15] BaseTools: Migrate to the new octal literal Gary Lin
2018-01-19  4:43 ` [PATCH 13/15] BaseTools: Unify long int and int in python scripts Gary Lin
2018-01-19  4:43 ` [PATCH 14/15] BaseTools: Adjust old python2 idioms Gary Lin
2018-01-19  4:43 ` [PATCH 15/15] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
2018-01-25 13:37 ` [PATCH 00/15] BaseTools: One step toward python3 Zhu, Yonghong
2018-01-26  2:01   ` Gary Lin
