From: "Feng, Bob C" <bob.c.feng@intel.com>
To: edk2-devel@lists.01.org
Cc: Bob Feng <bob.c.feng@intel.com>,
	Liming Gao <liming.gao@intel.com>,
	Yonghong Zhu <yonghong.zhu@intel.com>,
	"Zhiju . Fan" <zhijux.fan@intel.com>
Subject: [Patch 31/33] BaseTools: Handle the bytes and str difference
Date: Fri, 25 Jan 2019 12:56:24 +0800
Message-ID: <20190125045626.14700-32-bob.c.feng@intel.com>
In-Reply-To: <20190125045626.14700-1-bob.c.feng@intel.com>

Handle the differences between bytes and str, remove unicode(),
and correct the open() file mode parameters.
Use utcfromtimestamp instead of fromtimestamp.
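
For context, a minimal sketch of the Python 3 behaviors this patch
adapts to (illustrative only; the file name and string below are
hypothetical, not taken from the patch):

    import hashlib
    from io import BytesIO

    m = hashlib.md5()
    with open('Example.dec', 'rb') as f:   # 'rb' so read() returns bytes
        m.update(f.read())                 # update() rejects str in Python 3
    m.update('PlatformHash'.encode('utf-8'))  # str must be encoded first

    buf = BytesIO()                        # BytesIO('') raises TypeError in Python 3
    buf.write(b'\xff' * 4)                 # only bytes may be written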

Cc: Bob Feng <bob.c.feng@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Zhiju.Fan <zhijux.fan@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                             | 42 ++++++++++++++++++++----------------------
 BaseTools/Source/Python/AutoGen/GenC.py                                |  6 +++---
 BaseTools/Source/Python/AutoGen/GenMake.py                             | 14 +++++++++-----
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                            | 29 +++++++++++++++--------------
 BaseTools/Source/Python/AutoGen/GenVar.py                              | 34 ++++++++++++++++++----------------
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                    |  2 +-
 BaseTools/Source/Python/AutoGen/StrGather.py                           |  5 ++++-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                      |  4 ++--
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py             |  2 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                 |  8 ++++----
 BaseTools/Source/Python/Common/LongFilePathOs.py                       |  3 +--
 BaseTools/Source/Python/Common/LongFilePathSupport.py                  | 12 ------------
 BaseTools/Source/Python/Common/Misc.py                                 | 48 ++++++++++++++++++++++++++++++++----------------
 BaseTools/Source/Python/Common/StringUtils.py                          | 10 ++--------
 BaseTools/Source/Python/Common/VpdInfoFile.py                          | 10 +++++-----
 BaseTools/Source/Python/GenFds/AprioriSection.py                       |  2 +-
 BaseTools/Source/Python/GenFds/Capsule.py                              | 15 +++++++--------
 BaseTools/Source/Python/GenFds/CapsuleData.py                          |  2 +-
 BaseTools/Source/Python/GenFds/Fd.py                                   |  4 ++--
 BaseTools/Source/Python/GenFds/FdfParser.py                            |  4 ++--
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                     | 16 ++++++++--------
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                      | 12 +++++-------
 BaseTools/Source/Python/GenFds/Fv.py                                   | 48 ++++++++++++++++++++++++------------------------
 BaseTools/Source/Python/GenFds/FvImageSection.py                       |  2 +-
 BaseTools/Source/Python/GenFds/GenFds.py                               | 22 +++++++++++-----------
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                 |  4 ++--
 BaseTools/Source/Python/GenFds/Region.py                               |  6 +++---
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 14 +++++++-------
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         |  7 ++++---
 BaseTools/Source/Python/Trim/Trim.py                                   | 18 ++++++++----------
 BaseTools/Source/Python/UPT/Library/StringUtils.py                     |  4 +---
 BaseTools/Source/Python/Workspace/BuildClassObject.py                  |  4 ++--
 BaseTools/Source/Python/Workspace/DscBuildData.py                      | 13 ++++++++++---
 BaseTools/Source/Python/Workspace/MetaFileParser.py                    |  4 ++--
 BaseTools/Source/Python/build/BuildReport.py                           | 15 +++++++--------
 BaseTools/Source/Python/build/build.py                                 | 44 +++++++++++++++++++++-----------------------
 37 files changed, 247 insertions(+), 244 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index baa1842667..a95d2c710e 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -724,15 +724,15 @@ class WorkspaceAutoGen(AutoGen):
         if GlobalData.gUseHashCache:
             m = hashlib.md5()
             for files in AllWorkSpaceMetaFiles:
                 if files.endswith('.dec'):
                     continue
-                f = open(files, 'r')
+                f = open(files, 'rb')
                 Content = f.read()
                 f.close()
                 m.update(Content)
-            SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), True)
+            SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), False)
             GlobalData.gPlatformHash = m.hexdigest()
 
         #
         # Write metafile list to build directory
         #
@@ -753,25 +753,25 @@ class WorkspaceAutoGen(AutoGen):
         PkgDir = os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageName)
         CreateDirectory(PkgDir)
         HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
         m = hashlib.md5()
         # Get .dec file's hash value
-        f = open(Pkg.MetaFile.Path, 'r')
+        f = open(Pkg.MetaFile.Path, 'rb')
         Content = f.read()
         f.close()
         m.update(Content)
         # Get include files hash value
         if Pkg.Includes:
             for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
                 for Root, Dirs, Files in os.walk(str(inc)):
                     for File in sorted(Files):
                         File_Path = os.path.join(Root, File)
-                        f = open(File_Path, 'r')
+                        f = open(File_Path, 'rb')
                         Content = f.read()
                         f.close()
                         m.update(Content)
-        SaveFileOnChange(HashFile, m.hexdigest(), True)
+        SaveFileOnChange(HashFile, m.hexdigest(), False)
         GlobalData.gPackageHash[Pkg.Arch][Pkg.PackageName] = m.hexdigest()
 
     def _GetMetaFiles(self, Target, Toolchain, Arch):
         AllWorkSpaceMetaFiles = set()
         #
@@ -1734,11 +1734,11 @@ class PlatformAutoGen(AutoGen):
         self._DynamicPcdList.extend(list(OtherPcdArray))
         allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
         for pcd in self._DynamicPcdList:
             if len(pcd.SkuInfoList) == 1:
                 for (SkuName, SkuId) in allskuset:
-                    if type(SkuId) in (str, unicode) and eval(SkuId) == 0 or SkuId == 0:
+                    if isinstance(SkuId, str) and eval(SkuId) == 0 or SkuId == 0:
                         continue
                     pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
                     pcd.SkuInfoList[SkuName].SkuId = SkuId
                     pcd.SkuInfoList[SkuName].SkuIdName = SkuName
 
@@ -1904,11 +1904,11 @@ class PlatformAutoGen(AutoGen):
                             MakeFlags = Value
                     else:
                         ToolsDef += "%s_%s = %s\n" % (Tool, Attr, Value)
             ToolsDef += "\n"
 
-        SaveFileOnChange(self.ToolDefinitionFile, ToolsDef)
+        SaveFileOnChange(self.ToolDefinitionFile, ToolsDef, False)
         for DllPath in DllPathList:
             os.environ["PATH"] = DllPath + os.pathsep + os.environ["PATH"]
         os.environ["MAKE_FLAGS"] = MakeFlags
 
         return RetVal
@@ -3301,22 +3301,22 @@ class ModuleAutoGen(AutoGen):
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if str(StringH) != "":
             AutoFile = PathClass(gAutoGenStringFileName % {"module_name":self.Name}, self.DebugDir)
             RetVal[AutoFile] = str(StringH)
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
-        if UniStringBinBuffer is not None and UniStringBinBuffer.getvalue() != "":
+        if UniStringBinBuffer is not None and UniStringBinBuffer.getvalue() != b"":
             AutoFile = PathClass(gAutoGenStringFormFileName % {"module_name":self.Name}, self.OutputDir)
             RetVal[AutoFile] = UniStringBinBuffer.getvalue()
             AutoFile.IsBinary = True
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if UniStringBinBuffer is not None:
             UniStringBinBuffer.close()
         if str(StringIdf) != "":
             AutoFile = PathClass(gAutoGenImageDefFileName % {"module_name":self.Name}, self.DebugDir)
             RetVal[AutoFile] = str(StringIdf)
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
-        if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue() != "":
+        if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue() != b"":
             AutoFile = PathClass(gAutoGenIdfFileName % {"module_name":self.Name}, self.OutputDir)
             RetVal[AutoFile] = IdfGenBinBuffer.getvalue()
             AutoFile.IsBinary = True
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if IdfGenBinBuffer is not None:
@@ -3530,33 +3530,31 @@ class ModuleAutoGen(AutoGen):
             fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
 
         # Use a instance of BytesIO to cache data
-        fStringIO = BytesIO('')
+        fStringIO = BytesIO()
 
         for Item in VfrUniOffsetList:
             if (Item[0].find("Strings") != -1):
                 #
                 # UNI offset in image.
                 # GUID + Offset
                 # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
                 #
-                UniGuid = [0xe0, 0xc5, 0x13, 0x89, 0xf6, 0x33, 0x86, 0x4d, 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66]
-                UniGuid = [chr(ItemGuid) for ItemGuid in UniGuid]
-                fStringIO.write(''.join(UniGuid))
+                UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
+                fStringIO.write(UniGuid)
                 UniValue = pack ('Q', int (Item[1], 16))
                 fStringIO.write (UniValue)
             else:
                 #
                 # VFR binary offset in image.
                 # GUID + Offset
                 # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
                 #
-                VfrGuid = [0xb4, 0x7c, 0xbc, 0xd0, 0x47, 0x6a, 0x5f, 0x49, 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2]
-                VfrGuid = [chr(ItemGuid) for ItemGuid in VfrGuid]
-                fStringIO.write(''.join(VfrGuid))
+                VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
+                fStringIO.write(VfrGuid)
                 VfrValue = pack ('Q', int (Item[1], 16))
                 fStringIO.write (VfrValue)
         #
         # write data into file.
         #
@@ -4093,44 +4091,44 @@ class ModuleAutoGen(AutoGen):
     def GenModuleHash(self):
         if self.Arch not in GlobalData.gModuleHash:
             GlobalData.gModuleHash[self.Arch] = {}
         m = hashlib.md5()
         # Add Platform level hash
-        m.update(GlobalData.gPlatformHash)
+        m.update(GlobalData.gPlatformHash.encode('utf-8'))
         # Add Package level hash
         if self.DependentPackageList:
             for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
                 if Pkg.PackageName in GlobalData.gPackageHash[self.Arch]:
-                    m.update(GlobalData.gPackageHash[self.Arch][Pkg.PackageName])
+                    m.update(GlobalData.gPackageHash[self.Arch][Pkg.PackageName].encode('utf-8'))
 
         # Add Library hash
         if self.LibraryAutoGenList:
             for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
                 if Lib.Name not in GlobalData.gModuleHash[self.Arch]:
                     Lib.GenModuleHash()
-                m.update(GlobalData.gModuleHash[self.Arch][Lib.Name])
+                m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].encode('utf-8'))
 
         # Add Module self
-        f = open(str(self.MetaFile), 'r')
+        f = open(str(self.MetaFile), 'rb')
         Content = f.read()
         f.close()
         m.update(Content)
         # Add Module's source files
         if self.SourceFileList:
             for File in sorted(self.SourceFileList, key=lambda x: str(x)):
-                f = open(str(File), 'r')
+                f = open(str(File), 'rb')
                 Content = f.read()
                 f.close()
                 m.update(Content)
 
         ModuleHashFile = path.join(self.BuildDir, self.Name + ".hash")
         if self.Name not in GlobalData.gModuleHash[self.Arch]:
             GlobalData.gModuleHash[self.Arch][self.Name] = m.hexdigest()
         if GlobalData.gBinCacheSource:
             if self.AttemptModuleCacheCopy():
                 return False
-        return SaveFileOnChange(ModuleHashFile, m.hexdigest(), True)
+        return SaveFileOnChange(ModuleHashFile, m.hexdigest(), False)
 
     ## Decide whether we can skip the ModuleAutoGen process
     def CanSkipbyHash(self):
         if GlobalData.gUseHashCache:
             return not self.GenModuleHash()
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 700c94b3a7..9700bf8527 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1780,11 +1780,11 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                                 TempBuffer = pack('B', EFI_HII_IIBT_IMAGE_PNG)
                                 TempBuffer += pack('I', len(Buffer))
                                 TempBuffer += Buffer
                             elif File.Ext.upper() == '.JPG':
                                 ImageType, = struct.unpack('4s', Buffer[6:10])
-                                if ImageType != 'JFIF':
+                                if ImageType != b'JFIF':
                                     EdkLogger.error("build", FILE_TYPE_MISMATCH, "The file %s is not a standard JPG file." % File.Path)
                                 TempBuffer = pack('B', EFI_HII_IIBT_IMAGE_JPEG)
                                 TempBuffer += pack('I', len(Buffer))
                                 TempBuffer += Buffer
                             elif File.Ext.upper() == '.BMP':
@@ -1880,11 +1880,11 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
 #   UINT8    BlockBody[];
 # } EFI_HII_IMAGE_BLOCK;
 
 def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
     ImageType, = struct.unpack('2s', Buffer[0:2])
-    if ImageType!= 'BM': # BMP file type is 'BM'
+    if ImageType!= b'BM': # BMP file type is 'BM'
         EdkLogger.error("build", FILE_TYPE_MISMATCH, "The file %s is not a standard BMP file." % File.Path)
     BMP_IMAGE_HEADER = collections.namedtuple('BMP_IMAGE_HEADER', ['bfSize', 'bfReserved1', 'bfReserved2', 'bfOffBits', 'biSize', 'biWidth', 'biHeight', 'biPlanes', 'biBitCount', 'biCompression', 'biSizeImage', 'biXPelsPerMeter', 'biYPelsPerMeter', 'biClrUsed', 'biClrImportant'])
     BMP_IMAGE_HEADER_STRUCT = struct.Struct('IHHIIIIHHIIIIII')
     BmpHeader = BMP_IMAGE_HEADER._make(BMP_IMAGE_HEADER_STRUCT.unpack_from(Buffer[2:]))
     #
@@ -1952,11 +1952,11 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
     if PaletteBuffer and len(PaletteBuffer) > 1:
         PaletteTemp = pack('x')
         for Index in range(0, len(PaletteBuffer)):
             if Index % 4 == 3:
                 continue
-            PaletteTemp += PaletteBuffer[Index]
+            PaletteTemp += PaletteBuffer[Index:Index+1]
         PaletteBuffer = PaletteTemp[1:]
     return ImageBuffer, PaletteBuffer
 
 ## Create common code
 #
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index c42053eb4c..dc4cd688f4 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1036,21 +1036,25 @@ cleanlib:
             CurrentFileDependencyList = []
             if F in DepDb:
                 CurrentFileDependencyList = DepDb[F]
             else:
                 try:
-                    Fd = open(F.Path, 'r')
+                    Fd = open(F.Path, 'rb')
+                    FileContent = Fd.read()
+                    Fd.close()
                 except BaseException as X:
                     EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
-
-                FileContent = Fd.read()
-                Fd.close()
                 if len(FileContent) == 0:
                     continue
 
                 if FileContent[0] == 0xff or FileContent[0] == 0xfe:
-                    FileContent = unicode(FileContent, "utf-16")
+                    FileContent = FileContent.decode('utf-16')
+                else:
+                    try:
+                        FileContent = str(FileContent)
+                    except:
+                        pass
                 IncludedFileList = gIncludePattern.findall(FileContent)
 
                 for Inc in IncludedFileList:
                     Inc = Inc.strip()
                     # if there's macro used to reference header file, expand it
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 2cb1745823..cbf7a39dd5 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -293,11 +293,11 @@ class DbItemList:
             GuidString = GuidStructureStringToGuidString(GuidStructureValue)
             return PackGUID(GuidString.split('-'))
 
         PackStr = PACK_CODE_BY_SIZE[self.ItemSize]
 
-        Buffer = ''
+        Buffer = bytearray()
         for Datas in self.RawDataList:
             if type(Datas) in (list, tuple):
                 for Data in Datas:
                     if PackStr:
                         Buffer += pack(PackStr, GetIntegerValue(Data))
@@ -318,11 +318,11 @@ class DbItemList:
 class DbExMapTblItemList (DbItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
 
     def PackData(self):
-        Buffer = ''
+        Buffer = bytearray()
         PackStr = "=LHH"
         for Datas in self.RawDataList:
             Buffer += pack(PackStr,
                            GetIntegerValue(Datas[0]),
                            GetIntegerValue(Datas[1]),
@@ -367,11 +367,11 @@ class DbComItemList (DbItemList):
         return self.ListSize
 
     def PackData(self):
         PackStr = PACK_CODE_BY_SIZE[self.ItemSize]
 
-        Buffer = ''
+        Buffer = bytearray()
         for DataList in self.RawDataList:
             for Data in DataList:
                 if type(Data) in (list, tuple):
                     for SingleData in Data:
                         Buffer += pack(PackStr, GetIntegerValue(SingleData))
@@ -388,11 +388,11 @@ class DbVariableTableItemList (DbComItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbComItemList.__init__(self, ItemSize, DataList, RawDataList)
 
     def PackData(self):
         PackStr = "=LLHHLHH"
-        Buffer = ''
+        Buffer = bytearray()
         for DataList in self.RawDataList:
             for Data in DataList:
                 Buffer += pack(PackStr,
                                GetIntegerValue(Data[0]),
                                GetIntegerValue(Data[1]),
@@ -449,11 +449,11 @@ class DbSkuHeadTableItemList (DbItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
 
     def PackData(self):
         PackStr = "=LL"
-        Buffer = ''
+        Buffer = bytearray()
         for Data in self.RawDataList:
             Buffer += pack(PackStr,
                            GetIntegerValue(Data[0]),
                            GetIntegerValue(Data[1]))
         return Buffer
@@ -471,11 +471,11 @@ class DbSizeTableItemList (DbItemList):
         for Data in self.RawDataList:
             length += (1 + len(Data[1]))
         return length * self.ItemSize
     def PackData(self):
         PackStr = "=H"
-        Buffer = ''
+        Buffer = bytearray()
         for Data in self.RawDataList:
             Buffer += pack(PackStr,
                            GetIntegerValue(Data[0]))
             for subData in Data[1]:
                 Buffer += pack(PackStr,
@@ -851,12 +851,13 @@ def BuildExDataBase(Dict):
     Buffer += b
 
     Index = 0
     for Item in DbItemTotal:
         Index +=1
-        b = Item.PackData()
-        Buffer += b
+        packdata = Item.PackData()
+        for i in range(len(packdata)):
+            Buffer += packdata[i:i + 1]
         if Index == InitTableNum:
             if len(Buffer) % 8:
                 for num in range(8 - len(Buffer) % 8):
                     b = pack('=B', Pad)
                     Buffer += b
@@ -919,13 +920,13 @@ def CreatePcdDataBase(PcdDBData):
             databasebuff = databasebuff[:-1] + pack("=B", item[1])
     totallen = len(databasebuff)
     totallenbuff = pack("=L", totallen)
     newbuffer = databasebuff[:32]
     for i in range(4):
-        newbuffer += totallenbuff[i]
+        newbuffer += totallenbuff[i:i+1]
     for i in range(36, totallen):
-        newbuffer += databasebuff[i]
+        newbuffer += databasebuff[i:i+1]
 
     return newbuffer
 
 def CreateVarCheckBin(VarCheckTab):
     return VarCheckTab[(TAB_DEFAULT, "0")]
@@ -963,12 +964,12 @@ def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
     VarCheckTableData = {}
     if DynamicPcdSet_Sku:
         for skuname, skuid in DynamicPcdSet_Sku:
             AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdSet_Sku[(skuname, skuid)], Phase)
             final_data = ()
-            for item in PcdDbBuffer:
-                final_data += unpack("B", item)
+            for item in range(len(PcdDbBuffer)):
+                final_data += unpack("B", PcdDbBuffer[item:item+1])
             PcdDBData[(skuname, skuid)] = (PcdDbBuffer, final_data)
             PcdDriverAutoGenData[(skuname, skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
             VarCheckTableData[(skuname, skuid)] = VarCheckTab
         if Platform.Platform.VarCheckFlag:
             dest = os.path.join(Platform.BuildDir, TAB_FV_DIRECTORY)
@@ -976,12 +977,12 @@ def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
             VarCheckTable.dump(dest, Phase)
         AdditionalAutoGenH, AdditionalAutoGenC =  CreateAutoGen(PcdDriverAutoGenData)
     else:
         AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, {}, Phase)
         final_data = ()
-        for item in PcdDbBuffer:
-            final_data += unpack("B", item)
+        for item in range(len(PcdDbBuffer)):
+            final_data += unpack("B", PcdDbBuffer[item:item + 1])
         PcdDBData[(TAB_DEFAULT, "0")] = (PcdDbBuffer, final_data)
 
     return AdditionalAutoGenH, AdditionalAutoGenC, CreatePcdDataBase(PcdDBData)
 ## Create PCD database in DXE or PEI phase
 #
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 98f88e2497..453af66022 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -71,24 +71,26 @@ class VariableMgr(object):
             firstdata_type = sku_var_info_offset_list[0].data_type
             if firstdata_type in DataType.TAB_PCD_NUMERIC_TYPES:
                 fisrtdata_flag = DataType.PACK_CODE_BY_SIZE[MAX_SIZE_TYPE[firstdata_type]]
                 fisrtdata = fisrtvalue_list[0]
                 fisrtvalue_list = []
-                for data_byte in pack(fisrtdata_flag, int(fisrtdata, 16) if fisrtdata.upper().startswith('0X') else int(fisrtdata)):
-                    fisrtvalue_list.append(hex(unpack("B", data_byte)[0]))
+                pack_data = pack(fisrtdata_flag, int(fisrtdata, 0))
+                for data_byte in range(len(pack_data)):
+                    fisrtvalue_list.append(hex(unpack("B", pack_data[data_byte:data_byte + 1])[0]))
             newvalue_list = ["0x00"] * FirstOffset + fisrtvalue_list
 
             for var_item in sku_var_info_offset_list[1:]:
                 CurOffset = int(var_item.var_offset, 16) if var_item.var_offset.upper().startswith("0X") else int(var_item.var_offset)
                 CurvalueList = var_item.default_value.strip("{").strip("}").split(",")
                 Curdata_type = var_item.data_type
                 if Curdata_type in DataType.TAB_PCD_NUMERIC_TYPES:
                     data_flag = DataType.PACK_CODE_BY_SIZE[MAX_SIZE_TYPE[Curdata_type]]
                     data = CurvalueList[0]
                     CurvalueList = []
-                    for data_byte in pack(data_flag, int(data, 16) if data.upper().startswith('0X') else int(data)):
-                        CurvalueList.append(hex(unpack("B", data_byte)[0]))
+                    pack_data = pack(data_flag, int(data, 0))
+                    for data_byte in range(len(pack_data)):
+                        CurvalueList.append(hex(unpack("B", pack_data[data_byte:data_byte + 1])[0]))
                 if CurOffset > len(newvalue_list):
                     newvalue_list = newvalue_list + ["0x00"] * (CurOffset - len(newvalue_list)) + CurvalueList
                 else:
                     newvalue_list[CurOffset : CurOffset + len(CurvalueList)] = CurvalueList
 
@@ -121,12 +123,12 @@ class VariableMgr(object):
                     tail = ",".join("0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(","))))
 
             default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(default_sku_default.default_value, default_sku_default.data_type, tail)
 
             default_data_array = ()
-            for item in default_data_buffer:
-                default_data_array += unpack("B", item)
+            for item in range(len(default_data_buffer)):
+                default_data_array += unpack("B", default_data_buffer[item:item + 1])
 
             var_data[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)][index] = (default_data_buffer, sku_var_info[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)])
 
             for (skuid, defaultstoragename) in indexedvarinfo[index]:
                 tail = None
@@ -139,12 +141,12 @@ class VariableMgr(object):
                         tail = ",".join("0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(","))))
 
                 others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(other_sku_other.default_value, other_sku_other.data_type, tail)
 
                 others_data_array = ()
-                for item in others_data_buffer:
-                    others_data_array += unpack("B", item)
+                for item in range(len(others_data_buffer)):
+                    others_data_array += unpack("B", others_data_buffer[item:item + 1])
 
                 data_delta = VariableMgr.calculate_delta(default_data_array, others_data_array)
 
                 var_data[(skuid, defaultstoragename)][index] = (data_delta, sku_var_info[(skuid, defaultstoragename)])
         return var_data
@@ -156,11 +158,11 @@ class VariableMgr(object):
 
         if not var_data:
             return []
 
         pcds_default_data = var_data.get((DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT), {})
-        NvStoreDataBuffer = ""
+        NvStoreDataBuffer = bytearray()
         var_data_offset = collections.OrderedDict()
         offset = NvStorageHeaderSize
         for default_data, default_info in pcds_default_data.values():
             var_name_buffer = VariableMgr.PACK_VARIABLE_NAME(default_info.var_name)
 
@@ -183,11 +185,11 @@ class VariableMgr(object):
 
         variable_storage_header_buffer = VariableMgr.PACK_VARIABLE_STORE_HEADER(len(NvStoreDataBuffer) + 28)
 
         nv_default_part = VariableMgr.AlignData(VariableMgr.PACK_DEFAULT_DATA(0, 0, VariableMgr.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
 
-        data_delta_structure_buffer = ""
+        data_delta_structure_buffer = bytearray()
         for skuname, defaultstore in var_data:
             if (skuname, defaultstore) == (DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT):
                 continue
             pcds_sku_data = var_data[(skuname, defaultstore)]
             delta_data_set = []
@@ -214,12 +216,12 @@ class VariableMgr(object):
         return  [hex(item) for item in VariableMgr.unpack_data(data)]
 
     @staticmethod
     def unpack_data(data):
         final_data = ()
-        for item in data:
-            final_data += unpack("B", item)
+        for item in range(len(data)):
+            final_data += unpack("B", data[item:item + 1])
         return final_data
 
     @staticmethod
     def calculate_delta(default, theother):
         if len(default) - len(theother) != 0:
@@ -283,11 +285,11 @@ class VariableMgr(object):
 
         return Buffer
 
     @staticmethod
     def PACK_VARIABLES_DATA(var_value,data_type, tail = None):
-        Buffer = ""
+        Buffer = bytearray()
         data_len = 0
         if data_type == DataType.TAB_VOID:
             for value_char in var_value.strip("{").strip("}").split(","):
                 Buffer += pack("=B", int(value_char, 16))
             data_len += len(var_value.split(","))
@@ -313,11 +315,11 @@ class VariableMgr(object):
 
         return Buffer
 
     @staticmethod
     def PACK_DEFAULT_DATA(defaultstoragename, skuid, var_value):
-        Buffer = ""
+        Buffer = bytearray()
         Buffer += pack("=L", 4+8+8)
         Buffer += pack("=Q", int(skuid))
         Buffer += pack("=Q", int(defaultstoragename))
 
         for item in var_value:
@@ -338,11 +340,11 @@ class VariableMgr(object):
         return self.DefaultStoreMap.get(dname)[0]
 
     def PACK_DELTA_DATA(self, skuname, defaultstoragename, delta_list):
         skuid = self.GetSkuId(skuname)
         defaultstorageid = self.GetDefaultStoreId(defaultstoragename)
-        Buffer = ""
+        Buffer = bytearray()
         Buffer += pack("=L", 4+8+8)
         Buffer += pack("=Q", int(skuid))
         Buffer += pack("=Q", int(defaultstorageid))
         for (delta_offset, value) in delta_list:
             Buffer += pack("=L", delta_offset)
@@ -361,10 +363,10 @@ class VariableMgr(object):
 
         return mybuffer
 
     @staticmethod
     def PACK_VARIABLE_NAME(var_name):
-        Buffer = ""
+        Buffer = bytearray()
         for name_char in var_name.strip("{").strip("}").split(","):
             Buffer += pack("=B", int(name_char, 16))
 
         return Buffer
diff --git a/BaseTools/Source/Python/AutoGen/InfSectionParser.py b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
index d985089738..09e9af3fb4 100644
--- a/BaseTools/Source/Python/AutoGen/InfSectionParser.py
+++ b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
@@ -32,11 +32,11 @@ class InfSectionParser():
         FileLastLine = False
         SectionLine = ''
         SectionData = []
 
         try:
-            FileLinesList = open(self._FilePath, "r", 0).readlines()
+            FileLinesList = open(self._FilePath, "r").readlines()
         except BaseException:
             EdkLogger.error("build", AUTOGEN_ERROR, 'File %s is opened failed.' % self._FilePath)
 
         for Index in range(0, len(FileLinesList)):
             line = str(FileLinesList[Index]).strip()
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index d87680b2e7..680ec16bd4 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -121,11 +121,14 @@ def DecToHexList(Dec, Digit = 8):
 # @param Ascii:  The acsii string
 #
 # @retval:       A list for formatted hex string
 #
 def AscToHexList(Ascii):
-    return ['0x{0:02X}'.format(ord(Item)) for Item in Ascii]
+    try:
+        return ['0x{0:02X}'.format(Item) for Item in Ascii]
+    except:
+        return ['0x{0:02X}'.format(ord(Item)) for Item in Ascii]
 
 ## Create content of .h file
 #
 # Create content of .h file
 #
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 764d95ec66..d162387cc5 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -22,11 +22,11 @@ import distutils.util
 import Common.EdkLogger as EdkLogger
 from io import BytesIO
 from Common.BuildToolError import *
 from Common.StringUtils import GetLineNo
 from Common.Misc import PathClass
-from Common.LongFilePathSupport import LongFilePath, UniToStr
+from Common.LongFilePathSupport import LongFilePath
 from Common.GlobalData import *
 ##
 # Static definitions
 #
 UNICODE_WIDE_CHAR = u'\\wide'
@@ -425,11 +425,11 @@ class UniFileClassObject(object):
             while (StartPos != -1):
                 EndPos = Line.find(u'\\', StartPos + 1, StartPos + 7)
                 if EndPos != -1 and EndPos - StartPos == 6 :
                     if g4HexChar.match(Line[StartPos + 2 : EndPos], re.UNICODE):
                         EndStr = Line[EndPos: ]
-                        UniStr = ('\u' + (Line[StartPos + 2 : EndPos])).decode('unicode_escape')
+                        UniStr = Line[StartPos + 2: EndPos]
                         if EndStr.startswith(u'\\x') and len(EndStr) >= 7:
                             if EndStr[6] == u'\\' and g4HexChar.match(EndStr[2 : 6], re.UNICODE):
                                 Line = Line[0 : StartPos] + UniStr + EndStr
                         else:
                             Line = Line[0 : StartPos] + UniStr + EndStr[1:]
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 6ddf38fd0d..91c2de621f 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -39,11 +39,11 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
             return
         if not os.path.exists(dest):
             os.mkdir(dest)
         BinFileName = "PcdVarCheck.bin"
         BinFilePath = os.path.join(dest, BinFileName)
-        Buffer = ''
+        Buffer = bytearray()
         index = 0
         for var_check_tab in self.var_check_info:
             index += 1
             realLength = 0
             realLength += 32
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index b4a2dd25a2..09712be386 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -184,11 +184,11 @@ class PcdEntry:
         # No null-terminator in 'string'
         if (QuotedFlag and len(ValueString) + 1 > Size) or (not QuotedFlag and len(ValueString) > Size):
             EdkLogger.error("BPDG", BuildToolError.RESOURCE_OVERFLOW,
                             "PCD value string %s is exceed to size %d(File: %s Line: %s)" % (ValueString, Size, self.FileName, self.Lineno))
         try:
-            self.PcdValue = pack('%ds' % Size, ValueString)
+            self.PcdValue = pack('%ds' % Size, ValueString.encode('utf-8'))
         except:
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid size or value for PCD %s to pack(File: %s Line: %s)." % (self.PcdCName, self.FileName, self.Lineno))
 
     ## Pack a byte-array PCD value.
@@ -304,11 +304,11 @@ class GenVPD :
         self.VpdFileName             = VpdFileName
         self.FileLinesList           = []
         self.PcdFixedOffsetSizeList  = []
         self.PcdUnknownOffsetList    = []
         try:
-            fInputfile = open(InputFileName, "r", 0)
+            fInputfile = open(InputFileName, "r")
             try:
                 self.FileLinesList = fInputfile.readlines()
             except:
                 EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE, "File read failed for %s" % InputFileName, None)
             finally:
@@ -643,11 +643,11 @@ class GenVPD :
     #
     def GenerateVpdFile (self, MapFileName, BinFileName):
         #Open an VPD file to process
 
         try:
-            fVpdFile = open(BinFileName, "wb", 0)
+            fVpdFile = open(BinFileName, "wb")
         except:
             # Open failed
             EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.VpdFileName, None)
 
         try :
@@ -655,11 +655,11 @@ class GenVPD :
         except:
             # Open failed
             EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.MapFileName, None)
 
         # Use a instance of BytesIO to cache data
-        fStringIO = BytesIO('')
+        fStringIO = BytesIO()
 
         # Write the header of map file.
         try :
             fMapFile.write (st.MAP_FILE_COMMENT_TEMPLATE + "\n")
         except:
diff --git a/BaseTools/Source/Python/Common/LongFilePathOs.py b/BaseTools/Source/Python/Common/LongFilePathOs.py
index 53528546b7..3d6fe9b01c 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOs.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOs.py
@@ -13,11 +13,10 @@
 
 from __future__ import absolute_import
 import os
 from . import LongFilePathOsPath
 from Common.LongFilePathSupport import LongFilePath
-from Common.LongFilePathSupport import UniToStr
 import time
 
 path = LongFilePathOsPath
 
 def access(path, mode):
@@ -62,11 +61,11 @@ def utime(path, times):
 
 def listdir(path):
     List = []
     uList = os.listdir(u"%s" % LongFilePath(path))
     for Item in uList:
-        List.append(UniToStr(Item))
+        List.append(Item)
     return List
 
 environ = os.environ
 getcwd = os.getcwd
 chdir = os.chdir
diff --git a/BaseTools/Source/Python/Common/LongFilePathSupport.py b/BaseTools/Source/Python/Common/LongFilePathSupport.py
index b3e3c8ea64..ed29d37d38 100644
--- a/BaseTools/Source/Python/Common/LongFilePathSupport.py
+++ b/BaseTools/Source/Python/Common/LongFilePathSupport.py
@@ -47,17 +47,5 @@ def CodecOpenLongFilePath(Filename, Mode='rb', Encoding=None, Errors='strict', B
 #
 def CopyLongFilePath(src, dst):
     with open(LongFilePath(src), 'rb') as fsrc:
         with open(LongFilePath(dst), 'wb') as fdst:
             shutil.copyfileobj(fsrc, fdst)
-
-## Convert a python unicode string to a normal string
-#
-# Convert a python unicode string to a normal string
-# UniToStr(u'I am a string') is 'I am a string'
-#
-# @param Uni:  The python unicode string
-#
-# @retval:     The formatted normal string
-#
-def UniToStr(Uni):
-    return repr(Uni)[2:-1]
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 3b228f2f32..c24e8adc9a 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -455,35 +455,48 @@ def RemoveDirectory(Directory, Recursively=False):
 #
 #   @retval     True            If the file content is changed and the file is renewed
 #   @retval     False           If the file content is the same
 #
 def SaveFileOnChange(File, Content, IsBinaryFile=True):
-    if not IsBinaryFile:
-        Content = Content.replace("\n", os.linesep)
 
     if os.path.exists(File):
-        try:
-            if Content == open(File, "rb").read():
-                return False
-        except:
-            EdkLogger.error(None, FILE_OPEN_FAILURE, ExtraData=File)
+        if IsBinaryFile:
+            try:
+                with open(File, "rb") as f:
+                    if Content == f.read():
+                        return False
+            except:
+                EdkLogger.error(None, FILE_OPEN_FAILURE, ExtraData=File)
+        else:
+            try:
+                with open(File, "r") as f:
+                    if Content == f.read():
+                        return False
+            except:
+                EdkLogger.error(None, FILE_OPEN_FAILURE, ExtraData=File)
 
     DirName = os.path.dirname(File)
     if not CreateDirectory(DirName):
         EdkLogger.error(None, FILE_CREATE_FAILURE, "Could not create directory %s" % DirName)
     else:
         if DirName == '':
             DirName = os.getcwd()
         if not os.access(DirName, os.W_OK):
             EdkLogger.error(None, PERMISSION_FAILURE, "Do not have write permission on directory %s" % DirName)
 
-    try:
-        Fd = open(File, "wb")
-        Fd.write(Content)
-        Fd.close()
-    except IOError as X:
-        EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
+    if IsBinaryFile:
+        try:
+            with open(File, "wb") as Fd:
+                Fd.write(Content)
+        except IOError as X:
+            EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
+    else:
+        try:
+            with open(File, 'w') as Fd:
+                Fd.write(Content)
+        except IOError as X:
+            EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
 
     return True
 
 ## Retrieve and cache the real path name in file system
 #
@@ -1123,11 +1136,14 @@ def ParseFieldValue (Value):
                 raise BadExpression("Invalid GUID value string %s" % Value)
             Value = TmpValue
         if Value[0] == '"' and Value[-1] == '"':
             Value = Value[1:-1]
         try:
-            Value = "'" + uuid.UUID(Value).bytes_le + "'"
+            Value = str(uuid.UUID(Value).bytes_le)
+            if Value.startswith("b'"):
+                Value = Value[2:-1]
+            Value = "'" + Value + "'"
         except ValueError as Message:
             raise BadExpression(Message)
         Value, Size = ParseFieldValue(Value)
         return Value, 16
     if Value.startswith('L"') and Value.endswith('"'):
@@ -1573,11 +1589,11 @@ class PeImageClass():
         PeOffset = self._ByteListToInt(ByteList[0x3C:0x3E])
         PeObject.seek(PeOffset)
         ByteArray = array.array('B')
         ByteArray.fromfile(PeObject, 4)
         # PE signature should be 'PE\0\0'
-        if ByteArray.tostring() != 'PE\0\0':
+        if ByteArray.tostring() != b'PE\0\0':
             self.ErrorInfo = self.FileName + ' has no valid PE signature PE00'
             return
 
         # Read PE file header
         ByteArray = array.array('B')
@@ -1789,11 +1805,11 @@ class SkuClass():
 #   @param      Input   The object that may be either a integer value or a string
 #
 #   @retval     Value    The integer value that the input represents
 #
 def GetIntegerValue(Input):
-    if type(Input) in (int, long):
+    if not isinstance(Input, str):
         return Input
     String = Input
     if String.endswith("U"):
         String = String[:-1]
     if String.endswith("ULL"):
diff --git a/BaseTools/Source/Python/Common/StringUtils.py b/BaseTools/Source/Python/Common/StringUtils.py
index 0fa51f365b..c6227271a4 100644
--- a/BaseTools/Source/Python/Common/StringUtils.py
+++ b/BaseTools/Source/Python/Common/StringUtils.py
@@ -814,15 +814,11 @@ def GetHelpTextList(HelpTextClassList):
                 List.extend(HelpText.String.split('\n'))
 
     return List
 
 def StringToArray(String):
-    if isinstance(String, unicode):
-        if len(unicode) == 0:
-            return "{0x00,0x00}"
-        return "{%s,0x00,0x00}" % ",".join("0x%02x,0x00" % ord(C) for C in String)
-    elif String.startswith('L"'):
+    if String.startswith('L"'):
         if String == "L\"\"":
             return "{0x00,0x00}"
         else:
             return "{%s,0x00,0x00}" % ",".join("0x%02x,0x00" % ord(C) for C in String[2:-1])
     elif String.startswith('"'):
@@ -841,13 +837,11 @@ def StringToArray(String):
             return '{%s,0}' % ','.join(String.split())
         else:
             return '{%s,0,0}' % ','.join(String.split())
 
 def StringArrayLength(String):
-    if isinstance(String, unicode):
-        return (len(String) + 1) * 2 + 1;
-    elif String.startswith('L"'):
+    if String.startswith('L"'):
         return (len(String) - 3 + 1) * 2
     elif String.startswith('"'):
         return (len(String) - 2 + 1)
     else:
         return len(String.split()) + 1
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index cebc1f7187..e6cc768ee1 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -90,22 +90,22 @@ class VpdInfoFile:
     #
     def Add(self, Vpd, skuname, Offset):
         if (Vpd is None):
             EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
 
-        if not (Offset >= 0 or Offset == TAB_STAR):
+        if not (Offset >= "0" or Offset == TAB_STAR):
             EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID, "Invalid offset parameter: %s." % Offset)
 
         if Vpd.DatumType == TAB_VOID:
-            if Vpd.MaxDatumSize <= 0:
+            if Vpd.MaxDatumSize <= "0":
                 EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
                                 "Invalid max datum size for VPD PCD %s.%s" % (Vpd.TokenSpaceGuidCName, Vpd.TokenCName))
         elif Vpd.DatumType in TAB_PCD_NUMERIC_TYPES:
             if not Vpd.MaxDatumSize:
                 Vpd.MaxDatumSize = MAX_SIZE_TYPE[Vpd.DatumType]
         else:
-            if Vpd.MaxDatumSize <= 0:
+            if Vpd.MaxDatumSize <= "0":
                 EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
                                 "Invalid max datum size for VPD PCD %s.%s" % (Vpd.TokenSpaceGuidCName, Vpd.TokenCName))
 
         if Vpd not in self._VpdArray:
             #
@@ -125,11 +125,11 @@ class VpdInfoFile:
         if not (FilePath is not None or len(FilePath) != 0):
             EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
                             "Invalid parameter FilePath: %s." % FilePath)
 
         Content = FILE_COMMENT_TEMPLATE
-        Pcds = sorted(self._VpdArray.keys())
+        Pcds = sorted(self._VpdArray.keys(), key=lambda x: x.TokenCName)
         for Pcd in Pcds:
             i = 0
             PcdTokenCName = Pcd.TokenCName
             for PcdItem in GlobalData.MixedPcd:
                 if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
@@ -247,11 +247,11 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
                                         stderr= subprocess.PIPE,
                                         shell=True)
     except Exception as X:
         EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData=str(X))
     (out, error) = PopenObject.communicate()
-    print(out)
+    print(out.decode(encoding='utf-8', errors='ignore'))
     while PopenObject.returncode is None :
         PopenObject.wait()
 
     if PopenObject.returncode != 0:
         EdkLogger.debug(EdkLogger.DEBUG_1, "Fail to call BPDG tool", str(error))
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 55d99320c7..0a828278e8 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -51,11 +51,11 @@ class AprioriSection (object):
     #   @param  FvName      for whom apriori file generated
     #   @param  Dict        dictionary contains macro and its value
     #   @retval string      Generated file name
     #
     def GenFfs (self, FvName, Dict = {}, IsMakefile = False):
-        Buffer = BytesIO('')
+        Buffer = BytesIO()
         if self.AprioriType == "PEI":
             AprioriFileGuid = PEI_APRIORI_GUID
         else:
             AprioriFileGuid = DXE_APRIORI_GUID
 
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index 1cdbdcf7ba..9013fca410 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -179,11 +179,11 @@ class Capsule (CapsuleClassObject):
         BodySize = len(FwMgrHdr.getvalue()) + len(Content.getvalue())
         Header.write(pack('=I', HdrSize + BodySize))
         #
         # The real capsule header structure is 28 bytes
         #
-        Header.write('\x00'*(HdrSize-28))
+        Header.write(b'\x00'*(HdrSize-28))
         Header.write(FwMgrHdr.getvalue())
         Header.write(Content.getvalue())
         #
         # Generate FMP capsule file
         #
@@ -204,22 +204,21 @@ class Capsule (CapsuleClassObject):
         if ('CAPSULE_GUID' in self.TokensDict and
             uuid.UUID(self.TokensDict['CAPSULE_GUID']) == uuid.UUID('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A')):
             return self.GenFmpCapsule()
 
         CapInfFile = self.GenCapInf()
-        CapInfFile.writelines("[files]" + TAB_LINE_BREAK)
+        CapInfFile.append("[files]" + TAB_LINE_BREAK)
         CapFileList = []
         for CapsuleDataObj in self.CapsuleDataList:
             CapsuleDataObj.CapsuleName = self.CapsuleName
             FileName = CapsuleDataObj.GenCapsuleSubItem()
             CapsuleDataObj.CapsuleName = None
             CapFileList.append(FileName)
-            CapInfFile.writelines("EFI_FILE_NAME = " + \
+            CapInfFile.append("EFI_FILE_NAME = " + \
                                    FileName      + \
                                    TAB_LINE_BREAK)
-        SaveFileOnChange(self.CapInfFileName, CapInfFile.getvalue(), False)
-        CapInfFile.close()
+        SaveFileOnChange(self.CapInfFileName, ''.join(CapInfFile), False)
         #
         # Call GenFv tool to generate capsule
         #
         CapOutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.UiCapsuleName)
         CapOutputFile = CapOutputFile + '.Cap'
@@ -241,16 +240,16 @@ class Capsule (CapsuleClassObject):
     #   @retval file        inf file object
     #
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiCapsuleName +  "_Cap" + '.inf')
-        CapInfFile = BytesIO() #open (self.CapInfFileName , 'w+')
+        CapInfFile = []
 
-        CapInfFile.writelines("[options]" + TAB_LINE_BREAK)
+        CapInfFile.append("[options]" + TAB_LINE_BREAK)
 
         for Item in self.TokensDict:
-            CapInfFile.writelines("EFI_"                    + \
+            CapInfFile.append("EFI_"                    + \
                                   Item                      + \
                                   ' = '                     + \
                                   self.TokensDict[Item]     + \
                                   TAB_LINE_BREAK)
 
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index db201c074b..ace4699a0e 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -80,11 +80,11 @@ class CapsuleFv (CapsuleData):
     #
     def GenCapsuleSubItem(self):
         if self.FvName.find('.fv') == -1:
             if self.FvName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
                 FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[self.FvName.upper()]
-                FdBuffer = BytesIO('')
+                FdBuffer = BytesIO()
                 FvObj.CapsuleName = self.CapsuleName
                 FvFile = FvObj.AddToBuffer(FdBuffer)
                 FvObj.CapsuleName = None
                 FdBuffer.close()
                 return FvFile
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index 9c43a62cc3..e1849a356c 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -70,11 +70,11 @@ class FD(FDClassObject):
         for RegionObj in self.RegionList:
             if RegionObj.RegionType == 'CAPSULE':
                 HasCapsuleRegion = True
                 break
         if HasCapsuleRegion:
-            TempFdBuffer = BytesIO('')
+            TempFdBuffer = BytesIO()
             PreviousRegionStart = -1
             PreviousRegionSize = 1
 
             for RegionObj in self.RegionList :
                 if RegionObj.RegionType == 'CAPSULE':
@@ -99,11 +99,11 @@ class FD(FDClassObject):
                 if PreviousRegionSize > self.Size:
                     pass
                 GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
                 RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
 
-        FdBuffer = BytesIO('')
+        FdBuffer = BytesIO()
         PreviousRegionStart = -1
         PreviousRegionSize = 1
         for RegionObj in self.RegionList :
             if RegionObj.Offset + RegionObj.Size <= PreviousRegionStart:
                 EdkLogger.error("GenFds", GENFDS_ERROR,
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 69cb7de8e5..63edf816ec 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -157,11 +157,11 @@ class IncludeFileProfile:
     #
     def __init__(self, FileName):
         self.FileName = FileName
         self.FileLinesList = []
         try:
-            with open(FileName, "rb", 0) as fsock:
+            with open(FileName, "r") as fsock:
                 self.FileLinesList = fsock.readlines()
                 for index, line in enumerate(self.FileLinesList):
                     if not line.endswith(TAB_LINE_BREAK):
                         self.FileLinesList[index] += TAB_LINE_BREAK
         except:
@@ -211,11 +211,11 @@ class FileProfile:
     #   @param  FileName    The file that to be parsed
     #
     def __init__(self, FileName):
         self.FileLinesList = []
         try:
-            with open(FileName, "rb", 0) as fsock:
+            with open(FileName, "r") as fsock:
                 self.FileLinesList = fsock.readlines()
 
         except:
             EdkLogger.error("FdfParser", FILE_OPEN_FAILURE, ExtraData=FileName)
 
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 7479efff04..e05fb8ca42 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -77,11 +77,11 @@ class FileStatement (FileStatementClassObject):
             os.makedirs(OutputDir)
 
         Dict.update(self.DefineVarDict)
         SectionAlignments = None
         if self.FvName:
-            Buffer = BytesIO('')
+            Buffer = BytesIO()
             if self.FvName.upper() not in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
                 EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (self.FvName))
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
             FileName = Fv.AddToBuffer(Buffer)
             SectionFiles = [FileName]
@@ -94,11 +94,11 @@ class FileStatement (FileStatementClassObject):
             SectionFiles = [FileName]
 
         elif self.FileName:
             if hasattr(self, 'FvFileType') and self.FvFileType == 'RAW':
                 if isinstance(self.FileName, list) and isinstance(self.SubAlignment, list) and len(self.FileName) == len(self.SubAlignment):
-                    FileContent = ''
+                    FileContent = BytesIO()
                     MaxAlignIndex = 0
                     MaxAlignValue = 1
                     for Index, File in enumerate(self.FileName):
                         try:
                             f = open(File, 'rb')
@@ -110,19 +110,19 @@ class FileStatement (FileStatementClassObject):
                         if self.SubAlignment[Index]:
                             AlignValue = GenFdsGlobalVariable.GetAlignment(self.SubAlignment[Index])
                         if AlignValue > MaxAlignValue:
                             MaxAlignIndex = Index
                             MaxAlignValue = AlignValue
-                        FileContent += Content
-                        if len(FileContent) % AlignValue != 0:
-                            Size = AlignValue - len(FileContent) % AlignValue
+                        FileContent.write(Content)
+                        if len(FileContent.getvalue()) % AlignValue != 0:
+                            Size = AlignValue - len(FileContent.getvalue()) % AlignValue
                             for i in range(0, Size):
-                                FileContent += pack('B', 0xFF)
+                                FileContent.write(pack('B', 0xFF))
 
-                    if FileContent:
+                    if FileContent.getvalue() != b'':
                         OutputRAWFile = os.path.join(GenFdsGlobalVariable.FfsDir, self.NameGuid, self.NameGuid + '.raw')
-                        SaveFileOnChange(OutputRAWFile, FileContent, True)
+                        SaveFileOnChange(OutputRAWFile, FileContent.getvalue(), True)
                         self.FileName = OutputRAWFile
                         self.SubAlignment = self.SubAlignment[MaxAlignIndex]
 
                 if self.Alignment and self.SubAlignment:
                     if GenFdsGlobalVariable.GetAlignment (self.Alignment) < GenFdsGlobalVariable.GetAlignment (self.SubAlignment):
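
For RAW FFS files the content is now accumulated in a BytesIO buffer instead of a str, so binary file data and pack()'ed 0xFF pad bytes mix without type errors. A simplified sketch of the alignment padding step (Content and AlignValue are placeholders for one input file's bytes and its alignment):

    from io import BytesIO
    from struct import pack

    Content = b'\x01\x02\x03'      # stands in for open(File, 'rb').read()
    AlignValue = 8                 # stands in for GenFdsGlobalVariable.GetAlignment(...)

    FileContent = BytesIO()
    FileContent.write(Content)
    Remainder = len(FileContent.getvalue()) % AlignValue
    if Remainder:
        # pad to the alignment boundary with 0xFF, as the hunk above does one byte at a time
        FileContent.write(pack('B', 0xFF) * (AlignValue - Remainder))
    assert len(FileContent.getvalue()) == AlignValue
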
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index 80257923f0..6dcb57deed 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -1086,33 +1086,31 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #
     @staticmethod
     def __GenUniVfrOffsetFile(VfrUniOffsetList, UniVfrOffsetFileName):
 
         # Use a instance of StringIO to cache data
-        fStringIO = BytesIO('')
+        fStringIO = BytesIO()
 
         for Item in VfrUniOffsetList:
             if (Item[0].find("Strings") != -1):
                 #
                 # UNI offset in image.
                 # GUID + Offset
                 # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
                 #
-                UniGuid = [0xe0, 0xc5, 0x13, 0x89, 0xf6, 0x33, 0x86, 0x4d, 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66]
-                UniGuid = [chr(ItemGuid) for ItemGuid in UniGuid]
-                fStringIO.write(''.join(UniGuid))
+                UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
+                fStringIO.write(UniGuid)
                 UniValue = pack ('Q', int (Item[1], 16))
                 fStringIO.write (UniValue)
             else:
                 #
                 # VFR binary offset in image.
                 # GUID + Offset
                 # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
                 #
-                VfrGuid = [0xb4, 0x7c, 0xbc, 0xd0, 0x47, 0x6a, 0x5f, 0x49, 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2]
-                VfrGuid = [chr(ItemGuid) for ItemGuid in VfrGuid]
-                fStringIO.write(''.join(VfrGuid))
+                VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
+                fStringIO.write(VfrGuid)
                 type (Item[1])
                 VfrValue = pack ('Q', int (Item[1], 16))
                 fStringIO.write (VfrValue)
 
         #
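
The UNI and VFR GUID prefixes become bytes literals; joining chr() values would produce a str on Python 3 and could not be written to the BytesIO cache. The literal is simply the little-endian (bytes_le) encoding of the GUID, as this small check shows (the uuid call is illustrative, not part of the patch):

    import uuid

    UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
    assert UniGuid == uuid.UUID('8913c5e0-33f6-4d86-9bf1-43ef89fc0666').bytes_le
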
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index b141d44dc4..2ae991128a 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -115,11 +115,11 @@ class FV (object):
         for AprSection in self.AprioriSectionList:
             FileName = AprSection.GenFfs (self.UiFvName, MacroDict, IsMakefile=Flag)
             FfsFileList.append(FileName)
             # Add Apriori file name to Inf file
             if not Flag:
-                self.FvInfFile.writelines("EFI_FILE_NAME = " + \
+                self.FvInfFile.append("EFI_FILE_NAME = " + \
                                             FileName          + \
                                             TAB_LINE_BREAK)
 
         # Process Modules in FfsList
         for FfsFile in self.FfsList:
@@ -129,16 +129,16 @@ class FV (object):
             if GenFdsGlobalVariable.EnableGenfdsMultiThread and GenFdsGlobalVariable.ModuleFile and GenFdsGlobalVariable.ModuleFile.Path.find(os.path.normpath(FfsFile.InfFileName)) == -1:
                 continue
             FileName = FfsFile.GenFfs(MacroDict, FvParentAddr=BaseAddress, IsMakefile=Flag, FvName=self.UiFvName)
             FfsFileList.append(FileName)
             if not Flag:
-                self.FvInfFile.writelines("EFI_FILE_NAME = " + \
+                self.FvInfFile.append("EFI_FILE_NAME = " + \
                                             FileName          + \
                                             TAB_LINE_BREAK)
         if not Flag:
-            SaveFileOnChange(self.InfFileName, self.FvInfFile.getvalue(), False)
-            self.FvInfFile.close()
+            FvInfFile = ''.join(self.FvInfFile)
+            SaveFileOnChange(self.InfFileName, FvInfFile, False)
         #
         # Call GenFv tool
         #
         FvOutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.UiFvName)
         FvOutputFile = FvOutputFile + '.Fv'
@@ -206,18 +206,18 @@ class FV (object):
             if os.path.isfile(FvOutputFile) and os.path.getsize(FvOutputFile) >= 0x48:
                 FvFileObj = open(FvOutputFile, 'rb')
                 # PI FvHeader is 0x48 byte
                 FvHeaderBuffer = FvFileObj.read(0x48)
                 Signature = FvHeaderBuffer[0x28:0x32]
-                if Signature and Signature.startswith('_FVH'):
+                if Signature and Signature.startswith(b'_FVH'):
                     GenFdsGlobalVariable.VerboseLogger("\nGenerate %s FV Successfully" % self.UiFvName)
                     GenFdsGlobalVariable.SharpCounter = 0
 
                     FvFileObj.seek(0)
                     Buffer.write(FvFileObj.read())
                     # FV alignment position.
-                    FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E]) & 0x1F)
+                    FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E:0x2F]) & 0x1F)
                     if FvAlignmentValue >= 0x400:
                         if FvAlignmentValue >= 0x100000:
                             if FvAlignmentValue >= 0x1000000:
                             #The max alignment supported by FFS is 16M.
                                 self.FvAlignment = "16M"
@@ -274,73 +274,73 @@ class FV (object):
         #
         # Create FV inf file
         #
         self.InfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiFvName + '.inf')
-        self.FvInfFile = BytesIO()
+        self.FvInfFile = []
 
         #
         # Add [Options]
         #
-        self.FvInfFile.writelines("[options]" + TAB_LINE_BREAK)
+        self.FvInfFile.append("[options]" + TAB_LINE_BREAK)
         if BaseAddress is not None:
-            self.FvInfFile.writelines("EFI_BASE_ADDRESS = " + \
+            self.FvInfFile.append("EFI_BASE_ADDRESS = " + \
                                        BaseAddress          + \
                                        TAB_LINE_BREAK)
 
         if BlockSize is not None:
-            self.FvInfFile.writelines("EFI_BLOCK_SIZE = " + \
+            self.FvInfFile.append("EFI_BLOCK_SIZE = " + \
                                       '0x%X' %BlockSize    + \
                                       TAB_LINE_BREAK)
             if BlockNum is not None:
-                self.FvInfFile.writelines("EFI_NUM_BLOCKS   = "  + \
+                self.FvInfFile.append("EFI_NUM_BLOCKS   = "  + \
                                       ' 0x%X' %BlockNum    + \
                                       TAB_LINE_BREAK)
         else:
             if self.BlockSizeList == []:
                 if not self._GetBlockSize():
                     #set default block size is 1
-                    self.FvInfFile.writelines("EFI_BLOCK_SIZE  = 0x1" + TAB_LINE_BREAK)
+                    self.FvInfFile.append("EFI_BLOCK_SIZE  = 0x1" + TAB_LINE_BREAK)
 
             for BlockSize in self.BlockSizeList:
                 if BlockSize[0] is not None:
-                    self.FvInfFile.writelines("EFI_BLOCK_SIZE  = "  + \
+                    self.FvInfFile.append("EFI_BLOCK_SIZE  = "  + \
                                           '0x%X' %BlockSize[0]    + \
                                           TAB_LINE_BREAK)
 
                 if BlockSize[1] is not None:
-                    self.FvInfFile.writelines("EFI_NUM_BLOCKS   = "  + \
+                    self.FvInfFile.append("EFI_NUM_BLOCKS   = "  + \
                                           ' 0x%X' %BlockSize[1]    + \
                                           TAB_LINE_BREAK)
 
         if self.BsBaseAddress is not None:
-            self.FvInfFile.writelines('EFI_BOOT_DRIVER_BASE_ADDRESS = ' + \
+            self.FvInfFile.append('EFI_BOOT_DRIVER_BASE_ADDRESS = ' + \
                                        '0x%X' %self.BsBaseAddress)
         if self.RtBaseAddress is not None:
-            self.FvInfFile.writelines('EFI_RUNTIME_DRIVER_BASE_ADDRESS = ' + \
+            self.FvInfFile.append('EFI_RUNTIME_DRIVER_BASE_ADDRESS = ' + \
                                       '0x%X' %self.RtBaseAddress)
         #
         # Add attribute
         #
-        self.FvInfFile.writelines("[attributes]" + TAB_LINE_BREAK)
+        self.FvInfFile.append("[attributes]" + TAB_LINE_BREAK)
 
-        self.FvInfFile.writelines("EFI_ERASE_POLARITY   = "       + \
+        self.FvInfFile.append("EFI_ERASE_POLARITY   = "       + \
                                           ' %s' %ErasePloarity    + \
                                           TAB_LINE_BREAK)
         if not (self.FvAttributeDict is None):
             for FvAttribute in self.FvAttributeDict.keys():
                 if FvAttribute == "FvUsedSizeEnable":
                     if self.FvAttributeDict[FvAttribute].upper() in ('TRUE', '1'):
                         self.UsedSizeEnable = True
                     continue
-                self.FvInfFile.writelines("EFI_"            + \
+                self.FvInfFile.append("EFI_"            + \
                                           FvAttribute       + \
                                           ' = '             + \
                                           self.FvAttributeDict[FvAttribute] + \
                                           TAB_LINE_BREAK )
         if self.FvAlignment is not None:
-            self.FvInfFile.writelines("EFI_FVB2_ALIGNMENT_"     + \
+            self.FvInfFile.append("EFI_FVB2_ALIGNMENT_"     + \
                                        self.FvAlignment.strip() + \
                                        " = TRUE"                + \
                                        TAB_LINE_BREAK)
 
         #
@@ -349,11 +349,11 @@ class FV (object):
         if not self.FvNameGuid:
             if len(self.FvExtEntryType) > 0 or self.UsedSizeEnable:
                 GenFdsGlobalVariable.ErrorLogger("FV Extension Header Entries declared for %s with no FvNameGuid declaration." % (self.UiFvName))
         else:
             TotalSize = 16 + 4
-            Buffer = ''
+            Buffer = bytearray()
             if self.UsedSizeEnable:
                 TotalSize += (4 + 4)
                 ## define EFI_FV_EXT_TYPE_USED_SIZE_TYPE 0x03
                 #typedef  struct
                 # {
@@ -376,11 +376,11 @@ class FV (object):
                 #   GUID: size 16
                 #   FV UI name
                 #
                 Buffer += (pack('HH', (FvUiLen + 16 + 4), 0x0002)
                            + PackGUID(Guid)
-                           + self.UiFvName)
+                           + self.UiFvName.encode('utf-8'))
 
             for Index in range (0, len(self.FvExtEntryType)):
                 if self.FvExtEntryType[Index] == 'FILE':
                     # check if the path is absolute or relative
                     if os.path.isabs(self.FvExtEntryData[Index]):
@@ -423,13 +423,13 @@ class FV (object):
                 Changed = SaveFileOnChange(FvExtHeaderFileName, FvExtHeaderFile.getvalue(), True)
                 FvExtHeaderFile.close()
                 if Changed:
                   if os.path.exists (self.InfFileName):
                     os.remove (self.InfFileName)
-                self.FvInfFile.writelines("EFI_FV_EXT_HEADER_FILE_NAME = "      + \
+                self.FvInfFile.append("EFI_FV_EXT_HEADER_FILE_NAME = "      + \
                                            FvExtHeaderFileName                  + \
                                            TAB_LINE_BREAK)
 
         #
         # Add [Files]
         #
-        self.FvInfFile.writelines("[files]" + TAB_LINE_BREAK)
+        self.FvInfFile.append("[files]" + TAB_LINE_BREAK)
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 7ea931e1b5..85e59cc347 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -100,11 +100,11 @@ class FvImageSection(FvImageSectionClassObject):
             return OutputFileList, self.Alignment
         #
         # Generate Fv
         #
         if self.FvName is not None:
-            Buffer = BytesIO('')
+            Buffer = BytesIO()
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName)
             if Fv is not None:
                 self.Fv = Fv
                 if not self.FvAddr and self.Fv.BaseAddress:
                     self.FvAddr = self.Fv.BaseAddress
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 2efb2edd9a..a99d56a9fd 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -520,11 +520,11 @@ class GenFds(object):
                 FvObj.AddToBuffer(Buffer)
                 Buffer.close()
                 return
         elif GenFds.OnlyGenerateThisFv is None:
             for FvObj in GenFdsGlobalVariable.FdfParser.Profile.FvDict.values():
-                Buffer = BytesIO('')
+                Buffer = BytesIO()
                 FvObj.AddToBuffer(Buffer)
                 Buffer.close()
 
         if GenFds.OnlyGenerateThisFv is None and GenFds.OnlyGenerateThisFd is None and GenFds.OnlyGenerateThisCap is None:
             if GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict != {}:
@@ -671,11 +671,11 @@ class GenFds(object):
             print(ModuleObj.BaseName + ' ' + ModuleObj.ModuleType)
 
     @staticmethod
     def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
         GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
-        GuidXRefFile = BytesIO('')
+        GuidXRefFile = []
         PkgGuidDict = {}
         GuidDict = {}
         ModuleList = []
         FileGuidList = []
         VariableGuidSet = set()
@@ -698,13 +698,13 @@ class GenFds(object):
                 if Module in ModuleList:
                     continue
                 else:
                     ModuleList.append(Module)
                 if GlobalData.gGuidPattern.match(ModuleFile.BaseName):
-                    GuidXRefFile.write("%s %s\n" % (ModuleFile.BaseName, Module.BaseName))
+                    GuidXRefFile.append("%s %s\n" % (ModuleFile.BaseName, Module.BaseName))
                 else:
-                    GuidXRefFile.write("%s %s\n" % (Module.Guid, Module.BaseName))
+                    GuidXRefFile.append("%s %s\n" % (Module.Guid, Module.BaseName))
                 GuidDict.update(Module.Protocols)
                 GuidDict.update(Module.Guids)
                 GuidDict.update(Module.Ppis)
             for FvName in FdfParserObj.Profile.FvDict:
                 for FfsObj in FdfParserObj.Profile.FvDict[FvName].FfsList:
@@ -713,11 +713,11 @@ class GenFds(object):
                         FdfModule = BuildDb.BuildObject[InfPath, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
                         if FdfModule in ModuleList:
                             continue
                         else:
                             ModuleList.append(FdfModule)
-                        GuidXRefFile.write("%s %s\n" % (FdfModule.Guid, FdfModule.BaseName))
+                        GuidXRefFile.append("%s %s\n" % (FdfModule.Guid, FdfModule.BaseName))
                         GuidDict.update(FdfModule.Protocols)
                         GuidDict.update(FdfModule.Guids)
                         GuidDict.update(FdfModule.Ppis)
                     else:
                         FileStatementGuid = FfsObj.NameGuid
@@ -774,23 +774,23 @@ class GenFds(object):
                                     Name.append((F.read().split()[-1]))
                         if not Name:
                             continue
 
                         Name = ' '.join(Name) if isinstance(Name, type([])) else Name
-                        GuidXRefFile.write("%s %s\n" %(FileStatementGuid, Name))
+                        GuidXRefFile.append("%s %s\n" %(FileStatementGuid, Name))
 
        # Append GUIDs, Protocols, and PPIs to the Xref file
-        GuidXRefFile.write("\n")
+        GuidXRefFile.append("\n")
         for key, item in GuidDict.items():
-            GuidXRefFile.write("%s %s\n" % (GuidStructureStringToGuidString(item).upper(), key))
+            GuidXRefFile.append("%s %s\n" % (GuidStructureStringToGuidString(item).upper(), key))
 
-        if GuidXRefFile.getvalue():
-            SaveFileOnChange(GuidXRefFileName, GuidXRefFile.getvalue(), False)
+        if GuidXRefFile:
+            GuidXRefFile = ''.join(GuidXRefFile)
+            SaveFileOnChange(GuidXRefFileName, GuidXRefFile, False)
             GenFdsGlobalVariable.InfLogger("\nGUID cross reference file can be found at %s" % GuidXRefFileName)
         elif os.path.exists(GuidXRefFileName):
             os.remove(GuidXRefFileName)
-        GuidXRefFile.close()
 
 
 if __name__ == '__main__':
     r = main()
     ## 0-127 is a safe return range, and 1 is a standard default error
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index febe0737a2..028bcc480c 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -720,12 +720,12 @@ class GenFdsGlobalVariable:
             #get command return value
             returnValue[0] = PopenObject.returncode
             return
         if PopenObject.returncode != 0 or GenFdsGlobalVariable.VerboseMode or GenFdsGlobalVariable.DebugLevel != -1:
             GenFdsGlobalVariable.InfLogger ("Return Value = %d" % PopenObject.returncode)
-            GenFdsGlobalVariable.InfLogger (out)
-            GenFdsGlobalVariable.InfLogger (error)
+            GenFdsGlobalVariable.InfLogger(out.decode(encoding='utf-8', errors='ignore'))
+            GenFdsGlobalVariable.InfLogger(error.decode(encoding='utf-8', errors='ignore'))
             if PopenObject.returncode != 0:
                 print("###", cmd)
                 EdkLogger.error("GenFds", COMMAND_FAILURE, errorMess)
 
     @staticmethod
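
Popen.communicate() returns bytes on Python 3, so the tool output is decoded before it reaches the logger; errors='ignore' keeps a stray non-UTF-8 byte from aborting the build. A standalone sketch of the same decode step (the echo command is only a stand-in for a GenFv invocation):

    import subprocess

    PopenObject = subprocess.Popen('echo GenFv output', stdout=subprocess.PIPE,
                                   stderr=subprocess.PIPE, shell=True)
    out, error = PopenObject.communicate()    # both values are bytes on Python 3
    print(out.decode(encoding='utf-8', errors='ignore'))
    print(error.decode(encoding='utf-8', errors='ignore'))
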
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 83363276d2..972847efae 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -60,12 +60,12 @@ class Region(object):
         if Size > 0:
             if (ErasePolarity == '1') :
                 PadByte = pack('B', 0xFF)
             else:
                 PadByte = pack('B', 0)
-            PadData = ''.join(PadByte for i in range(0, Size))
-            Buffer.write(PadData)
+            for i in range(0, Size):
+                Buffer.write(PadByte)
 
     ## AddToBuffer()
     #
     #   Add region data to the Buffer
     #
@@ -129,11 +129,11 @@ class Region(object):
                         self.FvAddress = self.FvAddress + FvOffset
                         FvAlignValue = GenFdsGlobalVariable.GetAlignment(FvObj.FvAlignment)
                         if self.FvAddress % FvAlignValue != 0:
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "FV (%s) is NOT %s Aligned!" % (FvObj.UiFvName, FvObj.FvAlignment))
-                        FvBuffer = BytesIO('')
+                        FvBuffer = BytesIO()
                         FvBaseAddress = '0x%X' % self.FvAddress
                         BlockSize = None
                         BlockNum = None
                         FvObj.AddToBuffer(FvBuffer, FvBaseAddress, BlockSize, BlockNum, ErasePolarity, Flag=Flag)
                         if Flag:
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 2a7c308895..003f052a90 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -120,11 +120,11 @@ if __name__ == '__main__':
 
   Version = Process.communicate()
   if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print(Version[0])
+  print(Version[0].decode())
 
   #
   # Read input file into a buffer and save input filename
   #
   args.InputFileName   = args.InputFile.name
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index f96ceb2637..c0b661d03c 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -82,11 +82,11 @@ if __name__ == '__main__':
 
   Version = Process.communicate()
   if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print(Version[0])
+  print(Version[0].decode())
 
   args.PemFileName = []
 
   #
   # Check for output file argument
@@ -117,23 +117,23 @@ if __name__ == '__main__':
       # Save PEM filename and close input file
       #
       args.PemFileName.append(Item.name)
       Item.close()
 
-  PublicKeyHash = ''
+  PublicKeyHash = bytearray()
   for Item in args.PemFileName:
     #
     # Extract public key from private key into STDOUT
     #
     Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
+    PublicKeyHexString = Process.communicate()[0].split(b'=')[1].strip()
     if Process.returncode != 0:
       print('ERROR: Unable to extract public key from private key')
       sys.exit(Process.returncode)
-    PublicKey = ''
+    PublicKey = bytearray()
     for Index in range (0, len(PublicKeyHexString), 2):
-      PublicKey = PublicKey + chr(int(PublicKeyHexString[Index:Index + 2], 16))
+      PublicKey = PublicKey + PublicKeyHexString[Index:Index + 2]
 
     #
     # Generate SHA 256 hash of RSA 2048 bit public key into STDOUT
     #
     Process = subprocess.Popen('%s dgst -sha256 -binary' % (OpenSslCommand), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
@@ -155,18 +155,18 @@ if __name__ == '__main__':
   #
   # Convert public key hash to a C structure string
   #
   PublicKeyHashC = '{'
   for Item in PublicKeyHash:
-    PublicKeyHashC = PublicKeyHashC + '0x%02x, ' % (ord(Item))
+    PublicKeyHashC = PublicKeyHashC + '0x%02x, ' % (Item)
   PublicKeyHashC = PublicKeyHashC[:-2] + '}'
 
   #
   # Write SHA 256 of 2048 bit binary public key to public key hash C structure file
   #
   try:
-    args.PublicKeyHashCFile.write (PublicKeyHashC)
+    args.PublicKeyHashCFile.write (bytes(PublicKeyHashC, 'utf-8'))
     args.PublicKeyHashCFile.close ()
   except:
     pass
 
   #
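
Keeping PublicKeyHash as a bytearray means iterating it yields ints directly, so the ord() call disappears when the hash is formatted as a C initializer; the write above then needs an encoding because bytes(str) without one is a TypeError on Python 3. A small sketch with a placeholder hash (not real key material):

    import hashlib

    PublicKeyHash = bytearray(hashlib.sha256(b'placeholder modulus').digest())
    PublicKeyHashC = '{'
    for Item in PublicKeyHash:                # Item is already an int on Python 3
        PublicKeyHashC = PublicKeyHashC + '0x%02x, ' % (Item)
    PublicKeyHashC = PublicKeyHashC[:-2] + '}'
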
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index c285a69ec0..6cea885853 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -103,11 +103,11 @@ if __name__ == '__main__':
 
   Version = Process.communicate()
   if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print(Version[0])
+  print(Version[0].decode('utf-8'))
 
   #
   # Read input file into a buffer and save input filename
   #
   args.InputFileName   = args.InputFile.name
@@ -151,11 +151,12 @@ if __name__ == '__main__':
 
   #
   # Extract public key from private key into STDOUT
   #
   Process = subprocess.Popen('%s rsa -in "%s" -modulus -noout' % (OpenSslCommand, args.PrivateKeyFileName), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-  PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
+  PublicKeyHexString = Process.communicate()[0].split(b'=')[1].strip()
+  PublicKeyHexString = PublicKeyHexString.decode('utf-8')
   PublicKey = ''
   while len(PublicKeyHexString) > 0:
     PublicKey = PublicKey + PublicKeyHexString[0:2]
     PublicKeyHexString=PublicKeyHexString[2:]
   if Process.returncode != 0:
@@ -208,11 +209,11 @@ if __name__ == '__main__':
       sys.exit(1)
 
     #
     # Verify the public key
     #
-    if Header.PublicKey != PublicKey:
+    if Header.PublicKey != bytearray.fromhex(PublicKey):
       print('ERROR: Public key in input file does not match public key from private key file')
       sys.exit(1)
 
     FullInputFileBuffer = args.InputFileBuffer
     if args.MonotonicCountStr:
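
Header.PublicKey is unpacked from the signed image as raw bytes, while PublicKey is accumulated as hex text from the openssl modulus output, so the comparison converts the hex string with bytearray.fromhex(). Illustrative placeholder values:

    PublicKey = 'C0FFEE'                      # hex text taken from 'openssl rsa ... -modulus'
    HeaderPublicKey = b'\xc0\xff\xee'         # raw bytes read back from the image header
    assert HeaderPublicKey == bytearray.fromhex(PublicKey)
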
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 51010bf326..428bf0d681 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -243,11 +243,11 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
             if Brace == 0 and Line.find(";") >= 0:
                 MulPatternFlag = False
 
     # save to file
     try:
-        f = open (Target, 'wb')
+        f = open (Target, 'w')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
     f.writelines(NewLines)
     f.close()
 
@@ -456,33 +456,31 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
         fInputfile = open(OutputFile, "wb+", 0)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
 
     # Use a instance of BytesIO to cache data
-    fStringIO = BytesIO('')
+    fStringIO = BytesIO()
 
     for Item in VfrUniOffsetList:
         if (Item[0].find("Strings") != -1):
             #
             # UNI offset in image.
             # GUID + Offset
             # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66 } }
             #
-            UniGuid = [0xe0, 0xc5, 0x13, 0x89, 0xf6, 0x33, 0x86, 0x4d, 0x9b, 0xf1, 0x43, 0xef, 0x89, 0xfc, 0x6, 0x66]
-            UniGuid = [chr(ItemGuid) for ItemGuid in UniGuid]
-            fStringIO.write(''.join(UniGuid))
+            UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
+            fStringIO.write(UniGuid)
             UniValue = pack ('Q', int (Item[1], 16))
             fStringIO.write (UniValue)
         else:
             #
             # VFR binary offset in image.
             # GUID + Offset
             # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2 } };
             #
-            VfrGuid = [0xb4, 0x7c, 0xbc, 0xd0, 0x47, 0x6a, 0x5f, 0x49, 0xaa, 0x11, 0x71, 0x7, 0x46, 0xda, 0x6, 0xa2]
-            VfrGuid = [chr(ItemGuid) for ItemGuid in VfrGuid]
-            fStringIO.write(''.join(VfrGuid))
+            VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
+            fStringIO.write(VfrGuid)
             type (Item[1])
             VfrValue = pack ('Q', int (Item[1], 16))
             fStringIO.write (VfrValue)
 
     #
@@ -560,11 +558,11 @@ def TrimEdkSources(Source, Target):
 def TrimEdkSourceCode(Source, Target):
     EdkLogger.verbose("\t%s -> %s" % (Source, Target))
     CreateDirectory(os.path.dirname(Target))
 
     try:
-        f = open (Source, 'rb')
+        f = open (Source, 'r')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
     # read whole file
     Lines = f.read()
     f.close()
@@ -579,11 +577,11 @@ def TrimEdkSourceCode(Source, Target):
     # save all lines if trimmed
     if Source == Target and NewLines == Lines:
         return
 
     try:
-        f = open (Target, 'wb')
+        f = open (Target, 'w')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
     f.write(NewLines)
     f.close()
 
diff --git a/BaseTools/Source/Python/UPT/Library/StringUtils.py b/BaseTools/Source/Python/UPT/Library/StringUtils.py
index 90946337d0..a3391daa91 100644
--- a/BaseTools/Source/Python/UPT/Library/StringUtils.py
+++ b/BaseTools/Source/Python/UPT/Library/StringUtils.py
@@ -677,13 +677,11 @@ def GetHelpTextList(HelpTextClassList):
 # Get String Array Length
 #
 # @param String: the source string
 #
 def StringArrayLength(String):
-    if isinstance(String, unicode):
-        return (len(String) + 1) * 2 + 1
-    elif String.startswith('L"'):
+    if String.startswith('L"'):
         return (len(String) - 3 + 1) * 2
     elif String.startswith('"'):
         return (len(String) - 2 + 1)
     else:
         return len(String.split()) + 1
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index b67414b930..cff77a71ae 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -92,17 +92,17 @@ class PcdClassObject(object):
                     fields = self.SkuOverrideValues[sku][defaultstore]
                     for demesionattr in fields:
                         deme = ArrayIndex.findall(demesionattr)
                         for i in range(len(deme)-1):
                             if int(deme[i].lstrip("[").rstrip("]").strip()) > int(self._Capacity[i]):
-                                print "error"
+                                print ("error")
         if hasattr(self,"DefaultValues"):
             for demesionattr in self.DefaultValues:
                 deme = ArrayIndex.findall(demesionattr)
                 for i in range(len(deme)-1):
                     if int(deme[i].lstrip("[").rstrip("]").strip()) > int(self._Capacity[i]):
-                        print "error"
+                        print ("error")
         return self._Capacity
     @property
     def DatumType(self):
         return self._DatumType
 
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 13b2cef59d..a96502b4bf 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -154,11 +154,18 @@ def GetDependencyList(FileStack, SearchPathList):
 
             if len(FileContent) == 0:
                 continue
 
             if FileContent[0] == 0xff or FileContent[0] == 0xfe:
-                FileContent = unicode(FileContent, "utf-16")
+                FileContent = FileContent.decode('utf-16')
+                IncludedFileList = gIncludePattern.findall(FileContent)
+            else:
+                try:
+                    FileContent = str(FileContent)
+                    IncludedFileList = gIncludePattern.findall(FileContent)
+                except:
+                    pass
             IncludedFileList = gIncludePattern.findall(FileContent)
 
             for Inc in IncludedFileList:
                 Inc = Inc.strip()
                 Inc = os.path.normpath(Inc)
@@ -1613,11 +1620,11 @@ class DscBuildData(PlatformBuildClassObject):
         FdfInfList = []
         if GlobalData.gFdfParser:
             FdfInfList = GlobalData.gFdfParser.Profile.InfList
         FdfModuleList = [PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch) for Inf in FdfInfList]
         AllModulePcds = set()
-        ModuleSet = set(self._Modules.keys() + FdfModuleList)
+        ModuleSet = set(list(self._Modules.keys()) + FdfModuleList)
         for ModuleFile in ModuleSet:
             ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
             AllModulePcds = AllModulePcds | ModuleData.PcdsName
         for ModuleFile in self.LibraryInstances:
             ModuleData = self._Bdb.CreateBuildObject(ModuleFile, self._Arch, self._Target, self._Toolchain)
@@ -1741,11 +1748,11 @@ class DscBuildData(PlatformBuildClassObject):
         try:
             Process = subprocess.Popen(Command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
         except:
             EdkLogger.error('Build', COMMAND_FAILURE, 'Can not execute command: %s' % Command)
         Result = Process.communicate()
-        return Process.returncode, Result[0], Result[1]
+        return Process.returncode, Result[0].decode(encoding='utf-8', errors='ignore'), Result[1].decode(encoding='utf-8', errors='ignore')
 
     @staticmethod
     def IntToCString(Value, ValueSize):
         Result = '"'
         if not isinstance (Value, str):
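
The dependency scanner reads include files as bytes, and indexing bytes yields ints on Python 3, so a leading 0xFF or 0xFE byte (a UTF-16 BOM) selects decode('utf-16') before the include pattern is matched. A hedged sketch of that detection (the sample content is made up):

    FileContent = '!include Platform.dsc\n'.encode('utf-16')   # as read with open(..., 'rb')
    if FileContent[0] == 0xff or FileContent[0] == 0xfe:       # bytes indexing gives ints
        FileContent = FileContent.decode('utf-16')
    # gIncludePattern.findall(FileContent) then operates on str
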
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 6df0d3cdf8..c1e7033f5c 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -1993,14 +1993,14 @@ class DecParser(MetaFileParser):
                     self._ValueList = None
                     self._include_flag = False
                     return
 
                 if self._include_flag:
-                    self._ValueList[1] = "<HeaderFiles>_" + md5(self._CurrentLine).hexdigest()
+                    self._ValueList[1] = "<HeaderFiles>_" + md5(self._CurrentLine.encode('utf-8')).hexdigest()
                     self._ValueList[2] = self._CurrentLine
                 if self._package_flag and "}" != self._CurrentLine:
-                    self._ValueList[1] = "<Packages>_" + md5(self._CurrentLine).hexdigest()
+                    self._ValueList[1] = "<Packages>_" + md5(self._CurrentLine.encode('utf-8')).hexdigest()
                     self._ValueList[2] = self._CurrentLine
                 if self._CurrentLine == "}":
                     self._package_flag = False
                     self._include_flag = False
                     self._ValueList = None
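
hashlib.md5() accepts only bytes on Python 3, hence the encode('utf-8') on the current DEC line before the digest is folded into the value key. Sketch (the sample line is a placeholder):

    from hashlib import md5

    CurrentLine = 'UINT8  Reserved[4];'       # stands in for self._CurrentLine
    Key = "<HeaderFiles>_" + md5(CurrentLine.encode('utf-8')).hexdigest()
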
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 86c4c5bf7f..a385794cdf 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -141,11 +141,11 @@ VPDPcdList = []
 # @Wrapper                   Indicates whether to wrap the string
 #
 def FileWrite(File, String, Wrapper=False):
     if Wrapper:
         String = textwrap.fill(String, 120)
-    File.write(String + gEndOfLine)
+    File.append(String + gEndOfLine)
 
 def ByteArrayForamt(Value):
     IsByteArray = False
     SplitNum = 16
     ArrayList = []
@@ -634,11 +634,11 @@ class ModuleReport(object):
                 if Match:
                     self.Size = int(Match.group(1))
 
                 Match = gTimeStampPattern.search(FileContents)
                 if Match:
-                    self.BuildTimeStamp = datetime.fromtimestamp(int(Match.group(1)))
+                    self.BuildTimeStamp = datetime.utcfromtimestamp(int(Match.group(1)))
             except IOError:
                 EdkLogger.warn(None, "Fail to read report file", FwReportFileName)
 
         if "HASH" in ReportType:
             OutputDir = os.path.join(self._BuildDir, "OUTPUT")
@@ -719,12 +719,12 @@ class ModuleReport(object):
 def ReadMessage(From, To, ExitFlag):
     while True:
         # read one line a time
         Line = From.readline()
         # empty string means "end"
-        if Line is not None and Line != "":
-            To(Line.rstrip())
+        if Line is not None and Line != b"":
+            To(Line.rstrip().decode(encoding='utf-8', errors='ignore'))
         else:
             break
         if ExitFlag.isSet():
             break
 
@@ -2264,22 +2264,21 @@ class BuildReport(object):
     # @param GenFdsTime      The total time of GenFds phase
     #
     def GenerateReport(self, BuildDuration, AutoGenTime, MakeTime, GenFdsTime):
         if self.ReportFile:
             try:
-                File = BytesIO('')
+                File = []
                 for (Wa, MaList) in self.ReportList:
                     PlatformReport(Wa, MaList, self.ReportType).GenerateReport(File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, self.ReportType)
-                Content = FileLinesSplit(File.getvalue(), gLineMaxLength)
-                SaveFileOnChange(self.ReportFile, Content, True)
+                Content = FileLinesSplit(''.join(File), gLineMaxLength)
+                SaveFileOnChange(self.ReportFile, Content, False)
                 EdkLogger.quiet("Build report can be found at %s" % os.path.abspath(self.ReportFile))
             except IOError:
                 EdkLogger.error(None, FILE_WRITE_FAILURE, ExtraData=self.ReportFile)
             except:
                 EdkLogger.error("BuildReport", CODE_ERROR, "Unknown fatal error when generating build report", ExtraData=self.ReportFile, RaiseError=False)
                 EdkLogger.quiet("(Python %s on %s\n%s)" % (platform.python_version(), sys.platform, traceback.format_exc()))
-            File.close()
 
 # This acts like the main() function for the script, unless it is 'import'ed into another script.
 if __name__ == '__main__':
     pass
 
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 8d3761313f..1afd9cf7ed 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -18,11 +18,10 @@
 # Import Modules
 #
 from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
-from io import BytesIO
 import sys
 import glob
 import time
 import platform
 import traceback
@@ -217,12 +216,12 @@ def NormFile(FilePath, Workspace):
 def ReadMessage(From, To, ExitFlag):
     while True:
         # read one line a time
         Line = From.readline()
         # empty string means "end"
-        if Line is not None and Line != "":
-            To(Line.rstrip())
+        if Line is not None and Line != b"":
+            To(Line.rstrip().decode(encoding='utf-8', errors='ignore'))
         else:
             break
         if ExitFlag.isSet():
             break
 
@@ -1445,15 +1444,15 @@ class Build():
                 ImageMap.close()
             #
             # Add general information.
             #
             if ModeIsSmm:
-                MapBuffer.write('\n\n%s (Fixed SMRAM Offset,   BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
+                MapBuffer.append('\n\n%s (Fixed SMRAM Offset,   BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
             elif AddrIsOffset:
-                MapBuffer.write('\n\n%s (Fixed Memory Offset,  BaseAddress=-0x%010X, EntryPoint=-0x%010X)\n' % (ModuleName, 0 - BaseAddress, 0 - (BaseAddress + ModuleInfo.Image.EntryPoint)))
+                MapBuffer.append('\n\n%s (Fixed Memory Offset,  BaseAddress=-0x%010X, EntryPoint=-0x%010X)\n' % (ModuleName, 0 - BaseAddress, 0 - (BaseAddress + ModuleInfo.Image.EntryPoint)))
             else:
-                MapBuffer.write('\n\n%s (Fixed Memory Address, BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
+                MapBuffer.append('\n\n%s (Fixed Memory Address, BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
             #
             # Add guid and general seciton section.
             #
             TextSectionAddress = 0
             DataSectionAddress = 0
@@ -1461,25 +1460,25 @@ class Build():
                 if SectionHeader[0] == '.text':
                     TextSectionAddress = SectionHeader[1]
                 elif SectionHeader[0] in ['.data', '.sdata']:
                     DataSectionAddress = SectionHeader[1]
             if AddrIsOffset:
-                MapBuffer.write('(GUID=%s, .textbaseaddress=-0x%010X, .databaseaddress=-0x%010X)\n' % (ModuleInfo.Guid, 0 - (BaseAddress + TextSectionAddress), 0 - (BaseAddress + DataSectionAddress)))
+                MapBuffer.append('(GUID=%s, .textbaseaddress=-0x%010X, .databaseaddress=-0x%010X)\n' % (ModuleInfo.Guid, 0 - (BaseAddress + TextSectionAddress), 0 - (BaseAddress + DataSectionAddress)))
             else:
-                MapBuffer.write('(GUID=%s, .textbaseaddress=0x%010X, .databaseaddress=0x%010X)\n' % (ModuleInfo.Guid, BaseAddress + TextSectionAddress, BaseAddress + DataSectionAddress))
+                MapBuffer.append('(GUID=%s, .textbaseaddress=0x%010X, .databaseaddress=0x%010X)\n' % (ModuleInfo.Guid, BaseAddress + TextSectionAddress, BaseAddress + DataSectionAddress))
             #
             # Add debug image full path.
             #
-            MapBuffer.write('(IMAGE=%s)\n\n' % (ModuleDebugImage))
+            MapBuffer.append('(IMAGE=%s)\n\n' % (ModuleDebugImage))
             #
             # Add funtion address
             #
             for Function in FunctionList:
                 if AddrIsOffset:
-                    MapBuffer.write('  -0x%010X    %s\n' % (0 - (BaseAddress + Function[1]), Function[0]))
+                    MapBuffer.append('  -0x%010X    %s\n' % (0 - (BaseAddress + Function[1]), Function[0]))
                 else:
-                    MapBuffer.write('  0x%010X    %s\n' % (BaseAddress + Function[1], Function[0]))
+                    MapBuffer.append('  0x%010X    %s\n' % (BaseAddress + Function[1], Function[0]))
             ImageMap.close()
 
             #
             # for SMM module in SMRAM, the SMRAM will be allocated from base to top.
             #
@@ -1510,19 +1509,19 @@ class Build():
                         # Replace GUID with module name
                         #
                         GuidString = MatchGuid.group()
                         if GuidString.upper() in ModuleList:
                             Line = Line.replace(GuidString, ModuleList[GuidString.upper()].Name)
-                    MapBuffer.write(Line)
+                    MapBuffer.append(Line)
                     #
                     # Add the debug image full path.
                     #
                     MatchGuid = GuidName.match(Line)
                     if MatchGuid is not None:
                         GuidString = MatchGuid.group().split("=")[1]
                         if GuidString.upper() in ModuleList:
-                            MapBuffer.write('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper()].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
+                            MapBuffer.append('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper()].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
 
                 FvMap.close()
 
     ## Collect MAP information of all modules
     #
@@ -1634,25 +1633,25 @@ class Build():
                 elif PcdInfo[0] == TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE and len (SmmModuleList) > 0:
                     ReturnValue, ErrorInfo = PatchBinaryFile (EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE_DATA_TYPE, str (SmmSize // 0x1000))
                 if ReturnValue != 0:
                     EdkLogger.error("build", PARAMETER_INVALID, "Patch PCD value failed", ExtraData=ErrorInfo)
 
-        MapBuffer.write('PEI_CODE_PAGE_NUMBER      = 0x%x\n' % (PeiSize // 0x1000))
-        MapBuffer.write('BOOT_CODE_PAGE_NUMBER     = 0x%x\n' % (BtSize // 0x1000))
-        MapBuffer.write('RUNTIME_CODE_PAGE_NUMBER  = 0x%x\n' % (RtSize // 0x1000))
+        MapBuffer.append('PEI_CODE_PAGE_NUMBER      = 0x%x\n' % (PeiSize // 0x1000))
+        MapBuffer.append('BOOT_CODE_PAGE_NUMBER     = 0x%x\n' % (BtSize // 0x1000))
+        MapBuffer.append('RUNTIME_CODE_PAGE_NUMBER  = 0x%x\n' % (RtSize // 0x1000))
         if len (SmmModuleList) > 0:
-            MapBuffer.write('SMM_CODE_PAGE_NUMBER      = 0x%x\n' % (SmmSize // 0x1000))
+            MapBuffer.append('SMM_CODE_PAGE_NUMBER      = 0x%x\n' % (SmmSize // 0x1000))
 
         PeiBaseAddr = TopMemoryAddress - RtSize - BtSize
         BtBaseAddr  = TopMemoryAddress - RtSize
         RtBaseAddr  = TopMemoryAddress - ReservedRuntimeMemorySize
 
         self._RebaseModule (MapBuffer, PeiBaseAddr, PeiModuleList, TopMemoryAddress == 0)
         self._RebaseModule (MapBuffer, BtBaseAddr, BtModuleList, TopMemoryAddress == 0)
         self._RebaseModule (MapBuffer, RtBaseAddr, RtModuleList, TopMemoryAddress == 0)
         self._RebaseModule (MapBuffer, 0x1000, SmmModuleList, AddrIsOffset=False, ModeIsSmm=True)
-        MapBuffer.write('\n\n')
+        MapBuffer.append('\n\n')
         sys.stdout.write ("\n")
         sys.stdout.flush()
 
     ## Save platform Map file
     #
@@ -1662,12 +1661,11 @@ class Build():
         #
         MapFilePath = os.path.join(Wa.BuildDir, Wa.Name + '.map')
         #
         # Save address map into MAP file.
         #
-        SaveFileOnChange(MapFilePath, MapBuffer.getvalue(), False)
-        MapBuffer.close()
+        SaveFileOnChange(MapFilePath, ''.join(MapBuffer), False)
         if self.LoadFixAddress != 0:
             sys.stdout.write ("\nLoad Module At Fix Address Map file can be found at %s\n" % (MapFilePath))
         sys.stdout.flush()
 
     ## Build active platform for different build targets and different tool chains
@@ -1738,11 +1736,11 @@ class Build():
                             if Ma is None:
                                 continue
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
 
-                    MapBuffer = BytesIO('')
+                    MapBuffer = []
                     if self.LoadFixAddress != 0:
                         #
                         # Rebase module to the preferred memory address before GenFds
                         #
                         self._CollectModuleMapBuffer(MapBuffer, ModuleList)
@@ -1896,11 +1894,11 @@ class Build():
                             if Ma is None:
                                 continue
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
 
-                    MapBuffer = BytesIO('')
+                    MapBuffer = []
                     if self.LoadFixAddress != 0:
                         #
                         # Rebase module to the preferred memory address before GenFds
                         #
                         self._CollectModuleMapBuffer(MapBuffer, ModuleList)
@@ -2077,11 +2075,11 @@ class Build():
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
                     #
                     # Rebase module to the preferred memory address before GenFds
                     #
-                    MapBuffer = BytesIO('')
+                    MapBuffer = []
                     if self.LoadFixAddress != 0:
                         self._CollectModuleMapBuffer(MapBuffer, ModuleList)
 
                     if self.Fdf:
                         #
-- 
2.20.1.windows.1




Thread overview: 42+ messages
2019-01-25  4:55 [Patch 00/33] BaseTools python3 migration patch set Feng, Bob C
2019-01-25  4:55 ` [Patch 01/33] BaseTool:Rename xrange() to range() Feng, Bob C
2019-01-25  4:55 ` [Patch 02/33] BaseTools:use iterate list to replace the itertools Feng, Bob C
2019-01-25  4:55 ` [Patch 03/33] BaseTools: Rename iteritems to items Feng, Bob C
2019-01-25  4:55 ` [Patch 04/33] BaseTools: replace get_bytes_le() to bytes_le Feng, Bob C
2019-01-25  4:55 ` [Patch 05/33] BaseTools: use OrderedDict instead of sdict Feng, Bob C
2019-01-25  4:55 ` [Patch 06/33] BaseTools: nametuple not have verbose parameter in python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 07/33] BaseTools: Remove unnecessary super function Feng, Bob C
2019-01-25  4:56 ` [Patch 08/33] BaseTools: replace long by int Feng, Bob C
2019-01-25  4:56 ` [Patch 09/33] BaseTools:Solve the data sorting problem use python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 10/33] BaseTools: Update argparse arguments since it not have version now Feng, Bob C
2019-01-25  4:56 ` [Patch 11/33] BaseTools:Similar to octal data rectification Feng, Bob C
2019-01-25  4:56 ` [Patch 12/33] BaseTools/UPT:merge UPT Tool use Python2 and Python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 13/33] BaseTools: update Test scripts support python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 14/33] BaseTools/Scripts: Porting PackageDocumentTools code to use Python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 15/33] Basetools: It went wrong when use os.linesep Feng, Bob C
2019-01-25  4:56 ` [Patch 16/33] BaseTools:Fv BaseAddress must set If it not set Feng, Bob C
2019-01-25  4:56 ` [Patch 17/33] BaseTools: Make sure AllPcdList valid Feng, Bob C
2019-01-25  4:56 ` [Patch 18/33] BaseTools:TestTools character encoding issue Feng, Bob C
2019-01-25  4:56 ` [Patch 19/33] BaseTools:Double carriage return inserted from Trim.py on Python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 20/33] BaseTools:File open failed for VPD MapFile Feng, Bob C
2019-01-25  4:56 ` [Patch 21/33] BaseTools: change the Division Operator Feng, Bob C
2019-01-25  4:56 ` [Patch 22/33] BaseTools:There is extra blank line in datalog Feng, Bob C
2019-01-25  4:56 ` [Patch 23/33] BaseTools: Similar to octal data rectification Feng, Bob C
2019-01-25  4:56 ` [Patch 24/33] BaseTools: Update windows and linux run scripts file to use Python3 Feng, Bob C
2019-01-25  4:56 ` [Patch 25/33] BaseTools:Update build tool to print python version information Feng, Bob C
2019-01-25  4:56 ` [Patch 26/33] BaseTools:Linux Python highest version check Feng, Bob C
2019-01-25  4:56 ` [Patch 27/33] BaseTools: Update PYTHON env to PYTHON_COMMAND Feng, Bob C
2019-01-25  4:56 ` [Patch 28/33] BaseTools:Fixed Rsa issue and a set define issue Feng, Bob C
2019-01-25  4:56 ` [Patch 29/33] BaseTools:ord() don't match in py2 and py3 Feng, Bob C
2019-01-25  4:56 ` [Patch 30/33] BaseTools: the list and iterator translation Feng, Bob C
2019-01-25  4:56 ` Feng, Bob C [this message]
2019-01-25  4:56 ` [Patch 32/33] BaseTools: ECC tool Python3 adaption Feng, Bob C
2019-01-25  4:56 ` [Patch 33/33] BaseTools: Eot " Feng, Bob C
2019-01-25  8:56 ` [Patch 00/33] BaseTools python3 migration patch set Laszlo Ersek
2019-01-25  9:42   ` Feng, Bob C
2019-01-25 18:18     ` Laszlo Ersek
2019-01-28  2:33       ` Feng, Bob C
2019-01-28 10:35       ` Feng, Bob C
2019-01-28 13:48         ` Laszlo Ersek
2019-01-29  2:15           ` Feng, Bob C
  -- strict thread matches above, loose matches on Subject: below --
2019-01-29  2:05 [Patch v2 " Feng, Bob C
2019-01-29  2:06 ` [Patch 31/33] BaseTools: Handle the bytes and str difference Feng, Bob C
