From: Jaben Carsey <jaben.carsey@intel.com>
To: edk2-devel@lists.01.org
Cc: Liming Gao <liming.gao@intel.com>, Yonghong Zhu <yonghong.zhu@intel.com>
Subject: [PATCH v1 07/11] BaseTools: refactor file opening/writing
Date: Wed, 20 Jun 2018 14:08:13 -0700
Message-ID: <605bc234de703e41e099fb1105678f9fb8411d28.1529528784.git.jaben.carsey@intel.com>
In-Reply-To: <cover.1529528783.git.jaben.carsey@intel.com>

Change file opening modes to the minimal permissions needed.
Change file read/write code to use the 'with' statement.
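
As an illustration of the pattern being applied throughout (a generic
sketch, not copied verbatim from any one file in this patch):

    # before: explicit close, and often a wider mode than needed (e.g. 'w+')
    Fd = open(FileName, 'w+')
    Fd.write(Content)
    Fd.close()

    # after: minimal mode; the context manager closes the file even on error
    with open(FileName, 'w') as Fd:
        Fd.write(Content)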

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                   |  57 ++++-----
 BaseTools/Source/Python/AutoGen/GenC.py                      |   5 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                   |   5 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py            |  21 ++--
 BaseTools/Source/Python/AutoGen/StrGather.py                 |  18 +--
 BaseTools/Source/Python/AutoGen/UniClassObject.py            |   5 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py   |   5 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                       |  30 +++--
 BaseTools/Source/Python/Common/Misc.py                       |  39 +++----
 BaseTools/Source/Python/Common/TargetTxtClassObject.py       |  13 +--
 BaseTools/Source/Python/Common/ToolDefClassObject.py         |   4 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                |   4 +-
 BaseTools/Source/Python/Ecc/Ecc.py                           |   4 +-
 BaseTools/Source/Python/Eot/EotGlobalData.py                 |  14 +--
 BaseTools/Source/Python/Eot/FileProfile.py                   |   8 +-
 BaseTools/Source/Python/Eot/Report.py                        |   6 +-
 BaseTools/Source/Python/GenFds/Capsule.py                    |   7 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                |  10 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                  |  20 +---
 BaseTools/Source/Python/GenFds/FfsFileStatement.py           |   5 +-
 BaseTools/Source/Python/GenFds/Fv.py                         |  37 +++---
 BaseTools/Source/Python/GenFds/FvImageSection.py             |   8 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py       |  53 ++++-----
 BaseTools/Source/Python/GenFds/GuidSection.py                |  45 ++++----
 BaseTools/Source/Python/GenFds/Region.py                     |  15 +--
 BaseTools/Source/Python/GenFds/Vtf.py                        | 122 ++++++++++----------
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py |  21 ++--
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py       |  19 ++-
 BaseTools/Source/Python/Table/TableReport.py                 |  47 ++++----
 BaseTools/Source/Python/TargetTool/TargetTool.py             |  97 ++++++++--------
 BaseTools/Source/Python/Trim/Trim.py                         |  67 +++++------
 BaseTools/Source/Python/Workspace/DscBuildData.py            |  17 +--
 BaseTools/Source/Python/build/BuildReport.py                 |  81 +++++++------
 BaseTools/Source/Python/build/build.py                       |  16 +--
 34 files changed, 401 insertions(+), 524 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index b7dd086e28a8..4ce39cf9583a 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -675,10 +675,8 @@ class WorkspaceAutoGen(AutoGen):
             for files in AllWorkSpaceMetaFiles:
                 if files.endswith('.dec'):
                     continue
-                f = open(files, 'r')
-                Content = f.read()
-                f.close()
-                m.update(Content)
+                with open(files, 'r') as f:
+                    m.update(f.read())
             SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), True)
             GlobalData.gPlatformHash = m.hexdigest()
 
@@ -686,11 +684,11 @@ class WorkspaceAutoGen(AutoGen):
         # Write metafile list to build directory
         #
         AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
-        if os.path.exists (AutoGenFilePath):
-            os.remove(AutoGenFilePath)
         if not os.path.exists(self.BuildDir):
             os.makedirs(self.BuildDir)
-        with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
+        elif os.path.exists (AutoGenFilePath):
+            os.remove(AutoGenFilePath)
+        with open(AutoGenFilePath, 'w') as file:
             for f in AllWorkSpaceMetaFiles:
                 print >> file, f
         return True
@@ -704,20 +702,16 @@ class WorkspaceAutoGen(AutoGen):
         HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
         m = hashlib.md5()
         # Get .dec file's hash value
-        f = open(Pkg.MetaFile.Path, 'r')
-        Content = f.read()
-        f.close()
-        m.update(Content)
+        with open(Pkg.MetaFile.Path, 'r') as f:
+            m.update(f.read())
         # Get include files hash value
         if Pkg.Includes:
             for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
                 for Root, Dirs, Files in os.walk(str(inc)):
                     for File in sorted(Files):
                         File_Path = os.path.join(Root, File)
-                        f = open(File_Path, 'r')
-                        Content = f.read()
-                        f.close()
-                        m.update(Content)
+                        with open(File_Path, 'r') as f:
+                            m.update(f.read())
         SaveFileOnChange(HashFile, m.hexdigest(), True)
         GlobalData.gPackageHash[Pkg.Arch][Pkg.PackageName] = m.hexdigest()
 
@@ -3641,9 +3635,8 @@ class ModuleAutoGen(AutoGen):
             Vfri = os.path.join(self.OutputDir, SrcFile.BaseName + '.i')
             if not os.path.exists(Vfri):
                 continue
-            VfriFile = open(Vfri, 'r')
-            Content = VfriFile.read()
-            VfriFile.close()
+            with open(Vfri, 'r') as VfriFile:
+                Content = VfriFile.read()
             Pos = Content.find('efivarstore')
             while Pos != -1:
                 #
@@ -3711,11 +3704,6 @@ class ModuleAutoGen(AutoGen):
         OutputName = '%sOffset.bin' % self.Name
         UniVfrOffsetFileName    =  os.path.join( self.OutputDir, OutputName)
 
-        try:
-            fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
-        except:
-            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
-
         # Use a instance of StringIO to cache data
         fStringIO = StringIO('')  
 
@@ -3742,17 +3730,19 @@ class ModuleAutoGen(AutoGen):
                 fStringIO.write(''.join(VfrGuid))                   
                 VfrValue = pack ('Q', int (Item[1], 16))
                 fStringIO.write (VfrValue)
-        #
-        # write data into file.
-        #
-        try :  
-            fInputfile.write (fStringIO.getvalue())
-        except:
-            EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
-                            "file been locked or using by other applications." %UniVfrOffsetFileName,None)
+
+        try:
+            with open(UniVfrOffsetFileName, "wb", 0) as fInputfile:
+                # write data into file.
+                try:
+                    fInputfile.write (fStringIO.getvalue())
+                except:
+                    EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
+                                    "file been locked or using by other applications." %UniVfrOffsetFileName,None)
+        except:
+            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
 
         fStringIO.close ()
-        fInputfile.close ()
         return OutputName
 
     ## Create AsBuilt INF file the module
@@ -4130,9 +4120,8 @@ class ModuleAutoGen(AutoGen):
         FileDir = path.join(GlobalData.gBinCacheSource, self.Arch, self.SourceDir, self.MetaFile.BaseName)
         HashFile = path.join(FileDir, self.Name + '.hash')
         if os.path.exists(HashFile):
-            f = open(HashFile, 'r')
-            CacheHash = f.read()
-            f.close()
+            with open(HashFile, 'r') as f:
+                CacheHash = f.read()
             if GlobalData.gModuleHash[self.Arch][self.Name]:
                 if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
                     for root, dir, files in os.walk(FileDir):
@@ -4296,17 +4285,13 @@ class ModuleAutoGen(AutoGen):
                 m.update(GlobalData.gModuleHash[self.Arch][Lib.Name])
 
         # Add Module self
-        f = open(str(self.MetaFile), 'r')
-        Content = f.read()
-        f.close()
-        m.update(Content)
+        with open(str(self.MetaFile), 'r') as f:
+            m.update(f.read())
         # Add Module's source files
         if self.SourceFileList:
             for File in sorted(self.SourceFileList, key=lambda x: str(x)):
-                f = open(str(File), 'r')
-                Content = f.read()
-                f.close()
-                m.update(Content)
+                with open(str(File), 'r') as f:
+                    m.update(f.read())
 
         ModuleHashFile = path.join(self.BuildDir, self.Name + ".hash")
         if self.Name not in GlobalData.gModuleHash[self.Arch]:
@@ -4364,7 +4349,7 @@ class ModuleAutoGen(AutoGen):
 
         if os.path.exists (self.GetTimeStampPath()):
             os.remove (self.GetTimeStampPath())
-        with open(self.GetTimeStampPath(), 'w+') as file:
+        with open(self.GetTimeStampPath(), 'w') as file:
             for f in FileSet:
                 print >> file, f
 
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index ae3af085a16b..c6395a1b7da3 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1814,9 +1814,8 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                                 Index += 1
                                 continue
 
-                            TmpFile = open(File.Path, 'rb')
-                            Buffer = TmpFile.read()
-                            TmpFile.close()
+                            with open(File.Path, 'rb') as f:
+                                Buffer = f.read()
                             if File.Ext.upper() == '.PNG':
                                 TempBuffer = pack('B', EFI_HII_IIBT_IMAGE_PNG)
                                 TempBuffer += pack('I', len(Buffer))
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index f19c1bfdaff1..59e195ddf1ef 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1026,12 +1026,11 @@ cleanlib:
                 CurrentFileDependencyList = DepDb[F]
             else:
                 try:
-                    Fd = open(F.Path, 'r')
+                    with open(F.Path, 'r') as f:
+                        FileContent = f.read()
                 except BaseException, X:
                     EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
 
-                FileContent = Fd.read()
-                Fd.close()
                 if len(FileContent) == 0:
                     continue
 
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index e5b933c2036f..dbd9dfe30c20 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to collect all defined strings in Image Definition files
 #
-# Copyright (c) 2016, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2016 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -69,13 +69,12 @@ class IdfFileClassObject(object):
         self.ImageFilesDict = {}
         self.ImageIDList = []
         for File in FileList:
-            if File is None:
+            if not File:
                 EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'No Image definition file is given.')
 
             try:
-                IdfFile = open(LongFilePath(File.Path), mode='r')
-                FileIn = IdfFile.read()
-                IdfFile.close()
+                with open(LongFilePath(File.Path), mode='r') as f:
+                    FileIn = f.read()
             except:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File)
 
@@ -118,12 +117,12 @@ def SearchImageID(ImageFileObject, FileList):
 
     for File in FileList:
         if os.path.isfile(File):
-            Lines = open(File, 'r')
-            for Line in Lines:
-                ImageIdList = IMAGE_TOKEN.findall(Line)
-                for ID in ImageIdList:
-                    EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: " + ID)
-                    ImageFileObject.SetImageIDReferenced(ID)
+            with open(File, 'r') as f:
+                for Line in f:
+                    ImageIdList = IMAGE_TOKEN.findall(Line)
+                    for ID in ImageIdList:
+                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: " + ID)
+                        ImageFileObject.SetImageIDReferenced(ID)
 
 class ImageFileObject(object):
     def __init__(self, FileName, ImageID, TransParent = False):
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index d41bcb3f7137..082d8cdc5f3c 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -529,11 +529,11 @@ def SearchString(UniObjectClass, FileList, IsCompatibleMode):
 
     for File in FileList:
         if os.path.isfile(File):
-            Lines = open(File, 'r')
-            for Line in Lines:
-                for StrName in STRING_TOKEN.findall(Line):
-                    EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: " + StrName)
-                    UniObjectClass.SetStringReferenced(StrName)
+            with open(File, 'r') as f:
+                for Line in f:
+                    for StrName in STRING_TOKEN.findall(Line):
+                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: " + StrName)
+                        UniObjectClass.SetStringReferenced(StrName)
 
     UniObjectClass.ReToken()
 
@@ -603,9 +603,9 @@ if __name__ == '__main__':
     SkipList = ['.inf', '.uni']
     BaseName = 'DriverSample'
     (h, c) = GetStringFiles(UniFileList, SrcFileList, IncludeList, SkipList, BaseName, True)
-    hfile = open('unistring.h', 'w')
-    cfile = open('unistring.c', 'w')
-    hfile.write(h)
-    cfile.write(c)
+    with open('unistring.h', 'w') as f:
+        f.write(h)
+    with open('unistring.c', 'w') as f:
+        f.write(c)
 
     EdkLogger.info('end')
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 5a3c2547783b..b1b9c96c39bb 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -303,9 +303,8 @@ class UniFileClassObject(object):
         # Read file
         #
         try:
-            UniFile = open(FileName, mode='rb')
-            FileIn = UniFile.read()
-            UniFile.close()
+            with open(FileName, mode='rb') as f:
+                FileIn = f.read()
         except:
             EdkLogger.Error("build", FILE_OPEN_FAILURE, ExtraData=File)
 
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 3b54865000bf..2c6bb8e396a9 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -165,9 +165,8 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
         
         DbFile = StringIO()
         if Phase == 'DXE' and os.path.exists(BinFilePath):
-            BinFile = open(BinFilePath, "rb")
-            BinBuffer = BinFile.read()
-            BinFile.close()
+            with open(BinFilePath, "rb") as f:
+                BinBuffer = f.read()
             BinBufferSize = len(BinBuffer)
             if (BinBufferSize % 4):
                 for i in range(4 - (BinBufferSize % 4)):
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 4fa12b7d59de..dba815415f92 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -323,13 +323,11 @@ class GenVPD :
         self.PcdFixedOffsetSizeList  = []
         self.PcdUnknownOffsetList    = []
         try:
-            fInputfile = open(InputFileName, "r", 0)
-            try:
-                self.FileLinesList = fInputfile.readlines()
-            except:
-                EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE, "File read failed for %s" % InputFileName, None)
-            finally:
-                fInputfile.close()
+            with open(InputFileName, "r", 0) as f:
+                try:
+                    self.FileLinesList = f.readlines()
+                except:
+                    EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE, "File read failed for %s" % InputFileName, None)
         except:
             EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % InputFileName, None)
 
@@ -661,12 +659,6 @@ class GenVPD :
     def GenerateVpdFile (self, MapFileName, BinFileName):
         #Open an VPD file to process
 
-        try:
-            fVpdFile = open(BinFileName, "wb", 0)
-        except:
-            # Open failed
-            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.VpdFileName, None)
-
         try :
             fMapFile = open(MapFileName, "w", 0)
         except:
@@ -697,12 +689,16 @@ class GenVPD :
             else:
                 fStringIO.write (eachPcd.PcdValue)
 
-        try :
-            fVpdFile.write (fStringIO.getvalue())
+        try:
+            with open(BinFileName, "wb", 0) as fVpdFile:
+                try :
+                    fVpdFile.write (fStringIO.getvalue())
+                except:
+                    EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.VpdFileName, None)
         except:
-            EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.VpdFileName, None)
+            # Open failed
+            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.VpdFileName, None)
 
         fStringIO.close ()
-        fVpdFile.close ()
         fMapFile.close ()
         
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 8fba734568bd..148cbe2cf99d 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -63,15 +63,14 @@ def GetVariableOffset(mapfilepath, efifilepath, varnames):
     
     @return List whos elements are tuple with variable name and raw offset
     """
-    lines = []
     try:
-        f = open(mapfilepath, 'r')
-        lines = f.readlines()
-        f.close()
+        with open(mapfilepath, 'r') as f:
+            lines = f.readlines()
     except:
         return None
     
-    if len(lines) == 0: return None
+    if len(lines) == 0: 
+        return None
     firstline = lines[0].strip()
     if (firstline.startswith("Archive member included ") and
         firstline.endswith(" file (symbol)")):
@@ -471,13 +470,11 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
                 if not SaveFileToDisk(File, Content):
                     EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData=File)
             except:
-                Fd = open(File, "wb")
+                with open(File, "wb") as Fd:
+                    Fd.write(Content)
+        else:
+            with open(File, "wb") as Fd:
                 Fd.write(Content)
-                Fd.close()
-        else:
-            Fd = open(File, "wb")
-            Fd.write(Content)
-            Fd.close()
     except IOError, X:
         EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
 
@@ -489,15 +486,11 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
 #   @param      File    The path of file to store the object
 #
 def DataDump(Data, File):
-    Fd = None
     try:
-        Fd = open(File, 'wb')
-        cPickle.dump(Data, Fd, cPickle.HIGHEST_PROTOCOL)
+        with open(File, 'wb') as Fd:
+            cPickle.dump(Data, Fd, cPickle.HIGHEST_PROTOCOL)
     except:
         EdkLogger.error("", FILE_OPEN_FAILURE, ExtraData=File, RaiseError=False)
-    finally:
-        if Fd is not None:
-            Fd.close()
 
 ## Restore a Python object from a file
 #
@@ -507,18 +500,12 @@ def DataDump(Data, File):
 #   @retval     None    If failure in file operation
 #
 def DataRestore(File):
-    Data = None
-    Fd = None
     try:
-        Fd = open(File, 'rb')
-        Data = cPickle.load(Fd)
+        with open(File, 'rb') as Fd:
+            return cPickle.load(Fd)
     except Exception, e:
         EdkLogger.verbose("Failed to load [%s]\n\t%s" % (File, str(e)))
-        Data = None
-    finally:
-        if Fd is not None:
-            Fd.close()
-    return Data
+    return None
 
 ## Retrieve and cache the real path name in file system
 #
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index f8459c892e36..6f5e5f0d173d 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -77,16 +77,14 @@ class TargetTxtClassObject(object):
     # @retval 1 Open file failed
     #
     def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
-        F = None
+        self.ConfDirectoryPath = os.path.dirname(FileName)
         try:
-            F = open(FileName, 'r')
-            self.ConfDirectoryPath = os.path.dirname(FileName)
+            with open(FileName, 'r') as F:
+                Lines = F.readlines()
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=FileName)
-            if F is not None:
-                F.close()
 
-        for Line in F:
+        for Line in Lines:
             Line = Line.strip()
             if Line.startswith(CommentCharacter) or Line == '':
                 continue
@@ -131,10 +129,7 @@ class TargetTxtClassObject(object):
                     EdkLogger.error("build", FORMAT_INVALID, "Invalid number of [%s]: %s." % (Key, Value),
                                     File=FileName)
                 self.TargetTxtDictionary[Key] = Value
-            #elif Key not in GlobalData.gGlobalDefines:
-            #    GlobalData.gGlobalDefines[Key] = Value
 
-        F.close()
         return 0
 
 ## TargetTxtDict
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index dd985ab30359..feac169df5c8 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -117,8 +117,8 @@ class ToolDefClassObject(object):
         FileContent = []
         if os.path.isfile(FileName):
             try:
-                F = open(FileName, 'r')
-                FileContent = F.readlines()
+                with open(FileName, 'r') as F:
+                    FileContent = F.readlines()
             except:
                 EdkLogger.error("tools_def.txt parser", FILE_OPEN_FAILURE, ExtraData=FileName)
         else:
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 2b447772eafe..117be45cf20c 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -153,12 +153,12 @@ class VpdInfoFile:
     #  @param FilePath The full path string for existing VPD PCD info file.
     def Read(self, FilePath):
         try:
-            fd = open(FilePath, "r")
+            with open(FilePath, "r") as fd:
+                Lines = fd.readlines()
         except:
             EdkLogger.error("VpdInfoFile", 
                             BuildToolError.FILE_OPEN_FAILURE, 
                             "Fail to open file %s for written." % FilePath)
-        Lines = fd.readlines()
         for Line in Lines:
             Line = Line.strip()
             if len(Line) == 0 or Line.startswith("#"):
diff --git a/BaseTools/Source/Python/Ecc/Ecc.py b/BaseTools/Source/Python/Ecc/Ecc.py
index e78d70372e36..c4d9b36ac228 100644
--- a/BaseTools/Source/Python/Ecc/Ecc.py
+++ b/BaseTools/Source/Python/Ecc/Ecc.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to be the main entrance of ECC tool
 #
-# Copyright (c) 2009 - 2016, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2009 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -201,7 +201,7 @@ class Ecc(object):
             for specificDir in SpecificDirs:    
                 ScanFolders.append(os.path.join(EccGlobalData.gTarget, specificDir))
         EdkLogger.quiet("Building database for meta data files ...")
-        Op = open(EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w+')
+        Op = open(EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w')
         #SkipDirs = Read from config file
         SkipDirs = EccGlobalData.gConfig.SkipDirList
         SkipDirString = string.join(SkipDirs, '|')
diff --git a/BaseTools/Source/Python/Eot/EotGlobalData.py b/BaseTools/Source/Python/Eot/EotGlobalData.py
index cb6a940ab8f9..a13c7e8f5851 100644
--- a/BaseTools/Source/Python/Eot/EotGlobalData.py
+++ b/BaseTools/Source/Python/Eot/EotGlobalData.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to save global datas
 #
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -38,27 +38,27 @@ gMACRO['CAPSULE_INF'] = ''
 
 # Log file for unmatched variables
 gUN_MATCHED_LOG = 'Log_UnMatched.log'
-gOP_UN_MATCHED = open(gUN_MATCHED_LOG, 'w+')
+gOP_UN_MATCHED = open(gUN_MATCHED_LOG, 'w')
 
 # Log file for all INF files
 gINF_FILES = 'Log_Inf_File.log'
-gOP_INF = open(gINF_FILES, 'w+')
+gOP_INF = open(gINF_FILES, 'w')
 
 # Log file for not dispatched PEIM/DRIVER
 gUN_DISPATCHED_LOG = 'Log_UnDispatched.log'
-gOP_UN_DISPATCHED = open(gUN_DISPATCHED_LOG, 'w+')
+gOP_UN_DISPATCHED = open(gUN_DISPATCHED_LOG, 'w')
 
 # Log file for unmatched variables in function calling
 gUN_MATCHED_IN_LIBRARY_CALLING_LOG = 'Log_UnMatchedInLibraryCalling.log'
-gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w+')
+gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w')
 
 # Log file for order of dispatched PEIM/DRIVER
 gDISPATCH_ORDER_LOG = 'Log_DispatchOrder.log'
-gOP_DISPATCH_ORDER = open(gDISPATCH_ORDER_LOG, 'w+')
+gOP_DISPATCH_ORDER = open(gDISPATCH_ORDER_LOG, 'w')
 
 # Log file for found source files
 gSOURCE_FILES = 'Log_SourceFiles.log'
-gOP_SOURCE_FILES = open(gSOURCE_FILES, 'w+')
+gOP_SOURCE_FILES = open(gSOURCE_FILES, 'w')
 
 # Dict for GUID found in DEC files
 gGuidDict = dict()
diff --git a/BaseTools/Source/Python/Eot/FileProfile.py b/BaseTools/Source/Python/Eot/FileProfile.py
index 0544c0d55b44..bf6a4c054baa 100644
--- a/BaseTools/Source/Python/Eot/FileProfile.py
+++ b/BaseTools/Source/Python/Eot/FileProfile.py
@@ -1,7 +1,7 @@
 ## @file
 # fragments of source file
 #
-#  Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -49,11 +49,7 @@ class FileProfile :
         self.FileLinesList = []
         self.FileLinesListFromFile = []
         try:
-            fsock = open(FileName, "rb", 0)
-            try:
+            with open(FileName, "rb", 0) as fsock:
                 self.FileLinesListFromFile = fsock.readlines()
-            finally:
-                fsock.close()
-
         except IOError:
             raise Warning("Error when opening file %s" % FileName)
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 7435b4d7c930..99b8b152180a 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create report for Eot tool
 #
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -33,10 +33,10 @@ class Report(object):
     #
     def __init__(self, ReportName = 'Report.html', FvObj = None, DispatchName=None):
         self.ReportName = ReportName
-        self.Op = open(ReportName, 'w+')
+        self.Op = open(ReportName, 'w')
         self.DispatchList = None
         if DispatchName:
-            self.DispatchList = open(DispatchName, 'w+')
+            self.DispatchList = open(DispatchName, 'w')
         self.FvObj = FvObj
         self.FfsIndex = 0
         self.PpiIndex = 0
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index fbd48f3c6d76..6aae2fcb7d97 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -137,9 +137,8 @@ class Capsule (CapsuleClassObject) :
             FileName = driver.GenCapsuleSubItem()
             FwMgrHdr.write(pack('=Q', PreSize))
             PreSize += os.path.getsize(FileName)
-            File = open(FileName, 'rb')
-            Content.write(File.read())
-            File.close()
+            with open(FileName, 'rb') as File:
+                Content.write(File.read())
         for fmp in self.FmpPayloadList:
             if fmp.Existed:
                 FwMgrHdr.write(pack('=Q', PreSize))
@@ -247,7 +246,7 @@ class Capsule (CapsuleClassObject) :
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiCapsuleName +  "_Cap" + '.inf')
-        CapInfFile = StringIO.StringIO() #open (self.CapInfFileName , 'w+')
+        CapInfFile = StringIO.StringIO()
 
         CapInfFile.writelines("[options]" + T_CHAR_LF)
 
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index dd4c27bd15c7..9916bd4d2627 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -233,12 +233,10 @@ class CapsulePayload(CapsuleData):
         #
         # Append file content to the structure
         #
-        ImageFile = open(self.ImageFile, 'rb')
-        Buffer += ImageFile.read()
-        ImageFile.close()
+        with open(self.ImageFile, 'rb') as ImageFile:
+            Buffer += ImageFile.read()
         if self.VendorCodeFile:
-            VendorFile = open(self.VendorCodeFile, 'rb')
-            Buffer += VendorFile.read()
-            VendorFile.close()
+            with open(self.VendorCodeFile, 'rb') as VendorFile:
+                Buffer += VendorFile.read()
         self.Existed = True
         return Buffer
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index ba076c8c1ecd..af1760bf729c 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -155,16 +155,11 @@ class IncludeFileProfile :
         self.FileName = FileName
         self.FileLinesList = []
         try:
-            fsock = open(FileName, "rb", 0)
-            try:
+            with open(FileName, "rb", 0) as fsock:
                 self.FileLinesList = fsock.readlines()
-                for index, line in enumerate(self.FileLinesList):
-                    if not line.endswith('\n'):
-                        self.FileLinesList[index] += '\n'
-
-            finally:
-                fsock.close()
-
+            for index, line in enumerate(self.FileLinesList):
+                if not line.endswith('\n'):
+                    self.FileLinesList[index] += '\n'
         except:
             EdkLogger.error("FdfParser", FILE_OPEN_FAILURE, ExtraData=FileName)
 
@@ -216,16 +211,11 @@ class FileProfile :
     def __init__(self, FileName):
         self.FileLinesList = []
         try:
-            fsock = open(FileName, "rb", 0)
-            try:
+            with open(FileName, "rb", 0) as fsock:
                 self.FileLinesList = fsock.readlines()
-            finally:
-                fsock.close()
-
         except:
             EdkLogger.error("FdfParser", FILE_OPEN_FAILURE, ExtraData=FileName)
 
-
         self.PcdDict = {}
         self.InfList = []
         self.InfDict = {'ArchTBD':[]}
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 871499d3d2ad..1449d363eac3 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -104,11 +104,10 @@ class FileStatement (FileStatementClassObject) :
                     MaxAlignValue = 1
                     for Index, File in enumerate(self.FileName):
                         try:
-                            f = open(File, 'rb')
+                            with open(File, 'rb') as f:
+                                Content = f.read()
                         except:
                             GenFdsGlobalVariable.ErrorLogger("Error opening RAW file %s." % (File))
-                        Content = f.read()
-                        f.close()
                         AlignValue = 1
                         if self.SubAlignment[Index] is not None:
                             AlignValue = GenFdsGlobalVariable.GetAlignment(self.SubAlignment[Index])
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 29daba5a3a3e..d4b0611fc55a 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -163,8 +163,8 @@ class FV (FvClassObject):
                 NewFvInfo = open(FvInfoFileName, 'r').read()
             if NewFvInfo is not None and NewFvInfo != OrigFvInfo:
                 FvChildAddr = []
-                AddFileObj = open(FvInfoFileName, 'r')
-                AddrStrings = AddFileObj.readlines()
+                with open(FvInfoFileName, 'r') as AddFileObj:
+                    AddrStrings = AddFileObj.readlines()
                 AddrKeyFound = False
                 for AddrString in AddrStrings:
                     if AddrKeyFound:
@@ -172,7 +172,6 @@ class FV (FvClassObject):
                         FvChildAddr.append (AddrString)
                     elif AddrString.find ("[FV_BASE_ADDRESS]") != -1:
                         AddrKeyFound = True
-                AddFileObj.close()
 
                 if FvChildAddr != []:
                     # Update Ffs again
@@ -195,14 +194,14 @@ class FV (FvClassObject):
             # Write the Fv contents to Buffer
             #
             if os.path.isfile(FvOutputFile):
-                FvFileObj = open(FvOutputFile, 'rb')
+                with open(FvOutputFile, 'rb') as FvFileObj:
+                    Buffer.write(FvFileObj.read())
+                    FvFileObj.seek(0)
+                    # PI FvHeader is 0x48 byte
+                    FvHeaderBuffer = FvFileObj.read(0x48)
+
                 GenFdsGlobalVariable.VerboseLogger("\nGenerate %s FV Successfully" % self.UiFvName)
                 GenFdsGlobalVariable.SharpCounter = 0
-
-                Buffer.write(FvFileObj.read())
-                FvFileObj.seek(0)
-                # PI FvHeader is 0x48 byte
-                FvHeaderBuffer = FvFileObj.read(0x48)
                 # FV alignment position.
                 FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E]) & 0x1F)
                 if FvAlignmentValue >= 0x400:
@@ -217,7 +216,6 @@ class FV (FvClassObject):
                 else:
                     # FvAlignmentValue is less than 1K
                     self.FvAlignment = str (FvAlignmentValue)
-                FvFileObj.close()
                 GenFds.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
                 GenFdsGlobalVariable.LargeFileInFvFlags.pop()
             else:
@@ -378,16 +376,15 @@ class FV (FvClassObject):
                     # check if the file path exists or not
                     if not os.path.isfile(FileFullPath):
                         GenFdsGlobalVariable.ErrorLogger("Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
-                    FvExtFile = open (FileFullPath,'rb')
-                    FvExtFile.seek(0,2)
-                    Size = FvExtFile.tell()
-                    if Size >= 0x10000:
-                        GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
-                    TotalSize += (Size + 4)
-                    FvExtFile.seek(0)
-                    Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
-                    Buffer += FvExtFile.read() 
-                    FvExtFile.close()
+                    with open (FileFullPath,'rb') as FvExtFile:
+                        FvExtFile.seek(0,2)
+                        Size = FvExtFile.tell()
+                        if Size >= 0x10000:
+                            GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
+                        TotalSize += (Size + 4)
+                        FvExtFile.seek(0)
+                        Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
+                        Buffer += FvExtFile.read()
                 if self.FvExtEntryType[Index] == 'DATA':
                     ByteList = self.FvExtEntryData[Index].split(',')
                     Size = len (ByteList)
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 0fc7115b677b..5a6e652d2fc5 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -64,8 +64,8 @@ class FvImageSection(FvImageSectionClassObject):
                 FvAlignmentValue = 0
                 if os.path.isfile(FvFileName):
                     with open (FvFileName,'rb') as FvFileObj:
-                    # PI FvHeader is 0x48 byte
-                    FvHeaderBuffer = FvFileObj.read(0x48)
+                        # PI FvHeader is 0x48 byte
+                        FvHeaderBuffer = FvFileObj.read(0x48)
                     # FV alignment position.
                     FvAlignmentValue = 1 << (ord (FvHeaderBuffer[0x2E]) & 0x1F)
                 if FvAlignmentValue > MaxFvAlignment:
@@ -110,8 +110,8 @@ class FvImageSection(FvImageSectionClassObject):
                     FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
                     if os.path.isfile(FvFileName):
                         with open (FvFileName,'rb') as FvFileObj:
-                        # PI FvHeader is 0x48 byte
-                        FvHeaderBuffer = FvFileObj.read(0x48)
+                            # PI FvHeader is 0x48 byte
+                            FvHeaderBuffer = FvFileObj.read(0x48)
                         # FV alignment position.
                         FvAlignmentValue = 1 << (ord (FvHeaderBuffer[0x2E]) & 0x1F)
                         # FvAlignmentValue is larger than or equal to 1K
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index c2e82de891d3..6876068dbe3e 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -300,31 +300,27 @@ class GenFdsGlobalVariable:
         # Create FV Address inf file
         #
         GenFdsGlobalVariable.FvAddressFileName = os.path.join(GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
-        FvAddressFile = open(GenFdsGlobalVariable.FvAddressFileName, 'w')
-        #
-        # Add [Options]
-        #
-        FvAddressFile.writelines("[options]" + T_CHAR_LF)
         BsAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress:
                 BsAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress
                 break
-
-        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                       BsAddress + \
-                                       T_CHAR_LF)
-
         RtAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].RtBaseAddress:
                 RtAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].RtBaseAddress
+        with open(GenFdsGlobalVariable.FvAddressFileName, 'w') as FvAddressFile:
+            #
+            # Add [Options]
+            #
+            FvAddressFile.writelines("[options]" + T_CHAR_LF)
+            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
+                                           BsAddress + \
+                                           T_CHAR_LF)
+            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
+                                           RtAddress + \
+                                           T_CHAR_LF)
 
-        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                       RtAddress + \
-                                       T_CHAR_LF)
-
-        FvAddressFile.close()
 
     def SetEnv(FdfParser, WorkSpace, ArchList, GlobalData):
         GenFdsGlobalVariable.ModuleFile = WorkSpace.ModuleFile
@@ -361,11 +357,6 @@ class GenFdsGlobalVariable:
         # Create FV Address inf file
         #
         GenFdsGlobalVariable.FvAddressFileName = os.path.join(GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
-        FvAddressFile = open(GenFdsGlobalVariable.FvAddressFileName, 'w')
-        #
-        # Add [Options]
-        #
-        FvAddressFile.writelines("[options]" + T_CHAR_LF)
         BsAddress = '0'
         for Arch in ArchList:
             BsAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,
@@ -373,11 +364,6 @@ class GenFdsGlobalVariable:
                                                                    GlobalData.gGlobalDefines["TOOL_CHAIN_TAG"]].BsBaseAddress
             if BsAddress:
                 break
-
-        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                 BsAddress + \
-                                 T_CHAR_LF)
-
         RtAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[
@@ -386,12 +372,17 @@ class GenFdsGlobalVariable:
                 RtAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[
                     GenFdsGlobalVariable.ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'],
                     GlobalData.gGlobalDefines["TOOL_CHAIN_TAG"]].RtBaseAddress
-
-        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                 RtAddress + \
-                                 T_CHAR_LF)
-
-        FvAddressFile.close()
+        with open(GenFdsGlobalVariable.FvAddressFileName, 'w') as FvAddressFile:
+            #
+            # Add [Options]
+            #
+            FvAddressFile.writelines("[options]" + T_CHAR_LF)
+            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
+                                     BsAddress + \
+                                     T_CHAR_LF)
+            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
+                                     RtAddress + \
+                                     T_CHAR_LF)
 
     ## ReplaceWorkspaceMacro()
     #
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index bc95c7cd9d42..28571292f5a6 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -202,33 +202,28 @@ class GuidSection(GuidSectionClassObject) :
                 if not os.path.exists(TempFile) :
                     EdkLogger.error("GenFds", COMMAND_FAILURE, 'Fail to call %s, no output file was generated' % ExternalTool)
 
-                FileHandleIn = open(DummyFile, 'rb')
-                FileHandleIn.seek(0, 2)
-                InputFileSize = FileHandleIn.tell()
+                with open(DummyFile, 'rb') as FileHandleIn, open(TempFile, 'rb') as FileHandleOut:
+                    FileHandleIn.seek(0, 2)
+                    InputFileSize = FileHandleIn.tell()
+                    FileHandleOut.seek(0, 2)
+                    TempFileSize = FileHandleOut.tell()
 
-                FileHandleOut = open(TempFile, 'rb')
-                FileHandleOut.seek(0, 2)
-                TempFileSize = FileHandleOut.tell()
+                    Attribute = []
+                    HeaderLength = None
+                    if self.ExtraHeaderSize != -1:
+                        HeaderLength = str(self.ExtraHeaderSize)
 
-                Attribute = []
-                HeaderLength = None
-                if self.ExtraHeaderSize != -1:
-                    HeaderLength = str(self.ExtraHeaderSize)
-
-                if self.ProcessRequired == "NONE" and HeaderLength is None:
-                    if TempFileSize > InputFileSize:
-                        FileHandleIn.seek(0)
-                        BufferIn = FileHandleIn.read()
-                        FileHandleOut.seek(0)
-                        BufferOut = FileHandleOut.read()
-                        if BufferIn == BufferOut[TempFileSize - InputFileSize:]:
-                            HeaderLength = str(TempFileSize - InputFileSize)
-                    #auto sec guided attribute with process required
-                    if HeaderLength is None:
-                        Attribute.append('PROCESSING_REQUIRED')
-
-                FileHandleIn.close()
-                FileHandleOut.close()
+                    if self.ProcessRequired == "NONE" and HeaderLength is None:
+                        if TempFileSize > InputFileSize:
+                            FileHandleIn.seek(0)
+                            BufferIn = FileHandleIn.read()
+                            FileHandleOut.seek(0)
+                            BufferOut = FileHandleOut.read()
+                            if BufferIn == BufferOut[TempFileSize - InputFileSize:]:
+                                HeaderLength = str(TempFileSize - InputFileSize)
+                        #auto sec guided attribute with process required
+                        if HeaderLength is None:
+                            Attribute.append('PROCESSING_REQUIRED')
 
                 if FirstCall and 'PROCESSING_REQUIRED' in Attribute:
                     # Guided data by -z option on first call is the process required data. Call the guided tool with the real option.
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 9d632b6321e2..e67d056cc178 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -159,9 +159,8 @@ class Region(RegionClassObject):
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "Size of FV File (%s) is larger than Region Size 0x%X specified." \
                                             % (RegionData, Size))
-                        BinFile = open(FileName, 'rb')
-                        Buffer.write(BinFile.read())
-                        BinFile.close()
+                        with open(FileName, 'rb') as BinFile:
+                            Buffer.write(BinFile.read())
                         Size = Size - FileLength
             #
             # Pad the left buffer
@@ -213,9 +212,8 @@ class Region(RegionClassObject):
                     EdkLogger.error("GenFds", GENFDS_ERROR,
                                     "Size 0x%X of Capsule File (%s) is larger than Region Size 0x%X specified." \
                                     % (FileLength, RegionData, Size))
-                BinFile = open(FileName, 'rb')
-                Buffer.write(BinFile.read())
-                BinFile.close()
+                with open(FileName, 'rb') as BinFile:
+                    Buffer.write(BinFile.read())
                 Size = Size - FileLength
             #
             # Pad the left buffer
@@ -245,9 +243,8 @@ class Region(RegionClassObject):
                                     "Size of File (%s) is larger than Region Size 0x%X specified." \
                                     % (RegionData, Size))
                 GenFdsGlobalVariable.InfLogger('   Region File Name = %s' % RegionData)
-                BinFile = open(RegionData, 'rb')
-                Buffer.write(BinFile.read())
-                BinFile.close()
+                with open(RegionData, 'rb') as BinFile:
+                    Buffer.write(BinFile.read())
                 Size = Size - FileLength
             #
             # Pad the left buffer
diff --git a/BaseTools/Source/Python/GenFds/Vtf.py b/BaseTools/Source/Python/GenFds/Vtf.py
index 18ea37b9afdd..291070827b78 100644
--- a/BaseTools/Source/Python/GenFds/Vtf.py
+++ b/BaseTools/Source/Python/GenFds/Vtf.py
@@ -1,7 +1,7 @@
 ## @file
 # process VTF generation
 #
-#  Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -67,81 +67,79 @@ class Vtf (VtfClassObject):
     def GenBsfInf (self):
         FvList = self.GetFvList()
         self.BsfInfName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiName + '.inf')
-        BsfInf = open(self.BsfInfName, 'w+')
-        if self.ResetBin is not None:
-            BsfInf.writelines ("[OPTIONS]" + T_CHAR_LF)
-            BsfInf.writelines ("IA32_RST_BIN" + \
-                               " = " + \
-                               GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.ResetBin)) + \
-                               T_CHAR_LF)
-            BsfInf.writelines (T_CHAR_LF)
-
-        BsfInf.writelines ("[COMPONENTS]" + T_CHAR_LF)
-
-        for ComponentObj in self.ComponentStatementList :
-            BsfInf.writelines ("COMP_NAME" + \
-                               " = " + \
-                               ComponentObj.CompName + \
-                               T_CHAR_LF)
-            if ComponentObj.CompLoc.upper() == 'NONE':
-                BsfInf.writelines ("COMP_LOC" + \
+        with open(self.BsfInfName, 'w') as BsfInf:
+            if self.ResetBin is not None:
+                BsfInf.writelines ("[OPTIONS]" + T_CHAR_LF)
+                BsfInf.writelines ("IA32_RST_BIN" + \
                                    " = " + \
-                                   'N' + \
+                                   GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.ResetBin)) + \
                                    T_CHAR_LF)
+                BsfInf.writelines (T_CHAR_LF)
+
+            BsfInf.writelines ("[COMPONENTS]" + T_CHAR_LF)
 
-            elif ComponentObj.FilePos is not None:
-                BsfInf.writelines ("COMP_LOC" + \
+            for ComponentObj in self.ComponentStatementList :
+                BsfInf.writelines ("COMP_NAME" + \
                                    " = " + \
-                                   ComponentObj.FilePos + \
+                                   ComponentObj.CompName + \
                                    T_CHAR_LF)
-            else:
-                Index = FvList.index(ComponentObj.CompLoc.upper())
-                if Index == 0:
+                if ComponentObj.CompLoc.upper() == 'NONE':
                     BsfInf.writelines ("COMP_LOC" + \
                                        " = " + \
-                                       'F' + \
+                                       'N' + \
                                        T_CHAR_LF)
-                elif Index == 1:
+
+                elif ComponentObj.FilePos is not None:
                     BsfInf.writelines ("COMP_LOC" + \
                                        " = " + \
-                                       'S' + \
+                                       ComponentObj.FilePos + \
                                        T_CHAR_LF)
+                else:
+                    Index = FvList.index(ComponentObj.CompLoc.upper())
+                    if Index == 0:
+                        BsfInf.writelines ("COMP_LOC" + \
+                                           " = " + \
+                                           'F' + \
+                                           T_CHAR_LF)
+                    elif Index == 1:
+                        BsfInf.writelines ("COMP_LOC" + \
+                                           " = " + \
+                                           'S' + \
+                                           T_CHAR_LF)
 
-            BsfInf.writelines ("COMP_TYPE" + \
-                               " = " + \
-                               ComponentObj.CompType + \
-                               T_CHAR_LF)
-            BsfInf.writelines ("COMP_VER" + \
-                               " = " + \
-                               ComponentObj.CompVer + \
-                               T_CHAR_LF)
-            BsfInf.writelines ("COMP_CS" + \
-                               " = " + \
-                               ComponentObj.CompCs + \
-                               T_CHAR_LF)
-
-            BinPath = ComponentObj.CompBin
-            if BinPath != '-':
-                BinPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(BinPath))
-            BsfInf.writelines ("COMP_BIN" + \
-                               " = " + \
-                               BinPath + \
-                               T_CHAR_LF)
+                BsfInf.writelines ("COMP_TYPE" + \
+                                   " = " + \
+                                   ComponentObj.CompType + \
+                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_VER" + \
+                                   " = " + \
+                                   ComponentObj.CompVer + \
+                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_CS" + \
+                                   " = " + \
+                                   ComponentObj.CompCs + \
+                                   T_CHAR_LF)
 
-            SymPath = ComponentObj.CompSym
-            if SymPath != '-':
-                SymPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(SymPath))
-            BsfInf.writelines ("COMP_SYM" + \
-                               " = " + \
-                               SymPath + \
-                               T_CHAR_LF)
-            BsfInf.writelines ("COMP_SIZE" + \
-                               " = " + \
-                               ComponentObj.CompSize + \
-                               T_CHAR_LF)
-            BsfInf.writelines (T_CHAR_LF)
+                BinPath = ComponentObj.CompBin
+                if BinPath != '-':
+                    BinPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(BinPath))
+                BsfInf.writelines ("COMP_BIN" + \
+                                   " = " + \
+                                   BinPath + \
+                                   T_CHAR_LF)
 
-        BsfInf.close()
+                SymPath = ComponentObj.CompSym
+                if SymPath != '-':
+                    SymPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(SymPath))
+                BsfInf.writelines ("COMP_SYM" + \
+                                   " = " + \
+                                   SymPath + \
+                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_SIZE" + \
+                                   " = " + \
+                                   ComponentObj.CompSize + \
+                                   T_CHAR_LF)
+                BsfInf.writelines (T_CHAR_LF)
 
     ## GenFvList() method
     #
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index f40c8bd01b23..aa61bc00f277 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -46,13 +46,13 @@ def parsePcdInfoFromMapFile(mapfilepath, efifilepath):
     """
     lines = []
     try:
-        f = open(mapfilepath, 'r')
-        lines = f.readlines()
-        f.close()
+        with open(mapfilepath, 'r') as f:
+            lines = f.readlines()
     except:
         return None
     
-    if len(lines) == 0: return None
+    if len(lines) == 0: 
+        return None
     firstline = lines[0].strip()
     if (firstline.startswith("Archive member included ") and
         firstline.endswith(" file (symbol)")):
@@ -190,18 +190,13 @@ def _parseGeneral(lines, efifilepath):
     
 def generatePcdTable(list, pcdpath):
     try:
-        f = open(pcdpath, 'w')
+        with open(pcdpath, 'w') as f:
+            f.write('PCD Name                       Offset    Section Name\r\n')
+            for pcditem in list:
+                f.write('%-30s 0x%-08X %-6s\r\n' % (pcditem[0], pcditem[1], pcditem[2]))
     except:
         pass
 
-    f.write('PCD Name                       Offset    Section Name\r\n')
-    
-    for pcditem in list:
-        f.write('%-30s 0x%-08X %-6s\r\n' % (pcditem[0], pcditem[1], pcditem[2]))
-    f.close()
-
-    #print 'Success to generate Binary Patch PCD table at %s!' % pcdpath 
-
 if __name__ == '__main__':
     UsageString = "%prog -m <MapFile> -e <EfiFile> -o <OutFile>"
     AdditionalNotes = "\nPCD table is generated in file name with .BinaryPcdTable.txt postfix"
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index cf2fc7c4f70a..76fef41176ac 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -49,10 +49,9 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Length of Binary File
     #
-    FileHandle = open(FileName, 'rb')
-    FileHandle.seek (0, 2)
-    FileLength = FileHandle.tell()
-    FileHandle.close()
+    with open(FileName, 'rb') as FileHandle:
+        FileHandle.seek (0, 2)
+        FileLength = FileHandle.tell()
     #
     # Unify string to upper string
     #
@@ -85,10 +84,9 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Read binary file into array
     #
-    FileHandle = open(FileName, 'rb')
-    ByteArray = array.array('B')
-    ByteArray.fromfile(FileHandle, FileLength)
-    FileHandle.close()
+    with open(FileName, 'rb') as FileHandle:
+        ByteArray = array.array('B')
+        ByteArray.fromfile(FileHandle, FileLength)
     OrigByteList = ByteArray.tolist()
     ByteList = ByteArray.tolist()
     #
@@ -193,9 +191,8 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     if ByteList != OrigByteList:
         ByteArray = array.array('B')
         ByteArray.fromlist(ByteList)
-        FileHandle = open(FileName, 'wb')
-        ByteArray.tofile(FileHandle)
-        FileHandle.close()
+        with open(FileName, 'wb') as FileHandle:
+            ByteArray.tofile(FileHandle)
     return 0, "Patch Value into File %s successfully." % (FileName)
 
 ## Parse command line options
diff --git a/BaseTools/Source/Python/Table/TableReport.py b/BaseTools/Source/Python/Table/TableReport.py
index 9ce1d0aa2518..194c733033c7 100644
--- a/BaseTools/Source/Python/Table/TableReport.py
+++ b/BaseTools/Source/Python/Table/TableReport.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create/update/query/erase table for ECC reports
 #
-# Copyright (c) 2008 - 2015, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -100,31 +100,30 @@ class TableReport(Table):
     #
     def ToCSV(self, Filename='Report.csv'):
         try:
-            File = open(Filename, 'w+')
-            File.write("""No, Error Code, Error Message, File, LineNo, Other Error Message\n""")
-            RecordSet = self.Query()
-            Index = 0
-            for Record in RecordSet:
-                Index = Index + 1
-                ErrorID = Record[1]
-                OtherMsg = Record[2]
-                BelongsToTable = Record[3]
-                BelongsToItem = Record[4]
-                IsCorrected = Record[5]
-                SqlCommand = ''
-                if BelongsToTable == 'File':
-                    SqlCommand = """select 1, FullPath from %s where ID = %s
-                             """ % (BelongsToTable, BelongsToItem)
-                else:
-                    SqlCommand = """select A.StartLine, B.FullPath from %s as A, File as B
-                                    where A.ID = %s and B.ID = A.BelongsToFile
+            with open(Filename, 'w') as File:
+                File.write("""No, Error Code, Error Message, File, LineNo, Other Error Message\n""")
+                RecordSet = self.Query()
+                Index = 0
+                for Record in RecordSet:
+                    Index = Index + 1
+                    ErrorID = Record[1]
+                    OtherMsg = Record[2]
+                    BelongsToTable = Record[3]
+                    BelongsToItem = Record[4]
+                    IsCorrected = Record[5]
+                    SqlCommand = ''
+                    if BelongsToTable == 'File':
+                        SqlCommand = """select 1, FullPath from %s where ID = %s
                                  """ % (BelongsToTable, BelongsToItem)
-                NewRecord = self.Exec(SqlCommand)
-                if NewRecord != []:
-                    File.write("""%s,%s,"%s",%s,%s,"%s"\n""" % (Index, ErrorID, EccToolError.gEccErrorMessage[ErrorID], NewRecord[0][1], NewRecord[0][0], OtherMsg))
-                    EdkLogger.quiet("%s(%s): [%s]%s %s" % (NewRecord[0][1], NewRecord[0][0], ErrorID, EccToolError.gEccErrorMessage[ErrorID], OtherMsg))
+                    else:
+                        SqlCommand = """select A.StartLine, B.FullPath from %s as A, File as B
+                                        where A.ID = %s and B.ID = A.BelongsToFile
+                                     """ % (BelongsToTable, BelongsToItem)
+                    NewRecord = self.Exec(SqlCommand)
+                    if NewRecord != []:
+                        File.write("""%s,%s,"%s",%s,%s,"%s"\n""" % (Index, ErrorID, EccToolError.gEccErrorMessage[ErrorID], NewRecord[0][1], NewRecord[0][0], OtherMsg))
+                        EdkLogger.quiet("%s(%s): [%s]%s %s" % (NewRecord[0][1], NewRecord[0][0], ErrorID, EccToolError.gEccErrorMessage[ErrorID], OtherMsg))
 
-            File.close()
         except IOError:
             NewFilename = 'Report_' + time.strftime("%Y%m%d_%H%M%S.csv", time.localtime())
             EdkLogger.warn("ECC", "The report file %s is locked by other progress, use %s instead!" % (Filename, NewFilename))
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ecac316b7a3a..5c463df6bce5 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -58,22 +58,21 @@ class TargetTool():
     def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
         """Convert a text file to a dictionary of (name:value) pairs."""
         try:
-            f = open(FileName,'r')
-            for Line in f:
-                if Line.startswith(CommentCharacter) or Line.strip() == '':
-                    continue
-                LineList = Line.split(KeySplitCharacter,1)
-                if len(LineList) >= 2:
-                    Key = LineList[0].strip()
-                    if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
-                        if Key == TAB_TAT_DEFINES_ACTIVE_PLATFORM or Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF \
-                          or Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER \
-                          or Key == TAB_TAT_DEFINES_ACTIVE_MODULE:
-                            self.TargetTxtDictionary[Key] = LineList[1].replace('\\', '/').strip()
-                        elif Key == TAB_TAT_DEFINES_TARGET or Key == TAB_TAT_DEFINES_TARGET_ARCH \
-                          or Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG or Key == TAB_TAT_DEFINES_BUILD_RULE_CONF:
-                            self.TargetTxtDictionary[Key] = LineList[1].split()
-            f.close()
+            with open(FileName,'r') as f:
+                for Line in f:
+                    if Line.startswith(CommentCharacter) or Line.strip() == '':
+                        continue
+                    LineList = Line.split(KeySplitCharacter,1)
+                    if len(LineList) >= 2:
+                        Key = LineList[0].strip()
+                        if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
+                            if Key == TAB_TAT_DEFINES_ACTIVE_PLATFORM or Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF \
+                              or Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER \
+                              or Key == TAB_TAT_DEFINES_ACTIVE_MODULE:
+                                self.TargetTxtDictionary[Key] = LineList[1].replace('\\', '/').strip()
+                            elif Key == TAB_TAT_DEFINES_TARGET or Key == TAB_TAT_DEFINES_TARGET_ARCH \
+                              or Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG or Key == TAB_TAT_DEFINES_BUILD_RULE_CONF:
+                                self.TargetTxtDictionary[Key] = LineList[1].split()
             return 0
         except:
             last_type, last_value, last_tb = sys.exc_info()
@@ -94,42 +93,38 @@ class TargetTool():
             
     def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
         try:
-            fr = open(self.FileName, 'r')
-            fw = open(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), 'w')
-
-            existKeys = []
-            for Line in fr:
-                if Line.startswith(CommentCharacter) or Line.strip() == '':
-                    fw.write(Line)
-                else:
-                    LineList = Line.split(KeySplitCharacter,1)
-                    if len(LineList) >= 2:
-                        Key = LineList[0].strip()
-                        if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
-                            if Key not in existKeys:
-                                existKeys.append(Key)
-                            else:
-                                print "Warning: Found duplicate key item in original configuration files!"
-                                
-                            if Num == 0:
-                                Line = "%-30s = \n" % Key
-                            else:
-                                ret = GetConfigureKeyValue(self, Key)
-                                if ret is not None:
-                                    Line = ret
-                            fw.write(Line)
-            for key in self.TargetTxtDictionary:
-                if key not in existKeys:
-                    print "Warning: %s does not exist in original configuration file" % key
-                    Line = GetConfigureKeyValue(self, key)
-                    if Line is None:
-                        Line = "%-30s = " % key
-                    fw.write(Line)
+            with open(self.FileName, 'r') as fr:
+                FileRead = fr.readlines()
+            with open(self.FileName, 'w') as fw:
+                existKeys = []
+                for Line in FileRead:
+                    if Line.startswith(CommentCharacter) or Line.strip() == '':
+                        fw.write(Line)
+                    else:
+                        LineList = Line.split(KeySplitCharacter,1)
+                        if len(LineList) >= 2:
+                            Key = LineList[0].strip()
+                            if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
+                                if Key not in existKeys:
+                                    existKeys.append(Key)
+                                else:
+                                    print "Warning: Found duplicate key item in original configuration files!"
+                                    
+                                if Num == 0:
+                                    Line = "%-30s = \n" % Key
+                                else:
+                                    ret = GetConfigureKeyValue(self, Key)
+                                    if ret is not None:
+                                        Line = ret
+                                fw.write(Line)
+                for key in self.TargetTxtDictionary:
+                    if key not in existKeys:
+                        print "Warning: %s does not exist in original configuration file" % key
+                        Line = GetConfigureKeyValue(self, key)
+                        if Line is None:
+                            Line = "%-30s = " % key
+                        fw.write(Line)
                 
-            fr.close()
-            fw.close()
-            os.remove(self.FileName)
-            os.rename(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), self.FileName)
             
         except:
             last_type, last_value, last_tb = sys.exc_info()
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index d2e6d317676c..a92df52979c6 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -141,14 +141,12 @@ gIncludedAslFile = []
 def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
     CreateDirectory(os.path.dirname(Target))
     try:
-        f = open (Source, 'r')
+        with open (Source, 'r') as f:
+            # read whole file
+            Lines = f.readlines()
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
 
-    # read whole file
-    Lines = f.readlines()
-    f.close()
-
     PreprocessedFile = ""
     InjectedFile = ""
     LineIndexOfOriginalFile = None
@@ -243,11 +241,10 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
 
     # save to file
     try:
-        f = open (Target, 'wb')
+        with open (Target, 'wb') as f:
+            f.writelines(NewLines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
-    f.writelines(NewLines)
-    f.close()
 
 ## Trim preprocessed VFR file
 #
@@ -261,12 +258,11 @@ def TrimPreprocessedVfr(Source, Target):
     CreateDirectory(os.path.dirname(Target))
     
     try:
-        f = open (Source,'r')
+        with open (Source,'r') as f:
+            # read whole file
+            Lines = f.readlines()
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
-    # read whole file
-    Lines = f.readlines()
-    f.close()
 
     FoundTypedef = False
     Brace = 0
@@ -310,11 +306,10 @@ def TrimPreprocessedVfr(Source, Target):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        with open (Target,'w') as f:
+            f.writelines(Lines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
-    f.writelines(Lines)
-    f.close()
 
 ## Read the content  ASL file, including ASL included, recursively
 #
@@ -340,7 +335,8 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         for IncludePath in SearchPathList:
             IncludeFile = os.path.join(IncludePath, Source)
             if os.path.isfile(IncludeFile):
-                F = open(IncludeFile, "r")
+                with open(IncludeFile, "r") as OpenFile:
+                    FileLines = OpenFile.readlines()
                 break
         else:
             EdkLogger.error("Trim", "Failed to find include file %s" % Source)
@@ -356,7 +352,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         return []
     gIncludedAslFile.append(IncludeFile)
     
-    for Line in F:
+    for Line in FileLines:
         LocalSearchPath = None
         Result = gAslIncludePattern.findall(Line)
         if len(Result) == 0:
@@ -375,7 +371,6 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         NewFileContent.append("\n")
 
     gIncludedAslFile.pop()
-    F.close()
 
     return NewFileContent
 
@@ -425,12 +420,11 @@ def TrimAslFile(Source, Target, IncludePathFile):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        with open (Target,'w') as f:
+            f.writelines(Lines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
-    f.writelines(Lines)
-    f.close()
 
 def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     VfrNameList = []
@@ -450,11 +444,6 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     if not VfrUniOffsetList:
         return
 
-    try:
-        fInputfile = open(OutputFile, "wb+", 0)
-    except:
-        EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
-
     # Use a instance of StringIO to cache data
     fStringIO = StringIO.StringIO('')
 
@@ -483,16 +472,16 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
             VfrValue = pack ('Q', int (Item[1], 16))
             fStringIO.write (VfrValue)
 
-    #
-    # write data into file.
-    #
-    try :
-        fInputfile.write (fStringIO.getvalue())
+    try:
+        with open(OutputFile, "wb+", 0) as fInputfile:
+            try :
+                fInputfile.write (fStringIO.getvalue())
+            except:
+                EdkLogger.error("Trim", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %OutputFile, None)
     except:
-        EdkLogger.error("Trim", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %OutputFile, None)
+        EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
 
     fStringIO.close ()
-    fInputfile.close ()
 
 ## Trim EDK source code file(s)
 #
@@ -560,12 +549,11 @@ def TrimEdkSourceCode(Source, Target):
     CreateDirectory(os.path.dirname(Target))
 
     try:
-        f = open (Source,'rb')
+        with open (Source,'rb') as f:
+            # read whole file
+            Lines = f.read()
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
-    # read whole file
-    Lines = f.read()
-    f.close()
 
     NewLines = None
     for Re,Repl in gImportCodePatterns:
@@ -579,11 +567,10 @@ def TrimEdkSourceCode(Source, Target):
         return
 
     try:
-        f = open (Target,'wb')
+        with open (Target,'wb') as f:
+            f.write(NewLines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
-    f.write(NewLines)
-    f.close()
 
 
 ## Parse command line options
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index edae4e5e7fcf..a472e13da7e0 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -118,13 +118,10 @@ def GetDependencyList(FileStack,SearchPathList):
             CurrentFileDependencyList = DepDb[F]
         else:
             try:
-                Fd = open(F, 'r')
-                FileContent = Fd.read()
+                with open(F, 'r') as Fd:
+                    FileContent = Fd.read()
             except BaseException, X:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F + "\n\t" + str(X))
-            finally:
-                if "Fd" in dir(locals()):
-                    Fd.close()
 
             if len(FileContent) == 0:
                 continue
@@ -2109,9 +2106,8 @@ class DscBuildData(PlatformBuildClassObject):
         MessageGroup = []
         if returncode <>0:
             CAppBaseFileName = os.path.join(self.OutputPath, PcdValueInitName)
-            File = open (CAppBaseFileName + '.c', 'r')
-            FileData = File.readlines()
-            File.close()
+            with open (CAppBaseFileName + '.c', 'r') as File:
+                FileData = File.readlines()
             for Message in Messages:
                 if " error" in Message or "warning" in Message:
                     FileInfo = Message.strip().split('(')
@@ -2155,9 +2151,8 @@ class DscBuildData(PlatformBuildClassObject):
             if returncode <> 0:
                 EdkLogger.warn('Build', COMMAND_FAILURE, 'Can not collect output from command: %s' % Command)
 
-        File = open (OutputValueFile, 'r')
-        FileBuffer = File.readlines()
-        File.close()
+        with open (OutputValueFile, 'r') as File:
+            FileBuffer = File.readlines()
 
         StructurePcdSet = []
         for Pcd in FileBuffer:
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 4e1d0cc18c16..534d73e15c11 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -291,18 +291,18 @@ class DepexParser(object):
     # @param DepexFileName   The file name of binary dependency expression file.
     #
     def ParseDepexFile(self, DepexFileName):
-        DepexFile = open(DepexFileName, "rb")
         DepexStatement = []
-        OpCode = DepexFile.read(1)
-        while OpCode:
-            Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
-            if Statement in ["BEFORE", "AFTER", "PUSH"]:
-                GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
-                            struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
-                GuidString = self._GuidDb.get(GuidValue, GuidValue)
-                Statement = "%s %s" % (Statement, GuidString)
-            DepexStatement.append(Statement)
+        with open(DepexFileName, "rb") as DepexFile:
             OpCode = DepexFile.read(1)
+            while OpCode:
+                Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
+                if Statement in ["BEFORE", "AFTER", "PUSH"]:
+                    GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
+                                struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
+                    GuidString = self._GuidDb.get(GuidValue, GuidValue)
+                    Statement = "%s %s" % (Statement, GuidString)
+                DepexStatement.append(Statement)
+                OpCode = DepexFile.read(1)
 
         return DepexStatement
     
@@ -629,14 +629,15 @@
         FwReportFileName = os.path.join(self._BuildDir, "DEBUG", self.ModuleName + ".txt")
         if os.path.isfile(FwReportFileName):
             try:
-                FileContents = open(FwReportFileName).read()
-                Match = gModuleSizePattern.search(FileContents)
-                if Match:
-                    self.Size = int(Match.group(1))
+                with open(FwReportFileName) as FwReportFile:
+                    FileContents = FwReportFile.read()
+                    Match = gModuleSizePattern.search(FileContents)
+                    if Match:
+                        self.Size = int(Match.group(1))
 
-                Match = gTimeStampPattern.search(FileContents)
-                if Match:
-                    self.BuildTimeStamp = datetime.fromtimestamp(int(Match.group(1)))
+                    Match = gTimeStampPattern.search(FileContents)
+                    if Match:
+                        self.BuildTimeStamp = datetime.fromtimestamp(int(Match.group(1)))
             except IOError:
                 EdkLogger.warn(None, "Fail to read report file", FwReportFileName)
 
@@ -1527,14 +1527,12 @@ class PredictionReport(object):
         GuidList = os.path.join(self._EotDir, "GuidList.txt")
         DispatchList = os.path.join(self._EotDir, "Dispatch.txt")
 
-        TempFile = open(SourceList, "w+")
-        for Item in self._SourceList:
-            FileWrite(TempFile, Item)
-        TempFile.close()
-        TempFile = open(GuidList, "w+")
-        for Key in self._GuidMap:
-            FileWrite(TempFile, "%s %s" % (Key, self._GuidMap[Key]))
-        TempFile.close()
+        with open(SourceList, "w") as TempFile:
+            for Item in self._SourceList:
+                FileWrite(TempFile, Item)
+        with open(GuidList, "w") as TempFile:
+            for Key in self._GuidMap:
+                FileWrite(TempFile, "%s %s" % (Key, self._GuidMap[Key]))
 
         try:
             from Eot.Eot import Eot
@@ -1925,23 +1923,22 @@ class FdReport(object):
                 break
 
         if os.path.isfile(self.VpdFilePath):
-            fd = open(self.VpdFilePath, "r")
-            Lines = fd.readlines()
-            for Line in Lines:
-                Line = Line.strip()
-                if len(Line) == 0 or Line.startswith("#"):
-                    continue
-                try:
-                    PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
-                    PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
-                    if Offset.lower().startswith('0x'):
-                        Offset = '0x%08X' % (int(Offset, 16) + self.VPDBaseAddress)
-                    else:
-                        Offset = '0x%08X' % (int(Offset, 10) + self.VPDBaseAddress)
-                    self.VPDInfoList.append("%s | %s | %s | %s | %s" % (PcdName, SkuId, Offset, Size, Value))
-                except:
-                    EdkLogger.error("BuildReport", CODE_ERROR, "Fail to parse VPD information file %s" % self.VpdFilePath)
-            fd.close()
+            with open(self.VpdFilePath, "r") as fd:
+                Lines = fd.readlines()
+                for Line in Lines:
+                    Line = Line.strip()
+                    if len(Line) == 0 or Line.startswith("#"):
+                        continue
+                    try:
+                        PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
+                        PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
+                        if Offset.lower().startswith('0x'):
+                            Offset = '0x%08X' % (int(Offset, 16) + self.VPDBaseAddress)
+                        else:
+                            Offset = '0x%08X' % (int(Offset, 10) + self.VPDBaseAddress)
+                        self.VPDInfoList.append("%s | %s | %s | %s | %s" % (PcdName, SkuId, Offset, Size, Value))
+                    except:
+                        EdkLogger.error("BuildReport", CODE_ERROR, "Fail to parse VPD information file %s" % self.VpdFilePath)
 
     ##
     # Generate report for the firmware device.
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 57a5e3525d88..0e1278f5446d 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -320,9 +320,8 @@ def LaunchCommand(Command, WorkingDir):
         # print out the Response file and its content when make failure
         RespFile = os.path.join(WorkingDir, 'OUTPUT', 'respfilelist.txt')
         if os.path.isfile(RespFile):
-            f = open(RespFile)
-            RespContent = f.read()
-            f.close()
+            with open(RespFile) as f:
+                RespContent = f.read()
             EdkLogger.info(RespContent)
 
         EdkLogger.error("build", COMMAND_FAILURE, ExtraData="%s [%s]" % (Command, WorkingDir))
@@ -1169,9 +1168,8 @@ class Build():
                 EdkLogger.error("Prebuild", PREBUILD_ERROR, 'Prebuild process is not success!')
 
             if os.path.exists(PrebuildEnvFile):
-                f = open(PrebuildEnvFile)
-                envs = f.readlines()
-                f.close()
+                with open(PrebuildEnvFile) as f:
+                    envs = f.readlines()
                 envs = itertools.imap(lambda l: l.split('=',1), envs)
                 envs = itertools.ifilter(lambda l: len(l) == 2, envs)
                 envs = itertools.imap(lambda l: [i.strip() for i in l], envs)
@@ -1451,7 +1449,7 @@ class Build():
             FunctionList = []
             if os.path.exists(ImageMapTable):
                 OrigImageBaseAddress = 0
-                ImageMap = open(ImageMapTable, 'r')
+                with open(ImageMapTable, 'r') as ImageMap:
                 for LinStr in ImageMap:
                     if len (LinStr.strip()) == 0:
                         continue
@@ -1473,7 +1471,6 @@ class Build():
                                 # Get the real entry point address for IPF image.
                                 #
                                 ModuleInfo.Image.EntryPoint = RelativeAddress
-                ImageMap.close()
             #
             # Add general information.
             #
@@ -1528,7 +1525,7 @@ class Build():
                 FvMapBuffer = os.path.join(Wa.FvDir, FvName + '.Fv.map')
                 if not os.path.exists(FvMapBuffer):
                     continue
-                FvMap = open(FvMapBuffer, 'r')
+                with open(FvMapBuffer, 'r') as FvMap:
                 #skip FV size information
                 FvMap.readline()
                 FvMap.readline()
@@ -1553,8 +1550,6 @@ class Build():
                         if GuidString.upper() in ModuleList:
                             MapBuffer.write('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper()].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
 
-                FvMap.close()
-
     ## Collect MAP information of all modules
     #
     def _CollectModuleMapBuffer (self, MapBuffer, ModuleList):
@@ -2193,10 +2188,9 @@ class Build():
 
                     # Write out GuidedSecTools.txt
                     toolsFile = os.path.join(FvDir, 'GuidedSectionTools.txt')
-                    toolsFile = open(toolsFile, 'wt')
-                    for guidedSectionTool in guidAttribs:
-                        print >> toolsFile, ' '.join(guidedSectionTool)
-                    toolsFile.close()
+                    with open(toolsFile, 'w') as toolsFile:
+                        for guidedSectionTool in guidAttribs:
+                            print >> toolsFile, ' '.join(guidedSectionTool)
 
     ## Returns the full path of the tool.
     #
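
Note for readers skimming the archive: the change is mechanical across all the
touched files. A minimal sketch of the two points from the commit message --
minimal access mode and the with statement -- follows; it is illustrative only,
the file name and data below are made up, not code from the tree:

    # before: the handle leaks if write() raises, and 'w+' requests more
    # access than this write-only use needs
    f = open('report.txt', 'w+')
    f.write('data')
    f.close()

    # after: 'with' closes the file even when an exception is raised, and
    # the mode is the minimal one actually used
    with open('report.txt', 'w') as f:
        f.write('data')
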
-- 
2.16.2.windows.1




Thread overview: 13+ messages
2018-06-20 21:08 [PATCH v2 00/11] BaseTools Refactoring Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 02/11] BaseTools: Workspace - create a base class Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 03/11] BaseTools: remove unused code Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 04/11] BaseTools: remove repeated calls to startswith/endswith Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 05/11] BaseTools: use set presence instead of series of equality Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 06/11] BaseTools: refactor section generation Jaben Carsey
2018-06-20 21:08 ` Jaben Carsey [this message]
2018-06-20 21:08 ` [PATCH v1 08/11] BaseTools: refactor to change object types Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 09/11] BaseTools: refactor to stop re-allocating strings Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 10/11] BaseTools: change to set for membership testing Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 11/11] BaseTools: remove extra assignment Jaben Carsey
  -- strict thread matches above, loose matches on Subject: below --
2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 07/11] BaseTools: refactor file opening/writing Jaben Carsey
