Old 02-19-2010, 08:04 PM
Chris Lumens
 
Reset the resolver cache after bringing up the network (#562209).

Throughout stage2, if we bring up the network after doing various actions,
we need to make sure the DNS resolver is reset to pick up the new information.
However, pycurl/libcurl uses the c-ares resolver, which does not have a
method similar to res_init. Instead we need to tear down the pycurl.Curl
object instance cached in urlgrabber and create a new one. This has the
same effect as re-reading /etc/resolv.conf, just in more steps.
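For anyone unfamiliar with urlgrabber's internals, the reset amounts to closing and recreating a single module-level handle. Below is a minimal sketch of that cache-and-reset pattern; the _Curl class is a hypothetical stand-in for pycurl.Curl (whose c-ares backend reads DNS configuration once, when the handle is created), not the actual urlgrabber code:

```python
# Sketch of the cached-handle reset pattern described above.
# _Curl is a stand-in for pycurl.Curl: it models the key behavior that
# resolver configuration is snapshotted once, at construction time.

_RESOLVER_STATE = ["10.0.0.1"]  # pretend contents of /etc/resolv.conf

class _Curl:
    def __init__(self):
        # c-ares reads DNS configuration only when the handle is created
        self.nameservers = list(_RESOLVER_STATE)
        self.closed = False

    def close(self):
        self.closed = True

_curl_cache = _Curl()

def reset_curl_obj():
    """Tear down the cached handle and create a fresh one so the new
    resolver configuration is picked up -- the effect res_init would
    have, achieved by handle recreation instead."""
    global _curl_cache
    _curl_cache.close()
    _curl_cache = _Curl()

# Bringing up the network rewrites resolv.conf; only a new handle sees it.
_RESOLVER_STATE.append("10.0.0.2")
stale = _curl_cache.nameservers   # still the pre-network snapshot
reset_curl_obj()
fresh = _curl_cache.nameservers   # includes the new nameserver
```

This is why the patch sprinkles reset_curl_obj() after every enableNetwork() call: any Curl handle created before the network came up is permanently stuck with the old DNS view.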
---
anaconda.spec.in | 4 ++--
iw/task_gui.py | 7 +++++++
yuminstall.py | 14 +++++++++++++-
3 files changed, 22 insertions(+), 3 deletions(-)

diff --git a/anaconda.spec.in b/anaconda.spec.in
index 5502725..7e36738 100644
--- a/anaconda.spec.in
+++ b/anaconda.spec.in
@@ -67,7 +67,7 @@ BuildRequires: newt-devel
BuildRequires: pango-devel
BuildRequires: pykickstart >= %{pykickstartver}
BuildRequires: python-devel
-BuildRequires: python-urlgrabber
+BuildRequires: python-urlgrabber >= 3.9.1-5
BuildRequires: rpm-devel
BuildRequires: rpm-python >= %{rpmpythonver}
BuildRequires: slang-devel >= %{slangver}
@@ -94,7 +94,7 @@ Requires: parted >= %{partedver}
Requires: pyparted >= %{pypartedver}
Requires: yum >= %{yumver}
Requires: libxml2-python
-Requires: python-urlgrabber
+Requires: python-urlgrabber >= 3.9.1-5
Requires: system-logos
Requires: pykickstart >= %{pykickstartver}
Requires: system-config-date >= %{syscfgdatever}
diff --git a/iw/task_gui.py b/iw/task_gui.py
index b5aa73b..0a18da8 100644
--- a/iw/task_gui.py
+++ b/iw/task_gui.py
@@ -35,6 +35,7 @@ import network
import iutil

from yuminstall import AnacondaYumRepo
+import urlgrabber.grabber
import yum.Errors

import logging
@@ -520,6 +521,8 @@ class TaskWindow(InstallWindow):
if not self.anaconda.intf.enableNetwork():
return gtk.RESPONSE_CANCEL

+ urlgrabber.grabber.reset_curl_obj()
+
dialog = RepoEditor(self.anaconda, repo)
dialog.createDialog()
dialog.run()
@@ -538,6 +541,8 @@ class TaskWindow(InstallWindow):
if not self.anaconda.intf.enableNetwork():
return gtk.RESPONSE_CANCEL

+ urlgrabber.grabber.reset_curl_obj()
+
s = self.xml.get_widget("repoList").get_model()
s.append([dialog.repo.isEnabled(), dialog.repo.name, dialog.repo])

@@ -573,6 +578,8 @@ class TaskWindow(InstallWindow):
if not self.anaconda.intf.enableNetwork():
return

+ urlgrabber.grabber.reset_curl_obj()
+
repo.enable()
if not setupRepo(self.anaconda, repo):
return
diff --git a/yuminstall.py b/yuminstall.py
index 32d22c2..9d70600 100644
--- a/yuminstall.py
+++ b/yuminstall.py
@@ -454,6 +454,8 @@ class AnacondaYum(YumSorter):
self._baseRepoURL = None
return

+ urlgrabber.grabber.reset_curl_obj()
+
self._switchImage(1)
self.mediagrabber = self.mediaHandler
elif m.startswith("http") or m.startswith("ftp:"):
@@ -463,6 +465,8 @@ class AnacondaYum(YumSorter):
if not self.anaconda.intf.enableNetwork():
self._baseRepoURL = None

+ urlgrabber.grabber.reset_curl_obj()
+
(opts, server, path) = iutil.parseNfsUrl(m)
isys.mount(server+":"+path, self.tree, "nfs", options=opts)

@@ -690,6 +694,8 @@ class AnacondaYum(YumSorter):
custom_buttons=[_("_Exit installer")])
sys.exit(1)

+ urlgrabber.grabber.reset_curl_obj()
+
dest = tempfile.mkdtemp("", ksrepo.name.replace(" ", ""), "/mnt")

# handle "nfs://" prefix
@@ -785,6 +791,8 @@ class AnacondaYum(YumSorter):
if not self.anaconda.intf.enableNetwork():
return

+ urlgrabber.grabber.reset_curl_obj()
+
rc = self.anaconda.intf.messageWindow(_("Error"),
_("The file %s cannot be opened. This is due to a missing "
"file, a corrupt package or corrupt media. Please "
@@ -1153,6 +1161,8 @@ reposdir=/etc/anaconda.repos.d,/tmp/updates/anaconda.repos.d,/tmp/product/anacon
custom_buttons=[_("_Exit installer")])
sys.exit(1)

+ urlgrabber.grabber.reset_curl_obj()
+
self.doRepoSetup(anaconda)
self.doSackSetup(anaconda)
self.doGroupSetup(anaconda)
@@ -1231,7 +1241,9 @@ reposdir=/etc/anaconda.repos.d,/tmp/updates/anaconda.repos.d,/tmp/product/anacon
if repo.needsNetwork() and not network.hasActiveNetDev():
if anaconda.intf.enableNetwork():
repo.mirrorlistparsed = False
- continue
+ continue
+
+ urlgrabber.grabber.reset_curl_obj()

buttons = [_("_Exit installer"), _("Edit"), _("_Retry")]
else:
--
1.6.5.1

_______________________________________________
Anaconda-devel-list mailing list
Anaconda-devel-list@redhat.com
https://www.redhat.com/mailman/listinfo/anaconda-devel-list
 
Old 02-22-2010, 07:38 AM
Ales Kozumplik
 
Reset the resolver cache after bringing up the network (#562209).

On 02/19/2010 10:04 PM, Chris Lumens wrote:

Throughout stage2, if we bring up the network after doing various actions,
we need to make sure the DNS resolver is reset to pick up the new information.
However, pycurl/libcurl uses the c-ares resolver, which does not have a
method similar to res_init. Instead we need to tear down the pycurl.Curl
object instance cached in urlgrabber and create a new one. This has the
same effect as re-reading /etc/resolv.conf, just in more steps.


Ack, just two little comments.



diff --git a/anaconda.spec.in b/anaconda.spec.in
index 5502725..7e36738 100644
--- a/anaconda.spec.in
+++ b/anaconda.spec.in
@@ -67,7 +67,7 @@ BuildRequires: newt-devel
BuildRequires: pango-devel
BuildRequires: pykickstart >= %{pykickstartver}
BuildRequires: python-devel
-BuildRequires: python-urlgrabber
+BuildRequires: python-urlgrabber >= 3.9.1-5


Why is it here at all? Do we need python-urlgrabber to build the
anaconda rpm?



- continue
+ continue

Trailing whitespace?

Ales
