This is the mail archive of the crossgcc@sourceware.org mailing list for the crossgcc project.

See the CrossGCC FAQ for lots more information.


Re: [PATCH] option to prefer HTTP for downloads


Matthias,

On Tuesday 06 May 2008 15:10:06 Yann E. MORIN wrote:
> Maybe we should shorten this timeout to something more humanly acceptable,
> still accounting for those slow servers and networks.

I've made the connect timeout a configurable item.
Care to test the attached patch, please?

Regards,
Yann E. MORIN.

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +0/33 662376056 | Software  Designer | \ / CAMPAIGN     |   ^                |
| --==< °_° >==-- °------------.-------:  X  AGAINST      |  /e\  There is no  |
| http://ymorin.is-a-geek.org/ | * _ * | / \ HTML MAIL    |  """  conspiracy.  |
°------------------------------°-------°------------------°--------------------°
Index: ct-ng.trunk/scripts/functions
===================================================================
--- ct-ng.trunk/scripts/functions	(revision 623)
+++ ct-ng.trunk/scripts/functions	(working copy)
@@ -283,22 +283,22 @@
     # With automated download as we are doing, it can be very dangerous to use
     # -c to continue the downloads. It's far better to simply overwrite the
     # destination file
-    # Some company networks have proxies to connect to to the internet, but
-    # it's not easy to detect them, and wget may never timeout while connecting,
-    # so force a global 120s timeout.
-    wget -T 120 -nc --progress=dot:binary --tries=3 --passive-ftp "$1"  \
-    || wget -T 120 -nc --progress=dot:binary --tries=3 "$1"             \
+    # Some company networks have firewalls between them and the internet, but
+    # they are not easy to detect, and wget does not time out by default while
+    # connecting, so force a global ${CT_CONNECT_TIMEOUT}-second timeout.
+    wget -T ${CT_CONNECT_TIMEOUT} -nc --progress=dot:binary --tries=3 --passive-ftp "$1"    \
+    || wget -T ${CT_CONNECT_TIMEOUT} -nc --progress=dot:binary --tries=3 "$1"               \
     || true
 }
 
 # Download an URL using curl
 # Usage: CT_DoGetFileCurl <URL>
 CT_DoGetFileCurl() {
-    # Note: comments about wget method are also valid here
+    # Note: comments about the wget method (above) also apply here
     # Plus: no good progress indicator is available with curl,
     #       so output is consigned to oblivion
-    curl --ftp-pasv -O --retry 3 "$1" --connect-timeout 120 >/dev/null  \
-    || curl -O --retry 3 "$1" --connect-timeout 120 >/dev/null          \
+    curl --ftp-pasv -O --retry 3 "$1" --connect-timeout ${CT_CONNECT_TIMEOUT} >/dev/null    \
+    || curl -O --retry 3 "$1" --connect-timeout ${CT_CONNECT_TIMEOUT} >/dev/null            \
     || true
 }
 
Index: ct-ng.trunk/config/global/download_extract.in
===================================================================
--- ct-ng.trunk/config/global/download_extract.in	(revision 623)
+++ ct-ng.trunk/config/global/download_extract.in	(working copy)
@@ -20,6 +20,31 @@
       
       Usefull to pre-retrieve the tarballs before going off-line.
 
+config CONNECT_TIMEOUT
+    int
+    prompt "connection timeout"
+    default 10
+    help
+      From the curl manual:
+        Maximum time in seconds that you allow the connection to the server to take.
+
+      The scenario is as follows:
+        - some enterprise networks have firewalls that prohibit FTP traffic, while
+          still allowing HTTP
+        - most download sites have an http:// equivalent for their ftp:// URLs
+        - after this many seconds, the connection is considered to have failed,
+          and the next URL in the list is tried, until we reach a URL that gets
+          through the firewall, most probably an http:// one.
+
+      If you have a slow network, you should set this value higher than the default
+      10s. If you know a firewall is blocking connections but your network is otherwise
+      fast, you can lower this value to jump more quickly to allowed URLs. YMMV.
+
+      Note that this value applies equally to wget, if that is what you have installed.
+
+      Of course, you'd be better off using a proxy, as offered by the following
+      choice of options.
+
 choice
     bool
     prompt "Proxy type"

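For anyone who wants to try the behaviour outside of a full build, here is a
minimal stand-alone sketch of what the patched CT_DoGetFileWget does. The
try_fetch name and the default value are only for illustration; in crosstool-NG
the CONNECT_TIMEOUT symbol ends up in .config with the usual CT_ prefix, which
is how the function above reads it as ${CT_CONNECT_TIMEOUT}.

  #!/bin/sh
  # Illustrative sketch only: same wget invocation as in scripts/functions,
  # but with the timeout taken from the environment to make it easy to play with.
  : "${CT_CONNECT_TIMEOUT:=10}"    # in crosstool-NG this comes from .config

  try_fetch() {
      # Try passive FTP first, then a plain fetch; -T bounds DNS, connect and
      # read timeouts so a silent firewall cannot hang the download forever.
      wget -T ${CT_CONNECT_TIMEOUT} -nc --progress=dot:binary --tries=3 --passive-ftp "$1" \
      || wget -T ${CT_CONNECT_TIMEOUT} -nc --progress=dot:binary --tries=3 "$1"            \
      || true
  }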
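
To illustrate the scenario described in the help text, here is a hypothetical
walk over a URL list (the mirror URLs are made up): behind an FTP-blocking
firewall, the ftp:// attempt gives up after CT_CONNECT_TIMEOUT seconds and the
http:// mirror is tried next.

  # Illustrative sketch only: try each candidate URL in turn, with the same
  # curl invocation as the patched CT_DoGetFileCurl, stopping at the first
  # success.
  for url in ftp://ftp.example.org/pub/foo-1.0.tar.gz \
             http://ftp.example.org/pub/foo-1.0.tar.gz; do
      curl --ftp-pasv -O --retry 3 --connect-timeout ${CT_CONNECT_TIMEOUT} "${url}" >/dev/null \
      && break
  done
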
--
For unsubscribe information see http://sourceware.org/lists.html#faq
