Tag Archives: Drop List

Buying Drop Lists

Previously, I talked about Zone File Access and DAC Connections, then started a four-part series: Part 1: What You Need To Build A Drop List, Part 2: Building A Drop List, Part 3: Maintaining A Drop List and Part 4: Accessing Drop Lists.

Now I am going to cover the easiest, cheapest and least painful (on the pocket) method of obtaining drop lists. I’m not going to discuss whether the lists are cherry-picked or filtered or anything else, since it’s futile. All I’ll say is that how a list was built determines how complete it is, which accounts for missing names. I’ll briefly cover how older lists were made later on.

Generally, anything beyond a week will cost £10-50 per month, depending on the metrics (data points provided) and how far into the future the lists go: £10-15 per month will get you 7-30 days, £15-25 will get you 30-60 days, and £40+ per month will get you 60+ days. I’m not aware of anyone offering unrestricted access to complete zone files; most people who have this have built their own.

I am aware of one or two people who offer a PRSS-style lookup, where you give them a keyword and they return however many records match. Prices are usually £2-3 per query; using my guide you could build your own system even without DAC access.

Free Drop Lists

Domain Lore Droplist

This is by far the easiest option, but also the most limiting. It really depends on what you require from the drop lists.

The main limitation with free drop lists is the date range. I believe the longest publicly accessible range is 10 days, but I haven’t seen it with my own eyes. Most I have seen are 7 days, some 7 days plus today. Domain Lore and Caught are the first two in this group that come to mind.

Domain Lore uses the old PRSS-generated zone file from around 2006, which contained roughly 2 million names out of a current 11 million. The upside is that it contains the early names, which are arguably the better names. The list is also shorter, at around 700 per day.

It doesn’t include .uk at all, and only extremely limited .org.uk and .me.uk; the former will become more critical as we draw closer to 2019. Domain Lore includes some metrics, and has some sort functions.

Caught, I’m not sure where their data comes from, but it was likely built up over the years; again, I’ll cover the method I suspect in the building part later on. Neither of the above has paid options or offers a complete drop list, at last check.

Domain View Quick Drops

DomainView currently uses the Nominet zone file, and previously used the .com zone file and word lists. They offer pretty complete metrics, and Rob, the developer, is always open to suggestions for new metrics to add. They even go as far as showing you what has been booked with their public catching service, within the scope of your membership.

They offer two days (today/tomorrow) of drop lists, with no metrics at all, for free. The lists are often in excess of 4,000-5,000 names, and include .uk, .co.uk, .org.uk, .me.uk and .net.uk; I haven’t noticed any .plc.uk, .ltd.uk or .sch.uk domains. The list is updated with hourly sweeps to remove dropped domains, so it shortens through the day. More than two days are available for a price, but that’s the next section.

ExpiredDomains.net offers some basic metrics too, but it’s limited.

Paid Drop Lists

There are various paid services; the main two I’m aware of are DomainView and Dropped.UK. I’m sure there are more than that, but that’s all I’m aware of. There were a few others, such as DCE, but they closed down.

Domain View Subs

DomainView (60 days, zone files, parking, backorder credits, up to £15) offers a variety of subscription options beyond the two free days. These are 7, 14, 30 and 60 days, ranging from £5-15, and include various extras: zone file downloads (not .uk), backorder credits, parking system access and more.

The current metrics are…

Main Domain Back Ordered, .UK Back Ordered, Domain Name, Length of Domain, Register Year, Domain Keywords, Google Searches, Google Cost Per Click (CPC), Alexa Links, Alexa Rank, Google Page Rank, Registered in Other Extensions, Taken in .com, Taken in .net, Taken in .org, Taken in .us, Taken in .info, Taken in .biz, Taken in .mobi, Taken in .xxx, Majestic Backlinks, Majestic Referring Domains, Majestic .EDU Backlinks, Majestic Referring EDU Domains, Majestic .GOV Backlinks, Majestic GOV Referrers, Majestic Trust Flow, Majestic Citation Flow, Moz Domain Authority, Moz Page Authority, Moz Backlinks, Moz Rank, Times in Archive.org, First Archive.org Date, IPS Tag, Scheduled Drop Date.

Some of the more interesting features are that they show you what domains have been booked by their backorder system, and that you can catch under multiple names.

Dropped.uk Paid Droplist

Dropped (61 days, backorder credits, up to £20): I’m not 100% sure what metrics Dropped.UK offers beyond a few basics. They are a more expensive option than DomainView.

I should really use the free trial to see what metrics they offer, just for the sake of this post, but really they should show what you get rather than harvesting contact data.

Dropped do have a few nice catches to their tag, but it seems they book certain names for themselves, which I’m not keen on. They are not the only catching company that does this, and it isn’t really within the remit of this article.

I’m sure there are other droplist services, but I think I have covered the main free and paid lists, and there are some good options there.

If you know of any others, do use the comments and I’ll update this post; the same goes for any corrections or new information.

Updating the Drop List Download Application

I received an email from someone who wanted to download the output of a MySQL query from a drop list database, but for use on a Windows client. The code snippet series I posted used \n instead of \r\n, which would require the use of unix2dos, or even a simple str_replace(). While looking for a solution on another project, I came across stream filters, which can be applied quite elegantly in this instance.

I would have gone around the issue by processing the query output before producing the download, but the email wasn’t asking for that, so I thought of a different solution, of which I came up with two or three variants. I always think of Larry Wall, the creator of Perl, who once said “There’s more than one way to do it”, which is nigh on a Perl mantra; in PHP there are usually half a dozen different ways to do it, then a dozen more overkill ways on top.

I hadn’t really used stream filters that much, and only then for handling affiliate feeds, which are CSV files. The filter class below is by a chap called Torge, and can be found in this StackOverflow post: fputcsv and New Lines. Bending and fusing it to this purpose is actually quite elegant… and I like it.

<?php
// Shared MySQL connection details.
include('../con.sql.php');
$today = date("Y-m-d");

// Torge's stream filter: converts bare \n line endings to \r\n
// as the data passes through the output stream.
class StreamFilterNewlines extends php_user_filter {
    function filter($in, $out, &$consumed, $closing) {
        while ($bucket = stream_bucket_make_writeable($in)) {
            $bucket->data = preg_replace('/([^\r])\n/', "$1\r\n", $bucket->data);
            $consumed += $bucket->datalen;
            stream_bucket_append($out, $bucket);
        }
        return PSFS_PASS_ON;
    }
}

// Pull today's dropping domains.
$query = "SELECT domain FROM `zonefile` WHERE `dropdate` = '$today';";
$result = mysql_query($query);

// Attach the filter to the output stream, then send the file.
$fp = fopen('php://output', 'w');
if ($fp && $result) {
    stream_filter_register("newlines", "StreamFilterNewlines");
    stream_filter_append($fp, "newlines");

    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="'.$today.'.txt"');
    header('Pragma: no-cache');
    header('Expires: 0');

    while ($row = mysql_fetch_row($result)) {
        fputcsv($fp, array_values($row));
    }
    die;
}
?>

The above fusion of Torge’s code and mine creates a Windows-encoded text file from the MySQL query output, and starts/prompts a download of said file. It could be neatened up, and for public deployment it certainly needs some bomb-proofing.

How you modify the date or change the query is down to you. I would perhaps use a form submitted via POST, with some sanitising, but this works as a simple download-today’s-list script.

Extending The Drop List Application

You could even modify the download aspect to write to a file, or to email the download to you or a client.

In a similar tool, I have something like this running on a cron at midnight to create a downloadable file, which also becomes a historic record of the day’s list. This way multiple people can download the list without the server having to process the MySQL query, and increase the server load, every time.

Before I used a cron like the above, the first user of the day ran the query and subsequent users downloaded the output. The problem was the first user could end up loading it during a critical time, or it would take too long.

I used a simple if…FileNameExists…PromptDownload…else…RunQuery type switch. Ultimately the cron method is faster and easier to maintain, and the code above can be modified this way.
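A minimal sketch of that switch, assuming a temp-dir cache path (the cached-list contents here are placeholder lines; the real version would run the MySQL query and fputcsv loop from above and write to $cache instead of php://output):

```php
<?php
// Serve today's list from a cached file if one exists; otherwise
// build it once and cache it, so only the first hit does any work.
$today = date("Y-m-d");
$cache = sys_get_temp_dir() . "/" . $today . ".txt";

if (!file_exists($cache)) {
    // Placeholder: in the real tool this is the MySQL query plus
    // fputcsv loop, writing to $cache instead of php://output.
    file_put_contents($cache, "domain1.co.uk\r\ndomain2.co.uk\r\n");
}

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="' . $today . '.txt"');
readfile($cache);
```

Point the midnight cron at the same $cache path and visitors only ever hit readfile().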

Accessing Your Drop Lists

In previous parts, I covered DAC Connections, then Part 1: What You Need To Build A Drop List, Part 2: Building A Drop List and Part 3: Maintaining A Drop List; now, Part 4: Accessing Your Drop Lists.

There really is no point in having a populated zone file if you are unable to access it or produce your own drop lists. This article is aimed at those who have populated the zone file, but also at those who make lots of searches to see what exists. The latter is more about research, mark/rights protection and finding potential buyers, and part of this article will suit those needs too.

Downloading Today’s List

This is the basic case, downloading today’s drop list; you can of course modify this to download any day you choose.

$today = date("Y-m-d");
$query = "SELECT domain FROM `zonefile` WHERE `dropdate` = '$today';";
$result = mysql_query($query);
$fp = fopen('php://output', 'w');
if ($fp && $result) {
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="'.$today.'.txt"');
    header('Pragma: no-cache');
    header('Expires: 0');
    while ($row = mysql_fetch_row($result)) {
        fputcsv($fp, array_values($row));
    }
}

The above will simply download today’s file. I would suggest including a switch to change the date, but how you do it is up to you.

Searching The Database

Suppose you want to search the database and browse online. I have gone with the absolute basics, which is domain and drop date only; a very simple form is all that is needed.

<form id="search" method="post" action="./droplist.php">
<p>
<label for="drop" id="datelab">Search by Drop Date:</label> 
<input id="drop" name="drop" onClick="this.form.reset()"> (note todays renewal required is: <?php echo date('Y-m-d', strtotime('+ 92 days')); ?>, Suspended Date is <?php echo date('Y-m-d', strtotime('+ 60 days')); ?>)</p>
<p><label for="domain" id="domainlab">Search by Domain:</label> 
<input id="domain" name="domain" onClick="this.form.reset()"></p>
<p>
<input type="submit" name="submit" value="   SUBMIT   " /> &nbsp; <input type="reset" value="  RESET  ">
</p>
</form>

I have added some embellishments, such as showing some useful dates for reference, but these aren’t needed. I have also set the boxes to clear when clicked, to avoid crossed data on submit.

Displaying Drop Lists and Data

Once you have submitted the search query using the form (or converted it to GET rather than POST), you need to display the results.

Firstly, you need to connect to the MySQL database…

$con = mysql_connect('localhost', 'zone_zone', '9@55\/\/012D') or die('I cannot connect to the database because: ' . mysql_error());
mysql_select_db('zone_nzf') or die('I cannot select the database because: ' . mysql_error());

I keep this in a function, with success or failure returned, but you can just bung it at the top of your file; it’s down to you.

You will need to know if the form has been submitted, then extract the drop date or domain string.

if ($_POST['submit']) {
    if ($_POST['drop']) {
        $date = $_POST['drop'];
        //
        // drop date search code here.
        //
    } elseif ($_POST['domain']) {
        $doms = $_POST['domain'];
        //
        // domain search code here
        //
    } // close if drop
} // close submit

The next bit isn’t strictly needed, but I always do it with any database query: show how many records came back. Sometimes it only shows in debug mode, sometimes in the main view, but I always have it in there. You’ll need to alter the query a little between drop date and domain searches, but I’ll go with the domain search code here.

$query = mysql_query("SELECT COUNT(*) AS `rows` FROM `zonefile` WHERE `domain` LIKE '%$doms%';");
$count = mysql_fetch_assoc($query);
echo "<p>This search ($doms) has " . number_format($count['rows']) . " records returned.</p>";

This is useful if you want to know how many records are returned on a given date or within the search query. You may deem it not worthwhile, but the code’s there anyway. You should build this into the code below rather than running the query twice, but since you may not care how many lines there are, I’ve omitted it below. I have also switched to the drop date search, since I posted a domain search query above.

<table cellpadding="0" cellspacing="0" border="0">
<thead>
<tr>
<th width="300">Domain</th>
<th width="150">Drop Date</th>
</tr>
</thead>
<?php
$query = mysql_query("SELECT * FROM `zonefile` WHERE `dropdate` = '$date' ORDER BY domain ASC;");
if (mysql_num_rows($query) > 0) {
    while ($data = mysql_fetch_array($query, 1)) {
?>
<tr>
<td><strong><?php echo $data['domain'];?></strong> (<a href="http://webwhois.nic.uk/cgi-bin/webwhois.cgi?wvw7yesk=3hryr4hby3&wquery=<?php echo $data['domain'];?>">whois</a>)</td>
<td><?php echo $data['dropdate'];?></td>
</tr>
<?php } // while sql fetch
} // if rows ?>
</table>

You should probably add a switch to alternate the background of each line, to make it less headache-inducing to read. I’m sure there are a dozen different ways to do this, but I went for quick and easy-to-read code.

if ($lalt == 1) {
	$row = '#888888';
	$fon = '#FFFFFF';
	$lalt = 0;
} else {
	$row = '#EEEEEE';
	$fon = '#000000';
	$lalt = 1;
}

Insert the above if…else statement into the while loop above. Putting it there means that on each iteration $lalt (line alternate) will flip between 0 and 1, and the colours will change each row. Something like this would work…

<tr bgcolor="<?php echo $row; ?>">
<td><strong><font face="Courier New, Courier, monospace" color="<?php echo $fon; ?>"><?php echo $data['domain'];?></font></strong> (<a href="http://webwhois.nic.uk/cgi-bin/webwhois.cgi?wvw7yesk=3hryr4hby3&wquery=<?php echo $data['domain'];?>">whois</a>)</td>
<td><?php echo $data['dropdate'];?></td>
</tr>

So that’s simply displaying a given date. How you get the date to the above page is down to you: you can use either GET or POST to pass the date to the code. I would use a form to feed a POST array and then act, but a GET can be useful to quickly change the date…

$date = $_GET['dd'];

Then call the page with “dropdate.php?dd=2016-11-24” or similar. 
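Since that GET value goes straight into the query, it’s worth a strict format check before using it — a minimal sketch (the function name is mine, not from the original code):

```php
<?php
// Only accept a strict YYYY-MM-DD string from the query string;
// anything else falls back to today's date, keeping junk out of SQL.
function clean_drop_date($raw) {
    if (is_string($raw) && preg_match('/^\d{4}-\d{2}-\d{2}$/', $raw)) {
        return $raw;
    }
    return date('Y-m-d');
}

$date = clean_drop_date(isset($_GET['dd']) ? $_GET['dd'] : '');
```

This isn’t full sanitisation, but a strict whitelist on the format goes a long way for a private tool.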

Other Applications

These are just some ideas I toyed with. Domain watchlists: you could set up a crontab for certain key terms, and when the application detects a keyword in any given day’s list, it sends an email with the matching domains.

The download script I posted at the top could very easily be modified for this job.

If you have your own tag, you could create a watchlist to watch your tag and track renewals and modifications, or any other tag if you stored tag data.

How much data you store is subject to Nominet’s allowance and your own personal choice.

I have a few more things to write about this, but this concludes the basics. I’ll do a summary post and perhaps put the code together into a workable solution, but 95% of all the code you need is in these articles and just needs sticking together.

You should now have a 2-3GB database, populated with around 11 million records, and be able to produce your own drop lists.

Important Notice

Before using this or any of the code I have posted, you should sanitise it and enhance security; this code is NOT meant for public use. In order to keep the code simple, it has been aimed at private access, so security hasn’t been a huge concern.

DO NOT DEPLOY THIS CODE.  

 

Building A Domain Drop List

Well, here comes part 2 of the guide. What Do You Need is Part 1, which details the requirements, application process, etc. This article assumes you have Nominet membership, EPP, DAC and zone file access, along with suitable hosting.

Drop List Building Applications

You could do this in one large application that handles it all, but I think that’s a mistake. Writing a collection of small tools, each with a simple job, reduces server load and the risk of timing out, not to mention makes it easier to maintain.

So first things first…

A Simple Database

You will need a simple database to hold the list and dates…

id (int), domain (varchar(136)), dropdate (timestamp), updated (timestamp, on update).

How simple is that database… I haven’t posted a database schema, since you may want to add tag, creation date, expiry date, even break it down to show keywords or extensions, or anything else you require. You could also include things like domain length, or whether it exists in other .UK family extensions, and much more, so I have just given you the absolute basics.

I would personally include the domain length, the second level family extension, and possibly creation year.
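For reference, the absolute-basic table above could be created with something like this (column sizes match the snippets in this series; the table name and indexes are my own suggestion — indexing dropdate speeds up the daily list query, and domain the searches):

```sql
CREATE TABLE `droplist` (
  `id`       INT UNSIGNED NOT NULL AUTO_INCREMENT,
  `domain`   VARCHAR(136) NOT NULL,
  `dropdate` TIMESTAMP NULL DEFAULT NULL,
  `updated`  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
             ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`),
  KEY `dropdate` (`dropdate`),
  KEY `domain` (`domain`)
);
```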

Loading The List

The first small application is one which reads the 10.5 million names in the CSV and loads them into the database. The odds are a shared hosting account wouldn’t be able to handle this kind of long, resource-hungry process, hence why you’ll need suitable VPS hosting or similar.

This application is as simple as…

$file = "path/to.csv";
$handle = fopen($file, "r");
// Read each CSV line, lowercase and trim the domain, then insert it.
// (The original do…while read $data before the first fgetcsv call.)
while (($data = fgetcsv($handle, 1000, ",", "'")) !== false) {
    $domain = strtolower(trim($data[0]));
    $result = mysql_query("INSERT INTO `zonefile`.`droplist` (`id`, `domain`, `dropdate`, `updated`)
        VALUES (NULL, '$domain', NULL, CURRENT_TIMESTAMP);");
}

This can take anywhere up to half an hour, I would guess, depending on the power of your server and available memory.
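If the load feels slow, one common trick is batching rows into multi-value INSERT statements instead of one query per domain. A sketch (the helper function is mine; feed each returned statement to mysql_query() as you go, rather than holding all 10.5 million names in memory at once):

```php
<?php
// Build batched multi-row INSERT statements (1,000 rows each here)
// so the initial load needs far fewer round-trips to MySQL.
function batch_insert_sql(array $domains, $batch = 1000) {
    $queries = array();
    foreach (array_chunk($domains, $batch) as $chunk) {
        $values = array();
        foreach ($chunk as $d) {
            $d = strtolower(trim($d));
            $values[] = "(NULL, '$d', NULL, CURRENT_TIMESTAMP)";
        }
        $queries[] = "INSERT INTO `zonefile`.`droplist` "
                   . "(`id`, `domain`, `dropdate`, `updated`) VALUES "
                   . implode(", ", $values) . ";";
    }
    return $queries;
}
```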

In order to add domain length, you would need to have added a length column to the database earlier. Once that’s done, use either the MySQL function CHAR_LENGTH() or the PHP function strlen(). The easiest would be…

$result = mysql_query("INSERT INTO `zonefile`.`droplist` (`id`, `domain`, `length`, `dropdate`, `updated`)
    VALUES (NULL, '$domain', CHAR_LENGTH('$domain'), NULL, CURRENT_TIMESTAMP);");

You could just as easily do…

$domain = strtolower(trim($data[0]));
$length = strlen($domain);
$result = mysql_query("INSERT INTO `zonefile`.`droplist` (`id`, `domain`, `length`, `dropdate`, `updated`)
    VALUES (NULL, '$domain', '$length', NULL, CURRENT_TIMESTAMP);");

Your choice entirely; adding the extension would work the same way.

Obtaining The Drop Dates

When the domain names are loaded into the database, you will need another small application to read them one by one or in clusters, poll them via the Nominet DAC, and populate the database with the returned data. Since I posted a DAC query snippet already, I’ll just link to that, and you can add the MySQL and the loop yourself.

You could use something like

SELECT domain FROM `zonefile`.`droplist` WHERE `dropdate` IS NULL LIMIT 1400

I have selected a limit of 1400; this will take approximately five minutes at 200ms per query (5 per second), allowing for latency. A simple cron job set to load the script every 5 minutes and you’re golden. You can do smaller or greater amounts, but larger batches will eat memory and resources, potentially making the server sluggish. Experiment a little, but remember to adjust your cron and bear in mind the DAC limitations.
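The cron entry itself is one line; the PHP binary and script paths below are hypothetical, so adjust for your server:

```
# Poll the DAC every 5 minutes via the PHP CLI; discard output.
*/5 * * * * /usr/bin/php /var/www/tools/dac-poll.php >/dev/null 2>&1
```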

Assuming you have used my DAC query code and added the extra bits, you will need an SQL query to extrapolate the drop date from the expiry date returned by the DAC. I’m going to assume you have moved the expiry into a variable, but you can work on the array value too.

UPDATE `zonefile`.`droplist` SET `dropdate` = date_add('$expiry',INTERVAL 92 DAY)
 WHERE `domain` = '$domain';

The above query updates the drop date where the domain matches, adding 92 days to the returned expiry date to create the expected drop date. You can store any other data you want from the DAC output by extending this query as you wish.
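If you’d rather do the date arithmetic in PHP before the UPDATE, the same 92-day offset looks like this (the function name is mine):

```php
<?php
// Expected drop date = expiry date plus the 92-day window
// used in the SQL above.
function drop_date($expiry) {
    return date('Y-m-d', strtotime($expiry . ' +92 days'));
}

echo drop_date('2016-01-01'); // 2016-04-02
```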

DAC Limitations and Rules

Now would be a good time to discuss the rules and limitations of the Domain Availability Checker (DAC); the DAC usage instructions are here too. You are limited to 432,000 queries per day, with a maximum of 16 names per second (1,000 per rolling minute). By queries, that means ‘#limits’, ‘#usage’ and actual domains all count. Go over either of these limits and you will be blocked from DAC access until your quota recovers, on a rolling 24-hour basis.

If you do happen to hit a block, the DAC will return a result like…

domain.co.uk,B,35065

A simple if…then trap will be able to detect this and convert the number of seconds (35065) into a human-readable time frame. I used to use this old code snippet.

if ($response[1] == "B") {
	// Convert the remaining block time in seconds to H:i:s, then
	// to a readable phrase; gmdate's "G" has no leading zero.
	$blocktime = gmdate("G:i:s", $response[2]);
	$blocktime = explode(':', $blocktime);
	if ($blocktime[0] == "0") {
		$blocktime = $blocktime[1] . " minutes and " . $blocktime[2] . " seconds";
	} else {
		$blocktime = $blocktime[0] . " hours and " . $blocktime[1] . " minutes and " . $blocktime[2] . " seconds";
	}
	echo $blocktime;
	socket_send($sock, "#exit\r\n", strlen("#exit\r\n"), 0);
	socket_shutdown($sock, 2);
	socket_close($sock);
	die();
}

Well, that’s your drop list: built and populated with a little expense, a chunk of time and some basic coding, much of it done for you.

Depending on the efficiency of your code, server latency and some other factors, it could take up to 26 days to scan the whole zone file, by which point your database would be up to four weeks out of date (the zone file is already 24 hours old when released). In Part 3: Maintaining A Drop List, we will deal with this problem and work on updating the database.

What You Need to Build A Domain Drop List

Nominet Logo

This article is going to be a four-part jobby, with possibly a few side articles, as it turned out to be somewhat longer than I expected. Most people reading this blog will know what a drop list is, but you may not know how to make one, or how much effort and expense goes into it. Currently a lot less effort goes into it, since Nominet released the zone files; the old way will be one of the side articles I cover another time.

What is a drop list? Quite simply, a list of domains due to expire on any given day. I’m going to talk about what you need to build one in this post, and in the next one how you build your own drop list, covering the costs you will likely incur in both parts. After that there will be an article on maintaining the drop list, and buying drop lists in the final part. Some of the methods are hard-earned lessons, which will save you time. I won’t be giving all my secrets away; some will be old methods, so there are better ways to do it, but they still work. I will also be dropping in some chunks of code; the missing bits will be easy enough with basic coding skills, which I assume you have.

Where To Start ?

Building your own drop list isn’t too hard. It is, however, quite costly and time-consuming, not to mention fraught with rules from Nominet. The rules are somewhat open to interpretation, so I’m not going to go there; better to speak to Nominet directly about them.

Nominet Membership

Firstly, a Nominet tag is required, which is FREE; however this isn’t enough, as Nominet membership is also required. This membership costs £400+VAT to join, then £100+VAT per year.

You will also need DAC access, which is £25+VAT per year. That’s the last of the Nominet costs, but not the end.

A list of fees is available here: Nominet Fee Schedule. You can see the main benefit is the cost of domains at wholesale prices, but direct access to Nominet systems is essential for list building and drop catching.

Suitable Hosting

Suitable hosting is quite subjective, but I would recommend a VPS hosting account, because shared hosting almost certainly won’t be suitable: you’ll hit your resource limits and get a somewhat unhappy email from your host, if not asking you to upgrade then to sling your hook.

A suitable VPS will cost you anywhere from £10-30 per month, assuming you are comfortable and able to manage a Linux server, install PHP, MySQL and Apache, and handle the required security updates yourself. Otherwise a managed VPS will be possibly £30-80+ per month; do your own research and choose wisely.

An important factor to remember here: unlike with drop catching, where the speed between your server and Nominet is critical, in this instance it doesn’t matter at all, so cheap with a decent reputation and good support is your objective.

Alternatives to VPS and Shared Hosting

I have heard of people doing this on a business hosting account, which is often halfway between a low-end VPS and a standard shared account, or more simply a shared hosting account with more resources.

There are also a number of people who claim they used an install of WAMP (Windows, Apache, MySQL and PHP) or MAMP (Mac, Apache, MySQL and PHP) on a local machine, a machine on their network, or even their own PC.

You could also build such a thing on a local NAS, like a Synology or QNAP unit, or any other for that matter. I personally have a test environment on one of my Synology units, and see no reason most 2-bay units wouldn’t be able to handle a project of this size.

These routes are worth looking into, but I can’t comment on any of them with regard to efficacy, as I haven’t tried them.

Apply for Zone File Access

Once you are (1) a Nominet member with DAC access and (2) have your hosting sorted, you need to apply to Nominet for zone file access; you have to be a Nominet member to gain it. When you’ve been granted zone file access, you need to download and process the file. I blogged on the .UK Zone File Release to give you an idea of the process.

Nominet Zonefile Zip File Content

The file you will download is around 240MB: a zip file which contains 9 unique files (see right). These are individual zone files for each available extension under the .UK ccTLD, all managed by Nominet. Extracting them all will consume just over 1.5GB of storage, more or less depending on the destination disk format.

Even though there are 9 files in the archive, there are only 2 types of file.

1, Zone Files: these contain details about the zone, along with domains and their name servers. We won’t be using these; for drop lists we don’t need name servers.

2, Database Dump: a Comma Separated Values (CSV) file.

The CSV file is literally just a list of domains, with nothing else in there, which makes it very easy and quite fast to process. It will look like the list below…

domain1.co.uk
domain2.co.uk
domain3.net.uk
domain4.org.uk
domain5.uk

It’s important to note that neither the individual zone files nor the database dump contain any dates, tags or anything else: just domain names in the dump, and domain names, zone data and name servers in the zone files.

In Part 2, I will discuss bringing the above together to actually build a drop list.

Nominet to Release Zone File in May 2016

Nominet (the body which operates the .UK family of domain names) has decided to finally release a zone file for the .UK registry, in line with most other registries. This appears to be a continuation of Nominet’s moves to align with ICANN.

For those who don’t know, a zone file is a list of every registered domain name under a TLD (Top Level Domain), in this case .UK, so it should include the third-level .co.uk, .org.uk, .me.uk, .ltd.uk, .plc.uk and .net.uk, and finally the second-level .uk.

Why so important?
Well, if you are a copyright holder and want to see if people are abusing your copyright, a zone file allows you to quickly search for your string and spot any offenders.

More useful in my context, and likely yours, is that it makes drop lists (lists of each day’s soon-to-be-released domains) complete. The last time a zone file was available was around 2004/2005; it contained approx 2,000,000 names and averages around 800-1,400 drops per day.

Currently, I believe the largest databases out there are around 8.5 million names, which leaves a shortfall of 2,000,000 names unaccounted for. On the 8.5m lists there are around 3,000-4,000 domains released per day, so the amount missing could be an extra 10,000 per week. Don’t think these are all heading to drop lists; lots will be renewed.

It could be easier than ever to find the perfect domain name from May onwards.

It’s not clear yet…
It’s not 100% clear what exactly Nominet are classing as the zone file. Historically, they have protected the zone file data as their IP (intellectual property), so it’s not clear how much they will release.